
Conversation

@o-nikolas
Contributor

Wrap import in a try/catch to support older versions of airflow, otherwise import from task sdk

Prior to this change, I was receiving pre-commit errors when modifying executor modules:

providers/amazon/src/airflow/providers/amazon/aws/executors/aws_lambda/lambda_executor.py:46: error:
Module "airflow.utils" has no attribute "timezone"  [attr-defined]
    from airflow.utils import timezone
    ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
providers/amazon/tests/unit/amazon/aws/executors/aws_lambda/test_lambda_executor.py:34: error:
Module "airflow.utils" has no attribute "timezone"  [attr-defined]
    from airflow.utils import timezone
    ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Found 2 errors in 2 files (checked 3 source files)
exit status 1
Error 1 returned
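
The fix follows the pattern in the PR title: try the task SDK import first and fall back to airflow.utils on older Airflow versions. A minimal sketch of the guarded import (the exact airflow.sdk.timezone import path is an assumption here, not taken from the diff):

try:
    from airflow.sdk import timezone  # newer Airflow: timezone is exposed via the task SDK
except ImportError:
    # older Airflow versions without the task SDK still ship airflow.utils.timezone
    from airflow.utils import timezone  # type: ignore[attr-defined, no-redef]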


Wrap import in a try/catch to support older versions of airflow,
otherwise import from task sdk
@o-nikolas o-nikolas requested a review from eladkal as a code owner July 30, 2025 21:46
@boring-cyborg boring-cyborg bot added area:providers provider:amazon AWS/Amazon - related issues labels Jul 30, 2025
@o-nikolas
Contributor Author

@ashb

I was looking at some of the changes you made in #53149; I see that in some places the try/catch import from the task SDK is used and in others an import from __shared. Is the try/catch import from the SDK the correct approach for a provider use case like this?

@uranusjr
Member

I believe provider code should generally import from sdk (since that’s the public API)

@o-nikolas
Contributor Author

I believe provider code should generally import from sdk (since that’s the public API)

I agree, generally. But providers use far more of Airflow's internals than a user would in their DAGs alone, so they're not playing by exactly the same rules there. But in this case, agreed, importing from the SDK makes sense.

@ashb
Member

ashb commented Jul 31, 2025

But providers use far more of Airflow's internals than a user would in their DAGs alone

Going forward I would like that not to be the case -- providers should only be able to access things in the public API. A custom operator, a provider, or a Python task in a DAG file should all be on the same level.

@ashb
Member

ashb commented Jul 31, 2025

Thank you for picking this up; I had forgotten to go and do this for all the providers after my PR landed 🤦🏻 (I didn't want providers and core in the same PR)

@ashb ashb merged commit f9278ac into apache:main Jul 31, 2025
77 checks passed
RoyLee1224 pushed a commit to RoyLee1224/airflow that referenced this pull request Jul 31, 2025
Wrap import in a try/catch to support older versions of airflow,
otherwise import from task sdk
ferruzzi pushed a commit to aws-mwaa/upstream-to-airflow that referenced this pull request Aug 7, 2025
Wrap import in a try/catch to support older versions of airflow,
otherwise import from task sdk
fweilun pushed a commit to fweilun/airflow that referenced this pull request Aug 11, 2025
Wrap import in a try/catch to support older versions of airflow,
otherwise import from task sdk