Merged

Changes from all commits
18 commits
0ecc0cd
Make Python 3.9 default for v2-10-test branch (#45599)
potiuk Jan 12, 2025
81e6215
[v2-10-test] Upgrade sphinx and related dependencies (#45563) (#45596)
shahar1 Jan 12, 2025
a17307d
Protect against missing .uv cache (#45605)
potiuk Jan 13, 2025
90db0f3
Provide package write permissions to push-ci-image-cache job (#45573)…
potiuk Jan 13, 2025
71261fe
fix: log action get the correct request body (#45546) (#45560)
pierrejeambrun Jan 13, 2025
9d66c48
fix: rm `skip_if` and `run_if` in python source (#41832) (#45680)
josix Jan 15, 2025
04d0381
Fix empty task instance for log (#45702) (#45703)
jscheffl Jan 16, 2025
322ce0c
[v2-10-test] Improve speed of tests by not creating connections at pa…
github-actions[bot] Jan 21, 2025
8017ca4
Add ready_for_review to workflow pull_request types (#45855) (#45906)
gopidesupavan Jan 22, 2025
80a1990
Remove Scarf tracking (#45865) (#45941)
kaxil Jan 22, 2025
7856983
[v2-10-test] Fix `FileTaskHandler` only read from default executor (#…
jason810496 Jan 26, 2025
a5726a5
[v2-10-test] Upgrade uv and pip (#46078)
potiuk Jan 26, 2025
a2f302d
Issue deprecation warning for plugins registering `ti_deps` (#45742)
ashb Jan 29, 2025
57adf0b
Fixed thread local _sentinel.callers defect and added test cases (#44…
utkarsharma2 Jan 30, 2025
d60df2a
[v2-10-test] Add Webserver parameters: max_form_parts, max_form_memor…
github-actions[bot] Jan 30, 2025
dcf8650
Add map_index parameter to extra links API for Airflow 2.10 (#46337)
shubhamraj-git Feb 3, 2025
37f6218
Update version to 2.10.5
utkarsharma2 Jan 28, 2025
b93c3db
Update RELEASE_NOTES.rst
utkarsharma2 Jan 28, 2025
2 changes: 1 addition & 1 deletion .github/actions/install-pre-commit/action.yml
@@ -24,7 +24,7 @@ inputs:
default: "3.9"
uv-version:
description: 'uv version to use'
default: "0.5.17" # Keep this comment to allow automatic replacement of uv version
default: "0.5.24" # Keep this comment to allow automatic replacement of uv version
pre-commit-version:
description: 'pre-commit version to use'
default: "3.5.0" # Keep this comment to allow automatic replacement of pre-commit version
1 change: 1 addition & 0 deletions .github/workflows/ci-image-checks.yml
@@ -142,6 +142,7 @@ jobs:
if: inputs.canary-run == 'true'
- name: "Prepare .tar file from pre-commit cache"
run: |
mkdir -p ~/.cache/uv # until we are Python 3.9+ we do not have .uv in pre-commits
tar -C ~ -czf /tmp/cache-pre-commit.tar.gz .cache/pre-commit .cache/uv
shell: bash
if: inputs.canary-run == 'true'
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
@@ -30,6 +30,7 @@ on:  # yamllint disable-line rule:truthy
- v[0-9]+-[0-9]+-test
- v[0-9]+-[0-9]+-stable
- providers-[a-z]+-?[a-z]*/v[0-9]+-[0-9]+
types: [opened, reopened, synchronize, ready_for_review]
workflow_dispatch:
permissions:
# All other permissions are set to none by default
8 changes: 6 additions & 2 deletions .github/workflows/push-image-cache.yml
@@ -80,8 +80,6 @@ on:  # yamllint disable-line rule:truthy
description: "Disable airflow repo cache read from main."
required: true
type: string
permissions:
contents: read
jobs:
push-ci-image-cache:
name: "Push CI ${{ inputs.cache-type }}:${{ matrix.python }} image cache "
@@ -90,6 +88,9 @@ jobs:
# instead of an array of strings.
# yamllint disable-line rule:line-length
runs-on: ${{ (inputs.platform == 'linux/amd64') && fromJSON(inputs.runs-on-as-json-public) || fromJSON(inputs.runs-on-as-json-self-hosted) }}
permissions:
contents: read
packages: write
strategy:
fail-fast: false
matrix:
@@ -163,6 +164,9 @@ jobs:
# instead of an array of strings.
# yamllint disable-line rule:line-length
runs-on: ${{ (inputs.platform == 'linux/amd64') && fromJSON(inputs.runs-on-as-json-public) || fromJSON(inputs.runs-on-as-json-self-hosted) }}
permissions:
contents: read
packages: write
strategy:
fail-fast: false
matrix:
4 changes: 2 additions & 2 deletions Dockerfile
@@ -53,9 +53,9 @@ ARG PYTHON_BASE_IMAGE="python:3.8-slim-bookworm"
# You can swap comments between those two args to test pip from the main version
# When you attempt to test if the version of `pip` from specified branch works for our builds
# Also use `force pip` label on your PR to swap all places we use `uv` to `pip`
ARG AIRFLOW_PIP_VERSION=24.3.1
ARG AIRFLOW_PIP_VERSION=25.0
# ARG AIRFLOW_PIP_VERSION="git+https://github.com/pypa/pip.git@main"
ARG AIRFLOW_UV_VERSION=0.5.17
ARG AIRFLOW_UV_VERSION=0.5.24
ARG AIRFLOW_USE_UV="false"
ARG UV_HTTP_TIMEOUT="300"
ARG AIRFLOW_IMAGE_REPOSITORY="https://github.com/apache/airflow"
4 changes: 2 additions & 2 deletions Dockerfile.ci
@@ -1249,9 +1249,9 @@ COPY --from=scripts common.sh install_packaging_tools.sh install_additional_depe
# You can swap comments between those two args to test pip from the main version
# When you attempt to test if the version of `pip` from specified branch works for our builds
# Also use `force pip` label on your PR to swap all places we use `uv` to `pip`
ARG AIRFLOW_PIP_VERSION=24.3.1
ARG AIRFLOW_PIP_VERSION=25.0
# ARG AIRFLOW_PIP_VERSION="git+https://github.com/pypa/pip.git@main"
ARG AIRFLOW_UV_VERSION=0.5.17
ARG AIRFLOW_UV_VERSION=0.5.24
# TODO(potiuk): automate with upgrade check (possibly)
ARG AIRFLOW_PRE_COMMIT_VERSION="3.5.0"

5 changes: 1 addition & 4 deletions README.md
@@ -288,7 +288,7 @@ Apache Airflow version life cycle:

| Version | Current Patch/Minor | State | First Release | Limited Support | EOL/Terminated |
|-----------|-----------------------|-----------|-----------------|-------------------|------------------|
| 2 | 2.10.4 | Supported | Dec 17, 2020 | TBD | TBD |
| 2 | 2.10.5 | Supported | Dec 17, 2020 | TBD | TBD |
| 1.10 | 1.10.15 | EOL | Aug 27, 2018 | Dec 17, 2020 | June 17, 2021 |
| 1.9 | 1.9.0 | EOL | Jan 03, 2018 | Aug 27, 2018 | Aug 27, 2018 |
| 1.8 | 1.8.2 | EOL | Mar 19, 2017 | Jan 03, 2018 | Jan 03, 2018 |
@@ -534,6 +534,3 @@ The CI infrastructure for Apache Airflow has been sponsored by:

<a href="https://astronomer.io"><img src="https://assets2.astronomer.io/logos/logoForLIGHTbackground.png" alt="astronomer.io" width="250px"></a>
<a href="https://aws.amazon.com/opensource/"><img src="docs/integration-logos/aws/AWS-Cloud-alt_light-bg@4x.png" alt="AWS OpenSource" width="130px"></a>

<!-- telemetry/analytics pixel: -->
<img referrerpolicy="no-referrer-when-downgrade" src="https://static.scarf.sh/a.png?x-pxid=1b5a5e3c-da81-42f5-befa-42d836bf1b54" alt="Tracking Pixel" />
64 changes: 58 additions & 6 deletions RELEASE_NOTES.rst
@@ -21,6 +21,59 @@

.. towncrier release notes start

Airflow 2.10.5 (2025-02-06)
---------------------------

Significant Changes
^^^^^^^^^^^^^^^^^^^

Ensure teardown tasks are executed when DAG run is set to failed (#45530)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Previously, when a DAG run was manually set to "failed" or "success", the terminal state was applied to all of its tasks.
This left a gap when setup and teardown tasks were defined: if a teardown task was meant to clean up infrastructure
or other resources, it was skipped as well, so those resources could stay allocated.

Now, if setup tasks have already been executed and the DAG run is manually set to "failed" or "success", the teardown
tasks are still executed. Teardown tasks remain skipped if their setup was also skipped.

As a side effect, if the DAG contains teardown tasks, manually marking the DAG run as "failed" or "success" keeps the
run in the running state until the teardown tasks have been scheduled; they would not be scheduled if the run were set
directly to "failed" or "success".
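
To make the new behaviour concrete, here is a minimal, illustrative sketch (not part of the diff) of a DAG with a
setup/teardown pair; with this change, ``delete_cluster`` still runs when the run is manually marked "failed" or
"success" after ``create_cluster`` has executed. Task and resource names are hypothetical.

from __future__ import annotations

import pendulum

from airflow.decorators import dag, setup, task, teardown


@dag(schedule=None, start_date=pendulum.datetime(2025, 1, 1, tz="UTC"), catchup=False)
def resource_lifecycle():
    @setup
    def create_cluster() -> str:
        # Allocate an external resource and hand its id downstream.
        return "cluster-123"

    @task
    def run_job(cluster_id: str) -> None:
        print(f"running job on {cluster_id}")

    @teardown
    def delete_cluster(cluster_id: str) -> None:
        # Clean-up now runs even if the run is manually marked failed/success.
        print(f"deleting {cluster_id}")

    cluster_id = create_cluster()
    run_job(cluster_id) >> delete_cluster(cluster_id)


resource_lifecycle()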


Bug Fixes
"""""""""

- Prevent using ``trigger_rule=TriggerRule.ALWAYS`` in a task-generated mapping within bare tasks (#44751)
- Fix ShortCircuitOperator mapped tasks (#44912)
- Fix premature evaluation of tasks with certain trigger rules (e.g. ``ONE_DONE``) in a mapped task group (#44937)
- Fix task_id validation in BaseOperator (#44938) (#44938)
- Allow fetching XCom with forward slash from the API and escape it in the UI (#45134)
- Fix ``FileTaskHandler`` only read from default executor (#46000)
- Fix empty task instance for log (#45702) (#45703)
- Remove ``skip_if`` and ``run_if`` decorators before TaskFlow virtualenv tasks are run (#41832) (#45680)
- Fix request body for json requests in event log (#45546) (#45560)
- Ensure teardown tasks are executed when DAG run is set to failed (#45530) (#45581)
- Do not update DR on TI update after task execution (#45348)
- Fix object and array DAG params that have a None default (#45313) (#45315)
- Fix endless sensor rescheduling (#45224) (#45250)
- Evaluate None in SQLAlchemy's extended JSON type decorator (#45119) (#45120)
- Allow dynamic tasks to be filtered by ``rendered_map_index`` (#45109) (#45122)
- Handle relative paths when sanitizing URLs (#41995) (#45080)
- Set Autocomplete Off on Login Form (#44929) (#44940)
- Add Webserver parameters ``max_form_parts``, ``max_form_memory_size`` (#46243) (#45749)
- Fixed accessing thread local variable in BaseOperators ``execute`` safeguard mechanism (#44646) (#46280)
- Add map_index parameter to extra links API (#46337)


Miscellaneous
"""""""""""""

- Add traceback log output when SIGTERMs was sent (#44880) (#45077)
- Removed the ability for Operators to specify their own "scheduling deps" (#45713) (#45742)
- Deprecate ``conf`` from Task Context (#44993)

Airflow 2.10.4 (2024-12-09)
---------------------------

@@ -223,6 +276,11 @@ Airflow 2.10.0 (2024-08-15)
Significant Changes
^^^^^^^^^^^^^^^^^^^

Scarf based telemetry: Airflow now collects telemetry data (#39510)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Airflow integrates Scarf to collect basic usage data during operation. Deployments can opt-out of data collection by
setting the ``[usage_data_collection]enabled`` option to ``False``, or the ``SCARF_ANALYTICS=false`` environment variable.
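
As an illustration (this relies on Airflow's standard ``AIRFLOW__<SECTION>__<KEY>`` environment-variable override and
is not part of the diff), a deployment could opt out programmatically before the services start:

import os

# Disable Scarf usage-data collection before the webserver/scheduler starts.
os.environ["AIRFLOW__USAGE_DATA_COLLECTION__ENABLED"] = "False"
# Alternatively, Scarf honours its own opt-out variable.
os.environ["SCARF_ANALYTICS"] = "false"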

Datasets no longer trigger inactive DAGs (#38891)
"""""""""""""""""""""""""""""""""""""""""""""""""

@@ -271,12 +329,6 @@ Previously known as hybrid executors, this new feature allows Airflow to use mul
to use a specific executor that suits its needs best. A single DAG can contain tasks all using different executors. Please see the Airflow documentation for
more details. Note: This feature is still experimental. See `documentation on Executor <https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/executor/index.html#using-multiple-executors-concurrently>`_ for a more detailed description.

Scarf based telemetry: Does Airflow collect any telemetry data? (#39510)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Airflow integrates Scarf to collect basic usage data during operation. Deployments can opt-out of data collection by setting the ``[usage_data_collection]enabled`` option to False, or the SCARF_ANALYTICS=false environment variable.
See `FAQ on this <https://airflow.apache.org/docs/apache-airflow/stable/faq.html#does-airflow-collect-any-telemetry-data>`_ for more information.


New Features
""""""""""""
- AIP-61 Hybrid Execution (`AIP-61 <https://github.com/apache/airflow/pulls?q=is%3Apr+label%3Aarea%3Ahybrid-executors+is%3Aclosed+milestone%3A%22Airflow+2.10.0%22>`_)
2 changes: 1 addition & 1 deletion airflow/__init__.py
@@ -17,7 +17,7 @@
# under the License.
from __future__ import annotations

__version__ = "2.10.4"
__version__ = "2.10.5"

import os
import sys
2 changes: 2 additions & 0 deletions airflow/api_connexion/endpoints/extra_link_endpoint.py
@@ -42,6 +42,7 @@ def get_extra_links(
dag_id: str,
dag_run_id: str,
task_id: str,
map_index: int = -1,
session: Session = NEW_SESSION,
) -> APIResponse:
"""Get extra links for task instance."""
@@ -62,6 +63,7 @@ def get_extra_links(
TaskInstance.dag_id == dag_id,
TaskInstance.run_id == dag_run_id,
TaskInstance.task_id == task_id,
TaskInstance.map_index == map_index,
)
)
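
A hedged usage sketch (assuming a locally running webserver with basic-auth enabled; all identifiers below are
placeholders): the new ``map_index`` query parameter lets clients fetch extra links for one expansion of a mapped task.

import requests

# List extra links for map index 2 of a mapped task instance (new in 2.10.5).
resp = requests.get(
    "http://localhost:8080/api/v1/dags/my_dag/dagRuns/my_run/taskInstances/my_task/links",
    params={"map_index": 2},  # defaults to -1, i.e. an unmapped task
    auth=("admin", "admin"),
)
resp.raise_for_status()
print(resp.json())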

3 changes: 2 additions & 1 deletion airflow/api_connexion/openapi/v1.yaml
@@ -231,7 +231,7 @@ info:
This means that the server encountered an unexpected condition that prevented it from
fulfilling the request.

version: "2.10.4"
version: "2.10.5"
license:
name: Apache 2.0
url: http://www.apache.org/licenses/LICENSE-2.0.html
@@ -2062,6 +2062,7 @@ paths:
- $ref: "#/components/parameters/DAGID"
- $ref: "#/components/parameters/DAGRunID"
- $ref: "#/components/parameters/TaskID"
- $ref: "#/components/parameters/FilterMapIndex"

get:
summary: List extra links
6 changes: 3 additions & 3 deletions airflow/api_connexion/schemas/task_schema.py
@@ -49,14 +49,14 @@ class TaskSchema(Schema):
)
depends_on_past = fields.Boolean(dump_only=True)
wait_for_downstream = fields.Boolean(dump_only=True)
retries = fields.Number(dump_only=True)
retries = fields.Number(dump_only=True) # type: ignore[var-annotated]
queue = fields.String(dump_only=True)
pool = fields.String(dump_only=True)
pool_slots = fields.Number(dump_only=True)
pool_slots = fields.Number(dump_only=True) # type: ignore[var-annotated]
execution_timeout = fields.Nested(TimeDeltaSchema, dump_only=True)
retry_delay = fields.Nested(TimeDeltaSchema, dump_only=True)
retry_exponential_backoff = fields.Boolean(dump_only=True)
priority_weight = fields.Number(dump_only=True)
priority_weight = fields.Number(dump_only=True) # type: ignore[var-annotated]
weight_rule = WeightRuleField(dump_only=True)
ui_color = ColorField(dump_only=True)
ui_fgcolor = ColorField(dump_only=True)
3 changes: 0 additions & 3 deletions airflow/cli/commands/scheduler_command.py
@@ -33,7 +33,6 @@
from airflow.utils.cli import process_subdir
from airflow.utils.providers_configuration_loader import providers_configuration_loaded
from airflow.utils.scheduler_health import serve_health_check
from airflow.utils.usage_data_collection import usage_data_collection

log = logging.getLogger(__name__)

@@ -54,8 +53,6 @@ def scheduler(args: Namespace):
"""Start Airflow Scheduler."""
print(settings.HEADER)

usage_data_collection()

run_command_with_daemon_option(
args=args,
process_name="scheduler",
38 changes: 16 additions & 22 deletions airflow/config_templates/config.yml
@@ -2120,6 +2120,22 @@ webserver:
type: boolean
example: ~
default: "False"
max_form_memory_size:
description: |
The maximum size in bytes any non-file form field may be in a multipart/form-data body.
If this limit is exceeded, a 413 RequestEntityTooLarge error is raised by webserver.
version_added: 2.10.5
type: integer
example: ~
default: "500000"
max_form_parts:
description: |
The maximum number of fields that may be present in a multipart/form-data body.
If this limit is exceeded, a 413 RequestEntityTooLarge error is raised by webserver.
version_added: 2.10.5
type: integer
example: ~
default: "1000"
email:
description: |
Configuration email backend and whether to
@@ -2735,25 +2751,3 @@ sensors:
type: float
example: ~
default: "604800"
usage_data_collection:
description: |
Airflow integrates `Scarf <https://about.scarf.sh/>`__ to collect basic platform and usage data
during operation. This data assists Airflow maintainers in better understanding how Airflow is used.
Insights gained from this telemetry are critical for prioritizing patches, minor releases, and
security fixes. Additionally, this information supports key decisions related to the development road map.
Check the FAQ doc for more information on what data is collected.

Deployments can opt-out of analytics by setting the ``enabled`` option
to ``False``, or the ``SCARF_ANALYTICS=false`` environment variable.
Individual users can easily opt-out of analytics in various ways documented in the
`Scarf Do Not Track docs <https://docs.scarf.sh/gateway/#do-not-track>`__.

options:
enabled:
description: |
Enable or disable usage data collection and sending.
version_added: 2.10.0
type: boolean
example: ~
default: "True"
see_also: ":ref:`Usage data collection FAQ <usage-data-collection>`"
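
The two new ``[webserver]`` options added above (``max_form_parts`` and ``max_form_memory_size``) correspond to
Werkzeug's multipart form-parsing limits. A hedged sketch of raising them via Airflow's standard environment-variable
overrides (the values here are arbitrary examples, not recommendations):

import os

# Allow larger multipart/form-data submissions to the webserver before a
# 413 RequestEntityTooLarge is raised (defaults: 1000 parts, 500000 bytes).
os.environ["AIRFLOW__WEBSERVER__MAX_FORM_PARTS"] = "2000"
os.environ["AIRFLOW__WEBSERVER__MAX_FORM_MEMORY_SIZE"] = "1048576"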
4 changes: 4 additions & 0 deletions airflow/executors/executor_loader.py
@@ -201,6 +201,10 @@ def init_executors(cls) -> list[BaseExecutor]:
@classmethod
def lookup_executor_name_by_str(cls, executor_name_str: str) -> ExecutorName:
# lookup the executor by alias first, if not check if we're given a module path
if not _classname_to_executors or not _module_to_executors or not _alias_to_executors:
# if we haven't loaded the executors yet, such as directly calling load_executor
cls._get_executor_names()

if executor_name := _alias_to_executors.get(executor_name_str):
return executor_name
elif executor_name := _module_to_executors.get(executor_name_str):
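
For context, a minimal sketch of the call path this guards — an assumption about typical usage (for example
``FileTaskHandler`` resolving an executor by name before the executor caches have been populated):

from airflow.executors.executor_loader import ExecutorLoader

# Previously this could fail when the alias/module caches were still empty;
# the lazy call to _get_executor_names() above now populates them on demand.
executor = ExecutorLoader.load_executor("LocalExecutor")
print(type(executor).__name__)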
2 changes: 2 additions & 0 deletions airflow/models/baseoperator.py
@@ -410,6 +410,8 @@ def wrapper(self, *args, **kwargs):
sentinel = kwargs.pop(sentinel_key, None)

if sentinel:
if not getattr(cls._sentinel, "callers", None):
cls._sentinel.callers = {}
cls._sentinel.callers[sentinel_key] = sentinel
else:
sentinel = cls._sentinel.callers.pop(f"{func.__qualname__.split('.')[0]}__sentinel", None)
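
The guard above is needed because attributes set on a ``threading.local()`` exist only in the thread that set them.
A small self-contained sketch of the failure mode (illustrative only, not Airflow code):

import threading

_sentinel = threading.local()


def record(key: str, value: object) -> None:
    # Each thread sees its own `_sentinel`; initialise `callers` on first use.
    if not getattr(_sentinel, "callers", None):
        _sentinel.callers = {}
    _sentinel.callers[key] = value


def worker() -> None:
    # Without the getattr guard this would raise AttributeError, because the
    # main thread's `callers` dict is not visible in this thread.
    record("MyOperator__sentinel", object())


threading.Thread(target=worker).start()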
12 changes: 12 additions & 0 deletions airflow/plugins_manager.py
@@ -27,6 +27,7 @@
import os
import sys
import types
import warnings
from pathlib import Path
from typing import TYPE_CHECKING, Any, Iterable

Expand Down Expand Up @@ -431,6 +432,17 @@ def initialize_ti_deps_plugins():
registered_ti_dep_classes = {}

for plugin in plugins:
if not plugin.ti_deps:
continue

from airflow.exceptions import RemovedInAirflow3Warning

warnings.warn(
"Using custom `ti_deps` on operators has been removed in Airflow 3.0",
RemovedInAirflow3Warning,
stacklevel=1,
)

registered_ti_dep_classes.update(
{qualname(ti_dep.__class__): ti_dep.__class__ for ti_dep in plugin.ti_deps}
)
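
For reference, a minimal sketch of a plugin that would now trigger this warning on load (the class and dependency
names are hypothetical; ``BaseTIDep`` is the existing extension point being deprecated):

from airflow.plugins_manager import AirflowPlugin
from airflow.ti_deps.deps.base_ti_dep import BaseTIDep


class AlwaysPassDep(BaseTIDep):
    NAME = "Always pass"

    def _get_dep_statuses(self, ti, session, dep_context):
        # A trivial dependency that never blocks scheduling.
        yield self._passing_status(reason="always satisfied")


class MyTiDepsPlugin(AirflowPlugin):
    name = "my_ti_deps_plugin"
    # Registering ti_deps now emits RemovedInAirflow3Warning when plugins load.
    ti_deps = [AlwaysPassDep()]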
2 changes: 1 addition & 1 deletion airflow/providers/MANAGING_PROVIDERS_LIFECYCLE.rst
@@ -454,7 +454,7 @@ If you have pre-commit installed, pre-commit will be run automatically on commit
manually after commit, you can run it via ``breeze static-checks --last-commit`` some of the tests might fail
because suspension of the provider might cause changes in the dependencies, so if you see errors about
missing dependencies imports, non-usable classes etc., you will need to build the CI image locally
via ``breeze build-image --python 3.8 --upgrade-to-newer-dependencies`` after the first pre-commit run
via ``breeze build-image --python 3.9 --upgrade-to-newer-dependencies`` after the first pre-commit run
and then run the static checks again.

If you want to be absolutely sure to run all static checks you can always do this via
9 changes: 3 additions & 6 deletions airflow/providers/amazon/aws/transfers/sql_to_s3.py
@@ -223,12 +223,9 @@ def _partition_dataframe(self, df: pd.DataFrame) -> Iterable[tuple[str, pd.DataF
for group_label in (grouped_df := df.groupby(**self.groupby_kwargs)).groups:
yield (
cast(str, group_label),
cast(
"pd.DataFrame",
grouped_df.get_group(group_label)
.drop(random_column_name, axis=1, errors="ignore")
.reset_index(drop=True),
),
grouped_df.get_group(group_label)
.drop(random_column_name, axis=1, errors="ignore")
.reset_index(drop=True),
)

def _get_hook(self) -> DbApiHook:
4 changes: 2 additions & 2 deletions airflow/reproducible_build.yaml
@@ -1,2 +1,2 @@
release-notes-hash: 7be47e2ddbbe1bfbd0d3f572d2b7800a
source-date-epoch: 1736532824
release-notes-hash: 8e5657e541a0bf44f777a4ec3ee442e3
source-date-epoch: 1738582969
4 changes: 2 additions & 2 deletions airflow/serialization/serializers/timezone.py
@@ -87,9 +87,9 @@ def deserialize(classname: str, version: int, data: object) -> Any:
try:
from zoneinfo import ZoneInfo
except ImportError:
from backports.zoneinfo import ZoneInfo
from backports.zoneinfo import ZoneInfo # type: ignore[no-redef]

return ZoneInfo(data)
return ZoneInfo(data) # type: ignore[arg-type]

return parse_timezone(data)
