From 2a34315462b715506bc163ffb00de89ef6747e6f Mon Sep 17 00:00:00 2001 From: Utkarsh Sharma Date: Fri, 1 Nov 2024 20:01:27 +0530 Subject: [PATCH 01/44] Allow Utkarsh to run dockerhub image release job (#43588) (#43589) (cherry picked from commit c4a0461dd362e80b82424b8cc93e71392de8e7e5) Co-authored-by: Jarek Potiuk --- .github/workflows/release_dockerhub_image.yml | 1 + 1 file changed, 1 insertion(+) diff --git a/.github/workflows/release_dockerhub_image.yml b/.github/workflows/release_dockerhub_image.yml index 100e850a6fd84..5ce1585131f76 100644 --- a/.github/workflows/release_dockerhub_image.yml +++ b/.github/workflows/release_dockerhub_image.yml @@ -85,6 +85,7 @@ jobs: "kaxil", "pierrejeambrun", "potiuk", + "utkarsharma2" ]'), github.event.sender.login) steps: - name: "Cleanup repo" From 590a44deb504fea942cf7cc8c7a3525b2ef28c16 Mon Sep 17 00:00:00 2001 From: Jens Scheffler <95105677+jscheffl@users.noreply.github.com> Date: Fri, 1 Nov 2024 19:21:11 +0100 Subject: [PATCH 02/44] Fix Try Selector in Mapped Tasks also on Index 0 (#43590) Backport (#43591) * Fix Try Selector in Mapped Tasks also on Index 0 (cherry picked from commit 1c3d555cc92104a34b63bfaa404ba2474ed945d6) * Review Feedback, direct commit Co-authored-by: Brent Bovenzi --------- Co-authored-by: Brent Bovenzi --- airflow/www/static/js/api/useTIHistory.ts | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/airflow/www/static/js/api/useTIHistory.ts b/airflow/www/static/js/api/useTIHistory.ts index d90ce91f030db..1d1ee1d40f586 100644 --- a/airflow/www/static/js/api/useTIHistory.ts +++ b/airflow/www/static/js/api/useTIHistory.ts @@ -48,7 +48,7 @@ export default function useTIHistory({ .replace("_DAG_RUN_ID_", dagRunId) .replace("_TASK_ID_", taskId); - if (mapIndex && mapIndex > -1) { + if (mapIndex !== undefined && mapIndex > -1) { tiHistoryUrl = tiHistoryUrl.replace("/tries", `/${mapIndex}/tries`); } From 90a6f3f3775347c4de6968fd33b2831614a08fd5 Mon Sep 17 00:00:00 2001 From: Jarek Potiuk 
Date: Mon, 4 Nov 2024 09:56:27 +0100 Subject: [PATCH 03/44] Complete automation of version replacement pre-commit for pip and uv (#43205) (#43623) The scripts to update pip and uv version were not complete - they did not replace a few of our scripts and documentation. This was especially troublesome for doc replacement, because updating versions manually led to misalignments of tables in markdown. Lack of completeness of the upgrade caused #43197 and #43135 manual PRs to bump all references. Also an earlier upgrade caused the markdown table to be broken - with UV row table offset by 1. This PR fixes it: * all the scripts and docs are updated now * when markdown is updated, the table structure is not broken (cherry picked from commit 7ede73c85a3e5815b061f9b520e999cd4b5efd52) --- dev/breeze/doc/ci/02_images.md | 63 ++++++++-------- scripts/ci/pre_commit/update_installers.py | 86 ++++++++++++++++++++-- 2 files changed, 113 insertions(+), 36 deletions(-) diff --git a/dev/breeze/doc/ci/02_images.md b/dev/breeze/doc/ci/02_images.md index 19c58ebc2d2d9..f9ea4faaee7c0 100644 --- a/dev/breeze/doc/ci/02_images.md +++ b/dev/breeze/doc/ci/02_images.md @@ -421,36 +421,39 @@ DOCKER_BUILDKIT=1 docker build . 
-f Dockerfile.ci \ The following build arguments (`--build-arg` in docker build command) can be used for CI images: -| Build argument | Default value | Description | -|-----------------------------------|-------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------| -| `PYTHON_BASE_IMAGE` | `python:3.8-slim-bookworm` | Base Python image | -| `PYTHON_MAJOR_MINOR_VERSION` | `3.8` | major/minor version of Python (should match base image) | -| `DEPENDENCIES_EPOCH_NUMBER` | `2` | increasing this number will reinstall all apt dependencies | -| `ADDITIONAL_PIP_INSTALL_FLAGS` | | additional `pip` flags passed to the installation commands (except when reinstalling `pip` itself) | -| `PIP_NO_CACHE_DIR` | `true` | if true, then no pip cache will be stored | -| `UV_NO_CACHE` | `true` | if true, then no uv cache will be stored | -| `HOME` | `/root` | Home directory of the root user (CI image has root user as default) | -| `AIRFLOW_HOME` | `/root/airflow` | Airflow's HOME (that's where logs and sqlite databases are stored) | -| `AIRFLOW_SOURCES` | `/opt/airflow` | Mounted sources of Airflow | -| `AIRFLOW_REPO` | `apache/airflow` | the repository from which PIP dependencies are pre-installed | -| `AIRFLOW_BRANCH` | `main` | the branch from which PIP dependencies are pre-installed | -| `AIRFLOW_CI_BUILD_EPOCH` | `1` | increasing this value will reinstall PIP dependencies from the repository from scratch | -| `AIRFLOW_CONSTRAINTS_LOCATION` | | If not empty, it will override the source of the constraints with the specified URL or file. | -| `AIRFLOW_CONSTRAINTS_REFERENCE` | | reference (branch or tag) from GitHub repository from which constraints are used. By default it is set to `constraints-main` but can be `constraints-2-X`. 
| -| `AIRFLOW_EXTRAS` | `all` | extras to install | -| `UPGRADE_INVALIDATION_STRING` | | If set to any random value the dependencies are upgraded to newer versions. In CI it is set to build id. | -| `AIRFLOW_PRE_CACHED_PIP_PACKAGES` | `true` | Allows to pre-cache airflow PIP packages from the GitHub of Apache Airflow This allows to optimize iterations for Image builds and speeds up CI jobs. | -| `ADDITIONAL_AIRFLOW_EXTRAS` | | additional extras to install | -| `ADDITIONAL_PYTHON_DEPS` | | additional Python dependencies to install | -| `DEV_APT_COMMAND` | | Dev apt command executed before dev deps are installed in the first part of image | -| `ADDITIONAL_DEV_APT_COMMAND` | | Additional Dev apt command executed before dev dep are installed in the first part of the image | -| `DEV_APT_DEPS` | Empty - install default dependencies (see `install_os_dependencies.sh`) | Dev APT dependencies installed in the first part of the image | -| `ADDITIONAL_DEV_APT_DEPS` | | Additional apt dev dependencies installed in the first part of the image | -| `ADDITIONAL_DEV_APT_ENV` | | Additional env variables defined when installing dev deps | -| `AIRFLOW_PIP_VERSION` | `24.0` | PIP version used. | -| `AIRFLOW_UV_VERSION` | `0.1.10` | UV version used. | -| `AIRFLOW_USE_UV` | `true` | Whether to use UV for installation. 
| -| `PIP_PROGRESS_BAR` | `on` | Progress bar for PIP installation | +| Build argument | Default value | Description | +|-----------------------------------|----------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------| +| `PYTHON_BASE_IMAGE` | `python:3.8-slim-bookworm` | Base Python image | +| `PYTHON_MAJOR_MINOR_VERSION` | `3.8` | major/minor version of Python (should match base image) | +| `DEPENDENCIES_EPOCH_NUMBER` | `2` | increasing this number will reinstall all apt dependencies | +| `ADDITIONAL_PIP_INSTALL_FLAGS` | | additional `pip` flags passed to the installation commands (except when reinstalling `pip` itself) | +| `PIP_NO_CACHE_DIR` | `true` | if true, then no pip cache will be stored | +| `UV_NO_CACHE` | `true` | if true, then no uv cache will be stored | +| `HOME` | `/root` | Home directory of the root user (CI image has root user as default) | +| `AIRFLOW_HOME` | `/root/airflow` | Airflow's HOME (that's where logs and sqlite databases are stored) | +| `AIRFLOW_SOURCES` | `/opt/airflow` | Mounted sources of Airflow | +| `AIRFLOW_REPO` | `apache/airflow` | the repository from which PIP dependencies are pre-installed | +| `AIRFLOW_BRANCH` | `main` | the branch from which PIP dependencies are pre-installed | +| `AIRFLOW_CI_BUILD_EPOCH` | `1` | increasing this value will reinstall PIP dependencies from the repository from scratch | +| `AIRFLOW_CONSTRAINTS_LOCATION` | | If not empty, it will override the source of the constraints with the specified URL or file. | +| `AIRFLOW_CONSTRAINTS_REFERENCE` | | reference (branch or tag) from GitHub repository from which constraints are used. By default it is set to `constraints-main` but can be `constraints-2-X`. | +| `AIRFLOW_EXTRAS` | `all` | extras to install | +| `UPGRADE_INVALIDATION_STRING` | | If set to any random value the dependencies are upgraded to newer versions. 
In CI it is set to build id. | +| `AIRFLOW_PRE_CACHED_PIP_PACKAGES` | `true` | Allows to pre-cache airflow PIP packages from the GitHub of Apache Airflow This allows to optimize iterations for Image builds and speeds up CI jobs. | +| `ADDITIONAL_AIRFLOW_EXTRAS` | | additional extras to install | +| `ADDITIONAL_PYTHON_DEPS` | | additional Python dependencies to install | +| `DEV_APT_COMMAND` | | Dev apt command executed before dev deps are installed in the first part of image | +| `ADDITIONAL_DEV_APT_COMMAND` | | Additional Dev apt command executed before dev dep are installed in the first part of the image | +| `DEV_APT_DEPS` | | Dev APT dependencies installed in the first part of the image | +| `ADDITIONAL_DEV_APT_DEPS` | | Additional apt dev dependencies installed in the first part of the image | +| `ADDITIONAL_DEV_APT_ENV` | | Additional env variables defined when installing dev deps | +| `AIRFLOW_PIP_VERSION` | `24.0` | PIP version used. | +| `AIRFLOW_UV_VERSION` | `0.1.10` | UV version used. | +| `AIRFLOW_USE_UV` | `true` | Whether to use UV for installation. | +| `PIP_PROGRESS_BAR` | `on` | Progress bar for PIP installation | + + +The" Here are some examples of how CI images can built manually. CI is always built from local sources. 
diff --git a/scripts/ci/pre_commit/update_installers.py b/scripts/ci/pre_commit/update_installers.py index 1cbd38c8333a2..a90e07d38c9f3 100755 --- a/scripts/ci/pre_commit/update_installers.py +++ b/scripts/ci/pre_commit/update_installers.py @@ -30,8 +30,22 @@ FILES_TO_UPDATE = [ AIRFLOW_SOURCES_ROOT_PATH / "Dockerfile", AIRFLOW_SOURCES_ROOT_PATH / "Dockerfile.ci", + AIRFLOW_SOURCES_ROOT_PATH / "scripts" / "ci" / "install_breeze.sh", AIRFLOW_SOURCES_ROOT_PATH / "scripts" / "docker" / "common.sh", AIRFLOW_SOURCES_ROOT_PATH / "pyproject.toml", + AIRFLOW_SOURCES_ROOT_PATH / "dev" / "breeze" / "src" / "airflow_breeze" / "global_constants.py", + AIRFLOW_SOURCES_ROOT_PATH + / "dev" + / "breeze" + / "src" + / "airflow_breeze" + / "commands" + / "release_management_commands.py", +] + + +DOC_FILES_TO_UPDATE: list[Path] = [ + AIRFLOW_SOURCES_ROOT_PATH / "dev/" / "breeze" / "doc" / "ci" / "02_images.md" ] @@ -43,13 +57,39 @@ def get_latest_pypi_version(package_name: str) -> str: return latest_version -PIP_PATTERN = re.compile(r"AIRFLOW_PIP_VERSION=[0-9.]+") -UV_PATTERN = re.compile(r"AIRFLOW_UV_VERSION=[0-9.]+") -UV_GREATER_PATTERN = re.compile(r'"uv>=[0-9]+[0-9.]+"') +AIRFLOW_PIP_PATTERN = re.compile(r"(AIRFLOW_PIP_VERSION=)([0-9.]+)") +AIRFLOW_PIP_QUOTED_PATTERN = re.compile(r"(AIRFLOW_PIP_VERSION = )(\"[0-9.]+\")") +PIP_QUOTED_PATTERN = re.compile(r"(PIP_VERSION = )(\"[0-9.]+\")") +AIRFLOW_PIP_DOC_PATTERN = re.compile(r"(\| *`AIRFLOW_PIP_VERSION` *\| *)(`[0-9.]+`)( *\|)") +AIRFLOW_PIP_UPGRADE_PATTERN = re.compile(r"(python -m pip install --upgrade pip==)([0-9.]+)") + +AIRFLOW_UV_PATTERN = re.compile(r"(AIRFLOW_UV_VERSION=)([0-9.]+)") +AIRFLOW_UV_QUOTED_PATTERN = re.compile(r"(AIRFLOW_UV_VERSION = )(\"[0-9.]+\")") +AIRFLOW_UV_DOC_PATTERN = re.compile(r"(\| *`AIRFLOW_UV_VERSION` *\| *)(`[0-9.]+`)( *\|)") +UV_GREATER_PATTERN = re.compile(r'"(uv>=)([0-9]+)"') UPGRADE_UV: bool = os.environ.get("UPGRADE_UV", "true").lower() == "true" UPGRADE_PIP: bool = 
os.environ.get("UPGRADE_PIP", "true").lower() == "true" + +def replace_group_2_while_keeping_total_length(pattern: re.Pattern[str], replacement: str, text: str) -> str: + def replacer(match): + original_length = len(match.group(2)) + padding = "" + if len(match.groups()) > 2: + padding = match.group(3) + new_length = len(replacement) + diff = new_length - original_length + if diff <= 0: + padding = " " * -diff + padding + else: + padding = padding[diff:] + padded_replacement = match.group(1) + replacement + padding + return padded_replacement.strip() + + return re.sub(pattern, replacer, text) + + if __name__ == "__main__": pip_version = get_latest_pypi_version("pip") console.print(f"[bright_blue]Latest pip version: {pip_version}") @@ -62,10 +102,44 @@ def get_latest_pypi_version(package_name: str) -> str: file_content = file.read_text() new_content = file_content if UPGRADE_PIP: - new_content = re.sub(PIP_PATTERN, f"AIRFLOW_PIP_VERSION={pip_version}", new_content, re.MULTILINE) + new_content = replace_group_2_while_keeping_total_length( + AIRFLOW_PIP_PATTERN, pip_version, new_content + ) + new_content = replace_group_2_while_keeping_total_length( + AIRFLOW_PIP_UPGRADE_PATTERN, pip_version, new_content + ) + new_content = replace_group_2_while_keeping_total_length( + AIRFLOW_PIP_QUOTED_PATTERN, f'"{pip_version}"', new_content + ) + new_content = replace_group_2_while_keeping_total_length( + PIP_QUOTED_PATTERN, f'"{pip_version}"', new_content + ) + if UPGRADE_UV: + new_content = replace_group_2_while_keeping_total_length( + AIRFLOW_UV_PATTERN, uv_version, new_content + ) + new_content = replace_group_2_while_keeping_total_length( + AIRFLOW_UV_QUOTED_PATTERN, f'"{uv_version}"', new_content + ) + new_content = replace_group_2_while_keeping_total_length( + UV_GREATER_PATTERN, uv_version, new_content + ) + if new_content != file_content: + file.write_text(new_content) + console.print(f"[bright_blue]Updated {file}") + changed = True + for file in DOC_FILES_TO_UPDATE: + 
console.print(f"[bright_blue]Updating {file}") + file_content = file.read_text() + new_content = file_content + if UPGRADE_PIP: + new_content = replace_group_2_while_keeping_total_length( + AIRFLOW_PIP_DOC_PATTERN, f"`{pip_version}`", new_content + ) if UPGRADE_UV: - new_content = re.sub(UV_PATTERN, f"AIRFLOW_UV_VERSION={uv_version}", new_content, re.MULTILINE) - new_content = re.sub(UV_GREATER_PATTERN, f'"uv>={uv_version}"', new_content, re.MULTILINE) + new_content = replace_group_2_while_keeping_total_length( + AIRFLOW_UV_DOC_PATTERN, f"`{uv_version}`", new_content + ) if new_content != file_content: file.write_text(new_content) console.print(f"[bright_blue]Updated {file}") From 9425ae0c891582562aecc22e0c8afaa3879c1a5b Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Mon, 4 Nov 2024 09:59:07 +0100 Subject: [PATCH 04/44] Remove root warning in image used to build packages in CI (#43597) (#43625) (cherry picked from commit 681c59a27c1e0414bf1843c569fad9d0ec407456) --- .../src/airflow_breeze/commands/release_management_commands.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py index 18b5bd539f359..33f3dd338e4d9 100644 --- a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py @@ -240,7 +240,7 @@ class VersionedFile(NamedTuple): AIRFLOW_BUILD_DOCKERFILE = f""" FROM python:{DEFAULT_PYTHON_MAJOR_MINOR_VERSION}-slim-{ALLOWED_DEBIAN_VERSIONS[0]} RUN apt-get update && apt-get install -y --no-install-recommends git -RUN pip install pip=={AIRFLOW_PIP_VERSION} hatch=={HATCH_VERSION} pyyaml=={PYYAML_VERSION}\ +RUN pip install --root-user-action ignore pip=={AIRFLOW_PIP_VERSION} hatch=={HATCH_VERSION} pyyaml=={PYYAML_VERSION}\ gitpython=={GITPYTHON_VERSION} rich=={RICH_VERSION} pre-commit=={PRE_COMMIT_VERSION} COPY . 
/opt/airflow """ From f593d8b07386320a0eb9c1a1a75af41a0d284996 Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Mon, 4 Nov 2024 09:59:29 +0100 Subject: [PATCH 05/44] Allow to switch breeze to use uv internally to create virtualenvs (#43587) (#43624) Breeze sometimes creates "internal" virtualenvs in the local ".build" directory when it needs them - for example in order to run k8s tests or for release management commands. This PR adds the capability to switch breeze to use `uv` instead of `pip` to install dependencies in those envs. You can now switch breeze to use uv by `breeze setup config --use-uv` and switch back to pip by `breeze setup config --no-use-uv`. (cherry picked from commit a2a0ef09357f278bde031092e395f13286fd3076) --- .github/workflows/ci.yml | 1 + .github/workflows/k8s-tests.yml | 7 ++++ Dockerfile | 4 +- Dockerfile.ci | 8 ++-- dev/breeze/doc/ci/02_images.md | 4 +- dev/breeze/doc/images/output-commands.svg | 42 +++++++++---------- dev/breeze/doc/images/output_setup_config.svg | 32 +++++++------- dev/breeze/doc/images/output_setup_config.txt | 2 +- .../commands/release_management_commands.py | 10 +++-- .../airflow_breeze/commands/setup_commands.py | 31 +++++++++++--- .../commands/setup_commands_config.py | 1 + .../src/airflow_breeze/global_constants.py | 3 +- .../airflow_breeze/utils/kubernetes_utils.py | 27 +++++++++--- .../src/airflow_breeze/utils/run_tests.py | 6 ++- .../airflow_breeze/utils/virtualenv_utils.py | 30 +++++++++++-- scripts/ci/install_breeze.sh | 2 +- scripts/ci/pre_commit/update_installers.py | 6 ++- 17 files changed, 150 insertions(+), 66 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 866f8f253d401..c94518489d28a 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -622,6 +622,7 @@ jobs: kubernetes-versions-list-as-string: ${{ needs.build-info.outputs.kubernetes-versions-list-as-string }} kubernetes-combos-list-as-string: ${{ needs.build-info.outputs.kubernetes-combos-list-as-string }} 
include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} + use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} if: > ( needs.build-info.outputs.run-kubernetes-tests == 'true' || diff --git a/.github/workflows/k8s-tests.yml b/.github/workflows/k8s-tests.yml index c4b72a9afc924..9a764e88c4e99 100644 --- a/.github/workflows/k8s-tests.yml +++ b/.github/workflows/k8s-tests.yml @@ -44,6 +44,10 @@ on: # yamllint disable-line rule:truthy description: "Whether to include success outputs" required: true type: string + use-uv: + description: "Whether to use uv" + required: true + type: string debug-resources: description: "Whether to debug resources" required: true @@ -96,6 +100,9 @@ jobs: key: "\ k8s-env-${{ steps.breeze.outputs.host-python-version }}-\ ${{ hashFiles('scripts/ci/kubernetes/k8s_requirements.txt','hatch_build.py') }}" + - name: "Switch breeze to use uv" + run: breeze setup-config --use-uv + if: inputs.use-uv == 'true' - name: Run complete K8S tests ${{ inputs.kubernetes-combos-list-as-string }} run: breeze k8s run-complete-tests --run-in-parallel --upgrade --no-copy-local-sources env: diff --git a/Dockerfile b/Dockerfile index cf5226c00086f..4cdf1a8bb3409 100644 --- a/Dockerfile +++ b/Dockerfile @@ -49,8 +49,8 @@ ARG AIRFLOW_VERSION="2.9.3" ARG PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" -ARG AIRFLOW_PIP_VERSION=24.2 -ARG AIRFLOW_UV_VERSION=0.4.1 +ARG AIRFLOW_PIP_VERSION=24.3.1 +ARG AIRFLOW_UV_VERSION=0.4.29 ARG AIRFLOW_USE_UV="false" ARG UV_HTTP_TIMEOUT="300" ARG AIRFLOW_IMAGE_REPOSITORY="https://github.com/apache/airflow" diff --git a/Dockerfile.ci b/Dockerfile.ci index d23e810fa3677..e188a7ec39115 100644 --- a/Dockerfile.ci +++ b/Dockerfile.ci @@ -1297,8 +1297,8 @@ ARG DEFAULT_CONSTRAINTS_BRANCH="constraints-main" # It can also be overwritten manually by setting the AIRFLOW_CI_BUILD_EPOCH environment variable. 
ARG AIRFLOW_CI_BUILD_EPOCH="10" ARG AIRFLOW_PRE_CACHED_PIP_PACKAGES="true" -ARG AIRFLOW_PIP_VERSION=24.2 -ARG AIRFLOW_UV_VERSION=0.4.1 +ARG AIRFLOW_PIP_VERSION=24.3.1 +ARG AIRFLOW_UV_VERSION=0.4.29 ARG AIRFLOW_USE_UV="true" # Setup PIP # By default PIP install run without cache to make image smaller @@ -1321,8 +1321,8 @@ ARG AIRFLOW_VERSION="" # Additional PIP flags passed to all pip install commands except reinstalling pip itself ARG ADDITIONAL_PIP_INSTALL_FLAGS="" -ARG AIRFLOW_PIP_VERSION=24.2 -ARG AIRFLOW_UV_VERSION=0.4.1 +ARG AIRFLOW_PIP_VERSION=24.3.1 +ARG AIRFLOW_UV_VERSION=0.4.29 ARG AIRFLOW_USE_UV="true" ENV AIRFLOW_REPO=${AIRFLOW_REPO}\ diff --git a/dev/breeze/doc/ci/02_images.md b/dev/breeze/doc/ci/02_images.md index f9ea4faaee7c0..1db263f8b3aa0 100644 --- a/dev/breeze/doc/ci/02_images.md +++ b/dev/breeze/doc/ci/02_images.md @@ -447,8 +447,8 @@ can be used for CI images: | `DEV_APT_DEPS` | | Dev APT dependencies installed in the first part of the image | | `ADDITIONAL_DEV_APT_DEPS` | | Additional apt dev dependencies installed in the first part of the image | | `ADDITIONAL_DEV_APT_ENV` | | Additional env variables defined when installing dev deps | -| `AIRFLOW_PIP_VERSION` | `24.0` | PIP version used. | -| `AIRFLOW_UV_VERSION` | `0.1.10` | UV version used. | +| `AIRFLOW_PIP_VERSION` | `24.3.1` | PIP version used. | +| `AIRFLOW_UV_VERSION` | `0.4.29` | UV version used. | | `AIRFLOW_USE_UV` | `true` | Whether to use UV for installation. | | `PIP_PROGRESS_BAR` | `on` | Progress bar for PIP installation | diff --git a/dev/breeze/doc/images/output-commands.svg b/dev/breeze/doc/images/output-commands.svg index 08d3dc2a13eea..5888d1fc862eb 100644 --- a/dev/breeze/doc/images/output-commands.svg +++ b/dev/breeze/doc/images/output-commands.svg @@ -298,53 +298,53 @@ Usage:breeze[OPTIONSCOMMAND [ARGS]... 
╭─ Execution mode ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---python-pPython major/minor version used in Airflow image for images. +--python-pPython major/minor version used in Airflow image for images. (>3.8< | 3.9 | 3.10 | 3.11 | 3.12)                           [default: 3.8]                                               ---integrationIntegration(s) to enable when running (can be more than one).                        +--integrationIntegration(s) to enable when running (can be more than one).                        (all | all-testable | cassandra | celery | drill | kafka | kerberos | mongo | mssql  | openlineage | otel | pinot | qdrant | redis | statsd | trino | ydb)                ---standalone-dag-processorRun standalone dag processor for start-airflow. ---database-isolationRun airflow in database isolation mode. +--standalone-dag-processorRun standalone dag processor for start-airflow. +--database-isolationRun airflow in database isolation mode. ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Docker Compose selection and cleanup ───────────────────────────────────────────────────────────────────────────────╮ ---project-nameName of the docker-compose project to bring down. The `docker-compose` is for legacy breeze        -project name and you can use `breeze down --project-name docker-compose` to stop all containers    +--project-nameName of the docker-compose project to bring down. The `docker-compose` is for legacy breeze        +project name and you can use `breeze down --project-name docker-compose` to stop all containers    belonging to it.                                                                                   
(breeze | pre-commit | docker-compose)                                                             [default: breeze]                                                                                  ---docker-hostOptional - docker host to use when running docker commands. When set, the `--builder` option is    +--docker-hostOptional - docker host to use when running docker commands. When set, the `--builder` option is    ignored when building images.                                                                      (TEXT)                                                                                             ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Database ───────────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---backend-bDatabase backend to use. If 'none' is chosen, Breeze will start with an invalid database     +--backend-bDatabase backend to use. If 'none' is chosen, Breeze will start with an invalid database     configuration, meaning there will be no database available, and any attempts to connect to   the Airflow database will fail.                                                              (>sqlite< | mysql | postgres | none)                                                         [default: sqlite]                                                                            ---postgres-version-PVersion of Postgres used.(>12< | 13 | 14 | 15 | 16)[default: 12] ---mysql-version-MVersion of MySQL used.(>8.0< | 8.4)[default: 8.0] ---db-reset-dReset DB when entering the container. +--postgres-version-PVersion of Postgres used.(>12< | 13 | 14 | 15 | 16)[default: 12] +--mysql-version-MVersion of MySQL used.(>8.0< | 8.4)[default: 8.0] +--db-reset-dReset DB when entering the container. 
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Build CI image (before entering shell) ─────────────────────────────────────────────────────────────────────────────╮ ---github-repository-gGitHub repository used to pull, push run images.(TEXT)[default: apache/airflow] ---builderBuildx builder used to perform `docker buildx build` commands.(TEXT) +--github-repository-gGitHub repository used to pull, push run images.(TEXT)[default: apache/airflow] +--builderBuildx builder used to perform `docker buildx build` commands.(TEXT) [default: autodetect]                                          ---use-uv/--no-use-uvUse uv instead of pip as packaging tool to build the image.[default: use-uv] ---uv-http-timeoutTimeout for requests that UV makes (only used in case of UV builds).(INTEGER RANGE) +--use-uv/--no-use-uvUse uv instead of pip as packaging tool to build the image.[default: use-uv] +--uv-http-timeoutTimeout for requests that UV makes (only used in case of UV builds).(INTEGER RANGE) [default: 300; x>=1]                                                 ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Other options ──────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---forward-credentials-fForward local credentials to container when running. ---max-timeMaximum time that the command should take - if it takes longer, the command will fail. +--forward-credentials-fForward local credentials to container when running. +--max-timeMaximum time that the command should take - if it takes longer, the command will fail. 
(INTEGER RANGE)                                                                        ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---answer-aForce answer to questions.(y | n | q | yes | no | quit) ---dry-run-DIf dry-run is set, commands are only printed, not executed. ---verbose-vPrint verbose information about performed steps. ---help-hShow this message and exit. +--answer-aForce answer to questions.(y | n | q | yes | no | quit) +--dry-run-DIf dry-run is set, commands are only printed, not executed. +--verbose-vPrint verbose information about performed steps. +--help-hShow this message and exit. ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Developer commands ─────────────────────────────────────────────────────────────────────────────────────────────────╮ start-airflow          Enter breeze environment and starts all Airflow components in the tmux session. Compile     diff --git a/dev/breeze/doc/images/output_setup_config.svg b/dev/breeze/doc/images/output_setup_config.svg index 9a42467ea5281..5a44bb20030b9 100644 --- a/dev/breeze/doc/images/output_setup_config.svg +++ b/dev/breeze/doc/images/output_setup_config.svg @@ -1,4 +1,4 @@ - + Show/update configuration (Python, Backend, Cheatsheet, ASCIIART). ╭─ Config flags ───────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---python-pPython major/minor version used in Airflow image for images. +--python-pPython major/minor version used in Airflow image for images. (>3.8< | 3.9 | 3.10 | 3.11 | 3.12)                           [default: 3.8]                                               ---backend-bDatabase backend to use. 
If 'none' is chosen, Breeze will start with an invalid +--backend-bDatabase backend to use. If 'none' is chosen, Breeze will start with an invalid database configuration, meaning there will be no database available, and any    attempts to connect to the Airflow database will fail.                          (>sqlite< | mysql | postgres | none)                                            [default: sqlite]                                                               ---postgres-version-PVersion of Postgres used.(>12< | 13 | 14 | 15 | 16)[default: 12] ---mysql-version-MVersion of MySQL used.(>8.0< | 8.4)[default: 8.0] ---cheatsheet/--no-cheatsheet-C/-cEnable/disable cheatsheet. ---asciiart/--no-asciiart-A/-aEnable/disable ASCIIart. ---colour/--no-colourEnable/disable Colour mode (useful for colour blind-friendly communication). -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ -╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---help-hShow this message and exit. -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +--postgres-version-PVersion of Postgres used.(>12< | 13 | 14 | 15 | 16)[default: 12] +--mysql-version-MVersion of MySQL used.(>8.0< | 8.4)[default: 8.0] +--use-uv/--no-use-uv-U/-uEnable/disable using uv for creating venvs by breeze. +--cheatsheet/--no-cheatsheet-C/-cEnable/disable cheatsheet. +--asciiart/--no-asciiart-A/-aEnable/disable ASCIIart. +--colour/--no-colourEnable/disable Colour mode (useful for colour blind-friendly communication). +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ +--help-hShow this message and exit. 
+╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ diff --git a/dev/breeze/doc/images/output_setup_config.txt b/dev/breeze/doc/images/output_setup_config.txt index f47fa38e42c7b..3b2da9a9c043c 100644 --- a/dev/breeze/doc/images/output_setup_config.txt +++ b/dev/breeze/doc/images/output_setup_config.txt @@ -1 +1 @@ -422c8c524b557fcf5924da4c8590935d +96e10564034b282769a2c48ebf7176e2 diff --git a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py index 33f3dd338e4d9..648a13d424658 100644 --- a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py @@ -226,8 +226,8 @@ class VersionedFile(NamedTuple): file_name: str -AIRFLOW_PIP_VERSION = "24.0" -AIRFLOW_UV_VERSION = "0.1.10" +AIRFLOW_PIP_VERSION = "24.3.1" +AIRFLOW_UV_VERSION = "0.4.29" AIRFLOW_USE_UV = False WHEEL_VERSION = "0.36.2" GITPYTHON_VERSION = "3.1.40" @@ -451,7 +451,11 @@ def _check_sdist_to_wheel_dists(dists_info: tuple[DistributionPackageInfo, ...]) continue if not venv_created: - python_path = create_venv(Path(tmp_dir_name) / ".venv", pip_version=AIRFLOW_PIP_VERSION) + python_path = create_venv( + Path(tmp_dir_name) / ".venv", + pip_version=AIRFLOW_PIP_VERSION, + uv_version=AIRFLOW_UV_VERSION, + ) pip_command = create_pip_command(python_path) venv_created = True diff --git a/dev/breeze/src/airflow_breeze/commands/setup_commands.py b/dev/breeze/src/airflow_breeze/commands/setup_commands.py index 407ff7f8cdf3f..bc1ac4f1fa56b 100644 --- a/dev/breeze/src/airflow_breeze/commands/setup_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/setup_commands.py @@ -192,6 +192,12 @@ def version(): @option_mysql_version @click.option("-C/-c", "--cheatsheet/--no-cheatsheet", help="Enable/disable cheatsheet.", default=None) @click.option("-A/-a", 
"--asciiart/--no-asciiart", help="Enable/disable ASCIIart.", default=None) +@click.option( + "-U/-u", + "--use-uv/--no-use-uv", + help="Enable/disable using uv for creating venvs by breeze.", + default=None, +) @click.option( "--colour/--no-colour", help="Enable/disable Colour mode (useful for colour blind-friendly communication).", @@ -200,6 +206,7 @@ def version(): def change_config( python: str, backend: str, + use_uv: bool, postgres_version: str, mysql_version: str, cheatsheet: bool, @@ -212,14 +219,22 @@ def change_config( asciiart_file = "suppress_asciiart" cheatsheet_file = "suppress_cheatsheet" colour_file = "suppress_colour" + use_uv_file = "use_uv" + if use_uv is not None: + if use_uv: + touch_cache_file(use_uv_file) + get_console().print("[info]Enable using uv[/]") + else: + delete_cache(use_uv_file) + get_console().print("[info]Disable using uv[/]") if asciiart is not None: if asciiart: delete_cache(asciiart_file) - get_console().print("[info]Enable ASCIIART![/]") + get_console().print("[info]Enable ASCIIART[/]") else: touch_cache_file(asciiart_file) - get_console().print("[info]Disable ASCIIART![/]") + get_console().print("[info]Disable ASCIIART[/]") if cheatsheet is not None: if cheatsheet: delete_cache(cheatsheet_file) @@ -235,23 +250,27 @@ def change_config( touch_cache_file(colour_file) get_console().print("[info]Disable Colour[/]") - def get_status(file: str): + def get_supress_status(file: str): return "disabled" if check_if_cache_exists(file) else "enabled" + def get_status(file: str): + return "enabled" if check_if_cache_exists(file) else "disabled" + get_console().print() get_console().print("[info]Current configuration:[/]") get_console().print() get_console().print(f"[info]* Python: {python}[/]") get_console().print(f"[info]* Backend: {backend}[/]") + get_console().print(f"[info]* Use uv: {get_status(use_uv_file)}[/]") get_console().print() get_console().print(f"[info]* Postgres version: {postgres_version}[/]") get_console().print(f"[info]* 
MySQL version: {mysql_version}[/]") get_console().print() - get_console().print(f"[info]* ASCIIART: {get_status(asciiart_file)}[/]") - get_console().print(f"[info]* Cheatsheet: {get_status(cheatsheet_file)}[/]") + get_console().print(f"[info]* ASCIIART: {get_supress_status(asciiart_file)}[/]") + get_console().print(f"[info]* Cheatsheet: {get_supress_status(cheatsheet_file)}[/]") get_console().print() get_console().print() - get_console().print(f"[info]* Colour: {get_status(colour_file)}[/]") + get_console().print(f"[info]* Colour: {get_supress_status(colour_file)}[/]") get_console().print() diff --git a/dev/breeze/src/airflow_breeze/commands/setup_commands_config.py b/dev/breeze/src/airflow_breeze/commands/setup_commands_config.py index 61460f004ec9f..802a41fc273dd 100644 --- a/dev/breeze/src/airflow_breeze/commands/setup_commands_config.py +++ b/dev/breeze/src/airflow_breeze/commands/setup_commands_config.py @@ -63,6 +63,7 @@ "--backend", "--postgres-version", "--mysql-version", + "--use-uv", "--cheatsheet", "--asciiart", "--colour", diff --git a/dev/breeze/src/airflow_breeze/global_constants.py b/dev/breeze/src/airflow_breeze/global_constants.py index 944870122747d..791e07cfe718f 100644 --- a/dev/breeze/src/airflow_breeze/global_constants.py +++ b/dev/breeze/src/airflow_breeze/global_constants.py @@ -155,7 +155,8 @@ ALLOWED_INSTALL_MYSQL_CLIENT_TYPES = ["mariadb", "mysql"] -PIP_VERSION = "24.0" +PIP_VERSION = "24.3.1" +UV_VERSION = "0.4.29" DEFAULT_UV_HTTP_TIMEOUT = 300 DEFAULT_WSL2_HTTP_TIMEOUT = 900 diff --git a/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py b/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py index 69703b4692b50..3aca9d51c130c 100644 --- a/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py @@ -41,12 +41,15 @@ HELM_VERSION, KIND_VERSION, PIP_VERSION, + UV_VERSION, ) +from airflow_breeze.utils.cache import check_if_cache_exists from airflow_breeze.utils.console 
import Output, get_console from airflow_breeze.utils.host_info_utils import Architecture, get_host_architecture, get_host_os from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT, BUILD_CACHE_DIR from airflow_breeze.utils.run_utils import RunCommandResult, run_command from airflow_breeze.utils.shared_options import get_dry_run, get_verbose +from airflow_breeze.utils.virtualenv_utils import create_pip_command, create_uv_command K8S_ENV_PATH = BUILD_CACHE_DIR / ".k8s-env" K8S_CLUSTERS_PATH = BUILD_CACHE_DIR / ".k8s-clusters" @@ -301,10 +304,12 @@ def _requirements_changed() -> bool: def _install_packages_in_k8s_virtualenv(): + if check_if_cache_exists("use_uv"): + command = create_uv_command(PYTHON_BIN_PATH) + else: + command = create_pip_command(PYTHON_BIN_PATH) install_command_no_constraints = [ - str(PYTHON_BIN_PATH), - "-m", - "pip", + *command, "install", "-r", str(K8S_REQUIREMENTS_PATH.resolve()), @@ -405,8 +410,9 @@ def create_virtualenv(force_venv_setup: bool) -> RunCommandResult: ) return venv_command_result get_console().print(f"[info]Reinstalling PIP version in {K8S_ENV_PATH}") + command = create_pip_command(PYTHON_BIN_PATH) pip_reinstall_result = run_command( - [str(PYTHON_BIN_PATH), "-m", "pip", "install", f"pip=={PIP_VERSION}"], + [*command, "install", f"pip=={PIP_VERSION}"], check=False, capture_output=True, ) @@ -416,8 +422,19 @@ def create_virtualenv(force_venv_setup: bool) -> RunCommandResult: f"{pip_reinstall_result.stdout}\n{pip_reinstall_result.stderr}" ) return pip_reinstall_result - get_console().print(f"[info]Installing necessary packages in {K8S_ENV_PATH}") + uv_reinstall_result = run_command( + [*command, "install", f"uv=={UV_VERSION}"], + check=False, + capture_output=True, + ) + if uv_reinstall_result.returncode != 0: + get_console().print( + f"[error]Error when updating uv to {UV_VERSION}:[/]\n" + f"{uv_reinstall_result.stdout}\n{uv_reinstall_result.stderr}" + ) + return uv_reinstall_result + 
get_console().print(f"[info]Installing necessary packages in {K8S_ENV_PATH}") install_packages_result = _install_packages_in_k8s_virtualenv() if install_packages_result.returncode == 0: if get_dry_run(): diff --git a/dev/breeze/src/airflow_breeze/utils/run_tests.py b/dev/breeze/src/airflow_breeze/utils/run_tests.py index 73cbb430817cc..b34fa3b341020 100644 --- a/dev/breeze/src/airflow_breeze/utils/run_tests.py +++ b/dev/breeze/src/airflow_breeze/utils/run_tests.py @@ -22,7 +22,7 @@ from itertools import chain from subprocess import DEVNULL -from airflow_breeze.global_constants import PIP_VERSION +from airflow_breeze.global_constants import PIP_VERSION, UV_VERSION from airflow_breeze.utils.console import Output, get_console from airflow_breeze.utils.packages import get_excluded_provider_folders, get_suspended_provider_folders from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT @@ -59,7 +59,9 @@ def verify_an_image( env["DOCKER_IMAGE"] = image_name if slim_image: env["TEST_SLIM_IMAGE"] = "true" - with create_temp_venv(pip_version=PIP_VERSION, requirements_file=DOCKER_TESTS_REQUIREMENTS) as py_exe: + with create_temp_venv( + pip_version=PIP_VERSION, uv_version=UV_VERSION, requirements_file=DOCKER_TESTS_REQUIREMENTS + ) as py_exe: command_result = run_command( [py_exe, "-m", "pytest", str(test_path), *pytest_args, *extra_pytest_args], env=env, diff --git a/dev/breeze/src/airflow_breeze/utils/virtualenv_utils.py b/dev/breeze/src/airflow_breeze/utils/virtualenv_utils.py index 0288e49b90975..3c6a175a0fcd0 100644 --- a/dev/breeze/src/airflow_breeze/utils/virtualenv_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/virtualenv_utils.py @@ -23,6 +23,7 @@ from pathlib import Path from typing import Generator +from airflow_breeze.utils.cache import check_if_cache_exists from airflow_breeze.utils.console import get_console from airflow_breeze.utils.run_utils import run_command @@ -31,10 +32,15 @@ def create_pip_command(python: str | Path) -> list[str]: return 
[python.as_posix() if hasattr(python, "as_posix") else str(python), "-m", "pip"] +def create_uv_command(python: str | Path) -> list[str]: + return [python.as_posix() if hasattr(python, "as_posix") else str(python), "-m", "uv", "pip"] + + def create_venv( venv_path: str | Path, python: str | None = None, pip_version: str | None = None, + uv_version: str | None = None, requirements_file: str | Path | None = None, ) -> str: venv_path = Path(venv_path).resolve().absolute() @@ -53,10 +59,13 @@ def create_venv( if not python_path.exists(): get_console().print(f"\n[errors]Python interpreter is not exist in path {python_path}. Exiting!\n") sys.exit(1) - pip_command = create_pip_command(python_path) + if check_if_cache_exists("use_uv"): + command = create_uv_command(python_path) + else: + command = create_pip_command(python_path) if pip_version: result = run_command( - [*pip_command, "install", f"pip=={pip_version}", "-q"], + [*command, "install", f"pip=={pip_version}", "-q"], check=False, capture_output=False, text=True, @@ -67,10 +76,23 @@ def create_venv( f"{result.stdout}\n{result.stderr}" ) sys.exit(result.returncode) + if uv_version: + result = run_command( + [*command, "install", f"uv=={uv_version}", "-q"], + check=False, + capture_output=False, + text=True, + ) + if result.returncode != 0: + get_console().print( + f"[error]Error when installing uv in {venv_path.as_posix()}[/]\n" + f"{result.stdout}\n{result.stderr}" + ) + sys.exit(result.returncode) if requirements_file: requirements_file = Path(requirements_file).absolute().as_posix() result = run_command( - [*pip_command, "install", "-r", requirements_file, "-q"], + [*command, "install", "-r", requirements_file, "-q"], check=True, capture_output=False, text=True, @@ -88,6 +110,7 @@ def create_venv( def create_temp_venv( python: str | None = None, pip_version: str | None = None, + uv_version: str | None = None, requirements_file: str | Path | None = None, prefix: str | None = None, ) -> Generator[str, None, None]: 
@@ -96,5 +119,6 @@ def create_temp_venv( Path(tmp_dir_name) / ".venv", python=python, pip_version=pip_version, + uv_version=uv_version, requirements_file=requirements_file, ) diff --git a/scripts/ci/install_breeze.sh b/scripts/ci/install_breeze.sh index 5ffd604670b0a..aa5a3160060bf 100755 --- a/scripts/ci/install_breeze.sh +++ b/scripts/ci/install_breeze.sh @@ -25,7 +25,7 @@ if [[ ${PYTHON_VERSION=} != "" ]]; then PYTHON_ARG="--python=$(which python"${PYTHON_VERSION}") " fi -python -m pip install --upgrade pip==24.0 +python -m pip install --upgrade pip==24.3.1 python -m pip install "pipx>=1.4.1" python -m pipx uninstall apache-airflow-breeze >/dev/null 2>&1 || true # shellcheck disable=SC2086 diff --git a/scripts/ci/pre_commit/update_installers.py b/scripts/ci/pre_commit/update_installers.py index a90e07d38c9f3..f55a937df0cdf 100755 --- a/scripts/ci/pre_commit/update_installers.py +++ b/scripts/ci/pre_commit/update_installers.py @@ -65,6 +65,7 @@ def get_latest_pypi_version(package_name: str) -> str: AIRFLOW_UV_PATTERN = re.compile(r"(AIRFLOW_UV_VERSION=)([0-9.]+)") AIRFLOW_UV_QUOTED_PATTERN = re.compile(r"(AIRFLOW_UV_VERSION = )(\"[0-9.]+\")") +UV_QUOTED_PATTERN = re.compile(r"(UV_VERSION = )(\"[0-9.]+\")") AIRFLOW_UV_DOC_PATTERN = re.compile(r"(\| *`AIRFLOW_UV_VERSION` *\| *)(`[0-9.]+`)( *\|)") UV_GREATER_PATTERN = re.compile(r'"(uv>=)([0-9]+)"') @@ -118,11 +119,14 @@ def replacer(match): new_content = replace_group_2_while_keeping_total_length( AIRFLOW_UV_PATTERN, uv_version, new_content ) + new_content = replace_group_2_while_keeping_total_length( + UV_GREATER_PATTERN, uv_version, new_content + ) new_content = replace_group_2_while_keeping_total_length( AIRFLOW_UV_QUOTED_PATTERN, f'"{uv_version}"', new_content ) new_content = replace_group_2_while_keeping_total_length( - UV_GREATER_PATTERN, uv_version, new_content + UV_QUOTED_PATTERN, f'"{uv_version}"', new_content ) if new_content != file_content: file.write_text(new_content) From 
acaf59dcfb88d73321b3d5650720cd299503918a Mon Sep 17 00:00:00 2001
From: Jarek Potiuk
Date: Mon, 4 Nov 2024 17:52:53 +0100
Subject: [PATCH 06/44] Make Breeze installation and reinstallation support
 both uv and pipx (#43607) (#43626)

So far `breeze` fully supported only `pipx` installation. For example, it
would reinstall itself automatically with pipx if you attempted to run it
from another workspace or checked-out repository of Airflow, and it only
provided instructions for pipx.

With this PR:

* `uv tool` is the preferred way to install breeze
* `pipx` is the PSF-governed alternative
* breeze reinstalls itself using uv, if it was previously installed with
  uv, whenever it is run from a different workspace or from an Airflow
  repo checked out in another folder

The documentation is also updated to make `uv` the recommended tool and to
describe how to install it, with `pipx` provided as an alternative.

A warning is printed when pre-commit-uv is not installed alongside
pre-commit (pre-commit-uv significantly speeds up creation of the venvs
used by pre-commit). The warning also includes instructions on how to
install it.
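[Editor's sketch, not part of the patch: the reinstall decision described above can be
illustrated as below. The installer detection and the `dev/breeze` path are assumptions
for illustration; the script only prints the command it would choose.]

```shell
# Hypothetical sketch: pick the re-install command based on which tool
# originally installed breeze. Real breeze detects this from its own
# installation metadata; here we hard-code it and only echo the command.
INSTALLER="uv"  # would be "uv" or "pipx"
if [ "$INSTALLER" = "uv" ]; then
  # uv-managed installation: reinstall from the current Airflow checkout
  echo "uv tool install --force --editable ./dev/breeze"
else
  # pipx-managed installation
  echo "pipx install --force --editable ./dev/breeze"
fi
```

Both printed commands mirror the documented ways of (re)installing breeze from
sources; which one breeze now picks depends on how it was installed originally.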
(cherry picked from commit ddc5670a8c6f2facb490d3f8de297fb7705d3887) --- .github/actions/install-pre-commit/action.yml | 50 + .../workflows/additional-ci-image-checks.yml | 8 +- .github/workflows/basic-tests.yml | 13 +- .github/workflows/build-images.yml | 23 +- .github/workflows/check-providers.yml | 5 + .github/workflows/ci.yml | 12 +- .github/workflows/finalize-tests.yml | 2 +- .github/workflows/generate-constraints.yml | 4 +- .github/workflows/integration-tests.yml | 1 + .github/workflows/k8s-tests.yml | 2 +- .github/workflows/static-checks-mypy-docs.yml | 43 +- .gitignore | 6 +- .../03_contributors_quick_start.rst | 33 + dev/breeze/doc/01_installation.rst | 25 +- dev/breeze/doc/ci/04_selective_checks.md | 2 +- .../commands/developer_commands.py | 4 +- .../commands/release_management_commands.py | 13 +- .../airflow_breeze/utils/kubernetes_utils.py | 5 +- .../src/airflow_breeze/utils/path_utils.py | 13 +- .../airflow_breeze/utils/python_versions.py | 7 +- .../src/airflow_breeze/utils/reinstall.py | 13 +- .../src/airflow_breeze/utils/run_utils.py | 22 +- .../airflow_breeze/utils/selective_checks.py | 43 +- dev/breeze/tests/test_selective_checks.py | 104 +- dev/breeze/uv.lock | 1731 +++++++++++++++++ hatch_build.py | 5 +- pyproject.toml | 2 +- .../ci/pre_commit/common_precommit_utils.py | 7 +- 28 files changed, 2043 insertions(+), 155 deletions(-) create mode 100644 .github/actions/install-pre-commit/action.yml create mode 100644 dev/breeze/uv.lock diff --git a/.github/actions/install-pre-commit/action.yml b/.github/actions/install-pre-commit/action.yml new file mode 100644 index 0000000000000..02eea2c722917 --- /dev/null +++ b/.github/actions/install-pre-commit/action.yml @@ -0,0 +1,50 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. 
The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# +--- +name: 'Install pre-commit' +description: 'Installs pre-commit and related packages' +inputs: + python-version: + description: 'Python version to use' + default: 3.9 + uv-version: + description: 'uv version to use' + default: 0.4.29 + pre-commit-version: + description: 'pre-commit version to use' + default: 4.0.1 + pre-commit-uv-version: + description: 'pre-commit-uv version to use' + default: 4.1.4 +runs: + using: "composite" + steps: + - name: Install pre-commit, uv, and pre-commit-uv + shell: bash + run: > + pip install + pre-commit==${{inputs.pre-commit-version}} + uv==${{inputs.uv-version}} + pre-commit-uv==${{inputs.pre-commit-uv-version}} + - name: Cache pre-commit envs + uses: actions/cache@v4 + with: + path: ~/.cache/pre-commit + key: "pre-commit-${{inputs.python-version}}-${{ hashFiles('.pre-commit-config.yaml') }}" + restore-keys: | + pre-commit-${{inputs.python-version}}- diff --git a/.github/workflows/additional-ci-image-checks.yml b/.github/workflows/additional-ci-image-checks.yml index ae9efdb6b0340..82b143d2f03e4 100644 --- a/.github/workflows/additional-ci-image-checks.yml +++ b/.github/workflows/additional-ci-image-checks.yml @@ -80,6 +80,10 @@ on: # yamllint disable-line rule:truthy description: "Whether to debug resources (true/false)" required: true type: string + use-uv: + description: "Whether to use uv to build the image (true/false)" + required: true + type: string 
jobs: # Push early BuildX cache to GitHub Registry in Apache repository, This cache does not wait for all the # tests to complete - it is run very early in the build process for "main" merges in order to refresh @@ -109,7 +113,7 @@ jobs: python-versions: ${{ inputs.python-versions }} branch: ${{ inputs.branch }} constraints-branch: ${{ inputs.constraints-branch }} - use-uv: "true" + use-uv: ${{ inputs.use-uv}} include-success-outputs: ${{ inputs.include-success-outputs }} docker-cache: ${{ inputs.docker-cache }} if: inputs.branch == 'main' @@ -165,6 +169,6 @@ jobs: # platform: "linux/arm64" # branch: ${{ inputs.branch }} # constraints-branch: ${{ inputs.constraints-branch }} -# use-uv: "true" +# use-uv: ${{ inputs.use-uv}} # upgrade-to-newer-dependencies: ${{ inputs.upgrade-to-newer-dependencies }} # docker-cache: ${{ inputs.docker-cache }} diff --git a/.github/workflows/basic-tests.yml b/.github/workflows/basic-tests.yml index 7ab09d1cd2fec..5141feae22380 100644 --- a/.github/workflows/basic-tests.yml +++ b/.github/workflows/basic-tests.yml @@ -232,16 +232,11 @@ jobs: - name: "Install Breeze" uses: ./.github/actions/breeze id: breeze - - name: Cache pre-commit envs - uses: actions/cache@v4 + - name: "Install pre-commit" + uses: ./.github/actions/install-pre-commit + id: pre-commit with: - path: ~/.cache/pre-commit - # yamllint disable-line rule:line-length - key: "pre-commit-${{steps.breeze.outputs.host-python-version}}-${{ hashFiles('.pre-commit-config.yaml') }}" - restore-keys: "\ - pre-commit-${{steps.breeze.outputs.host-python-version}}-\ - ${{ hashFiles('.pre-commit-config.yaml') }}\n - pre-commit-${{steps.breeze.outputs.host-python-version}}-" + python-version: ${{steps.breeze.outputs.host-python-version}} - name: Fetch incoming commit ${{ github.sha }} with its parent uses: actions/checkout@v4 with: diff --git a/.github/workflows/build-images.yml b/.github/workflows/build-images.yml index abf966faede02..55e6c5d2018b9 100644 --- 
a/.github/workflows/build-images.yml +++ b/.github/workflows/build-images.yml @@ -16,7 +16,7 @@ # under the License. # --- -name: "Build Images" +name: Build Images run-name: > Build images for ${{ github.event.pull_request.title }} ${{ github.event.pull_request._links.html.href }} on: # yamllint disable-line rule:truthy @@ -54,7 +54,7 @@ concurrency: jobs: build-info: timeout-minutes: 10 - name: "Build Info" + name: Build Info # At build-info stage we do not yet have outputs so we need to hard-code the runs-on to public runners runs-on: ["ubuntu-22.04"] env: @@ -71,6 +71,7 @@ jobs: prod-image-build: ${{ steps.selective-checks.outputs.prod-image-build }} docker-cache: ${{ steps.selective-checks.outputs.docker-cache }} default-branch: ${{ steps.selective-checks.outputs.default-branch }} + force-pip: ${{ steps.selective-checks.outputs.force-pip }} constraints-branch: ${{ steps.selective-checks.outputs.default-constraints-branch }} runs-on-as-json-default: ${{ steps.selective-checks.outputs.runs-on-as-json-default }} runs-on-as-json-public: ${{ steps.selective-checks.outputs.runs-on-as-json-public }} @@ -89,7 +90,7 @@ jobs: }}" if: github.repository == 'apache/airflow' steps: - - name: "Cleanup repo" + - name: Cleanup repo shell: bash run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - name: Discover PR merge commit @@ -154,13 +155,13 @@ jobs: # COMPOSITE ACTIONS. WE CAN RUN ANYTHING THAT IS IN THE TARGET BRANCH AND THERE IS NO RISK THAT # CODE WILL BE RUN FROM THE PR. 
#################################################################################################### - - name: "Cleanup docker" + - name: Cleanup docker run: ./scripts/ci/cleanup_docker.sh - - name: "Setup python" + - name: Setup python uses: actions/setup-python@v5 with: - python-version: 3.8 - - name: "Install Breeze" + python-version: "3.9" + - name: Install Breeze uses: ./.github/actions/breeze #################################################################################################### # WE RUN SELECTIVE CHECKS HERE USING THE TARGET COMMIT AND ITS PARENT TO BE ABLE TO COMPARE THEM @@ -202,7 +203,7 @@ jobs: pull-request-target: "true" is-committer-build: ${{ needs.build-info.outputs.is-committer-build }} push-image: "true" - use-uv: "true" + use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} image-tag: ${{ needs.build-info.outputs.image-tag }} platform: "linux/amd64" python-versions: ${{ needs.build-info.outputs.python-versions }} @@ -212,7 +213,7 @@ jobs: docker-cache: ${{ needs.build-info.outputs.docker-cache }} generate-constraints: - name: "Generate constraints" + name: Generate constraints needs: [build-info, build-ci-images] uses: ./.github/workflows/generate-constraints.yml with: @@ -245,9 +246,9 @@ jobs: pull-request-target: "true" is-committer-build: ${{ needs.build-info.outputs.is-committer-build }} push-image: "true" - use-uv: "true" + use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} image-tag: ${{ needs.build-info.outputs.image-tag }} - platform: "linux/amd64" + platform: linux/amd64 python-versions: ${{ needs.build-info.outputs.python-versions }} default-python-version: ${{ needs.build-info.outputs.default-python-version }} branch: ${{ needs.build-info.outputs.default-branch }} diff --git a/.github/workflows/check-providers.yml b/.github/workflows/check-providers.yml index 622a67fea97a1..e89d4a81faaca 100644 --- a/.github/workflows/check-providers.yml +++ b/.github/workflows/check-providers.yml @@ 
-28,6 +28,10 @@ on: # yamllint disable-line rule:truthy description: "Tag to set for the image" required: true type: string + canary-run: + description: "Whether this is a canary run" + required: true + type: string default-python-version: description: "Which version of python should be used by default" required: true @@ -209,6 +213,7 @@ jobs: PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" VERSION_SUFFIX_FOR_PYPI: "dev0" VERBOSE: "true" + CLEAN_AIRFLOW_INSTALLATION: "${{ inputs.canary-run }}" if: inputs.skip-provider-tests != 'true' steps: - name: "Cleanup repo" diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index c94518489d28a..4da5ca6c8ae8b 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -75,6 +75,7 @@ jobs: default-mysql-version: ${{ steps.selective-checks.outputs.default-mysql-version }} default-helm-version: ${{ steps.selective-checks.outputs.default-helm-version }} default-kind-version: ${{ steps.selective-checks.outputs.default-kind-version }} + force-pip: ${{ steps.selective-checks.outputs.force-pip }} full-tests-needed: ${{ steps.selective-checks.outputs.full-tests-needed }} parallel-test-types-list-as-string: >- ${{ steps.selective-checks.outputs.parallel-test-types-list-as-string }} @@ -95,7 +96,7 @@ jobs: ci-image-build: ${{ steps.selective-checks.outputs.ci-image-build }} prod-image-build: ${{ steps.selective-checks.outputs.prod-image-build }} docs-build: ${{ steps.selective-checks.outputs.docs-build }} - mypy-folders: ${{ steps.selective-checks.outputs.mypy-folders }} + mypy-checks: ${{ steps.selective-checks.outputs.mypy-checks }} needs-mypy: ${{ steps.selective-checks.outputs.needs-mypy }} needs-helm-tests: ${{ steps.selective-checks.outputs.needs-helm-tests }} needs-api-tests: ${{ steps.selective-checks.outputs.needs-api-tests }} @@ -199,7 +200,7 @@ jobs: platform: "linux/amd64" python-versions: ${{ needs.build-info.outputs.python-versions }} branch: ${{ 
needs.build-info.outputs.default-branch }} - use-uv: "true" + use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} constraints-branch: ${{ needs.build-info.outputs.default-constraints-branch }} docker-cache: ${{ needs.build-info.outputs.docker-cache }} @@ -263,6 +264,7 @@ jobs: latest-versions-only: ${{ needs.build-info.outputs.latest-versions-only }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} + use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} generate-constraints: @@ -290,7 +292,7 @@ jobs: runs-on-as-json-docs-build: ${{ needs.build-info.outputs.runs-on-as-json-docs-build }} image-tag: ${{ needs.build-info.outputs.image-tag }} needs-mypy: ${{ needs.build-info.outputs.needs-mypy }} - mypy-folders: ${{ needs.build-info.outputs.mypy-folders }} + mypy-checks: ${{ needs.build-info.outputs.mypy-checks }} python-versions-list-as-string: ${{ needs.build-info.outputs.python-versions-list-as-string }} branch: ${{ needs.build-info.outputs.default-branch }} canary-run: ${{ needs.build-info.outputs.canary-run }} @@ -304,6 +306,7 @@ jobs: ci-image-build: ${{ needs.build-info.outputs.ci-image-build }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} + docs-build: ${{ needs.build-info.outputs.docs-build }} providers: name: "Provider checks" @@ -319,6 +322,7 @@ jobs: with: runs-on-as-json-default: ${{ needs.build-info.outputs.runs-on-as-json-default }} image-tag: ${{ needs.build-info.outputs.image-tag }} + canary-run: ${{ needs.build-info.outputs.canary-run }} default-python-version: ${{ needs.build-info.outputs.default-python-version }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} 
affected-providers-list-as-string: ${{ needs.build-info.outputs.affected-providers-list-as-string }} @@ -541,7 +545,7 @@ jobs: default-python-version: ${{ needs.build-info.outputs.default-python-version }} branch: ${{ needs.build-info.outputs.default-branch }} push-image: "true" - use-uv: "true" + use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} build-provider-packages: ${{ needs.build-info.outputs.default-branch == 'main' }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} chicken-egg-providers: ${{ needs.build-info.outputs.chicken-egg-providers }} diff --git a/.github/workflows/finalize-tests.yml b/.github/workflows/finalize-tests.yml index 8b392ba204664..a460dbe151a30 100644 --- a/.github/workflows/finalize-tests.yml +++ b/.github/workflows/finalize-tests.yml @@ -145,7 +145,7 @@ jobs: python-versions: ${{ inputs.python-versions }} branch: ${{ inputs.branch }} constraints-branch: ${{ inputs.constraints-branch }} - use-uv: "true" + use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} include-success-outputs: ${{ inputs.include-success-outputs }} docker-cache: ${{ inputs.docker-cache }} if: inputs.canary-run == 'true' diff --git a/.github/workflows/generate-constraints.yml b/.github/workflows/generate-constraints.yml index 207fd4339c8db..d6e536dfd091a 100644 --- a/.github/workflows/generate-constraints.yml +++ b/.github/workflows/generate-constraints.yml @@ -95,7 +95,7 @@ jobs: timeout-minutes: 25 run: > breeze release-management generate-constraints --run-in-parallel - --airflow-constraints-mode constraints-no-providers --answer yes + --airflow-constraints-mode constraints-no-providers --answer yes --parallelism 3 # The no providers constraints are only needed when we want to update constraints (in canary builds) # They slow down the start of PROD image builds so we want to only run them when needed. 
if: inputs.generate-no-providers-constraints == 'true' @@ -115,7 +115,7 @@ jobs: run: > breeze release-management generate-constraints --run-in-parallel --airflow-constraints-mode constraints --answer yes - --chicken-egg-providers "${{ inputs.chicken-egg-providers }}" + --chicken-egg-providers "${{ inputs.chicken-egg-providers }}" --parallelism 3 - name: "Dependency upgrade summary" shell: bash run: | diff --git a/.github/workflows/integration-tests.yml b/.github/workflows/integration-tests.yml index e831350f5b186..530d0f9fc5636 100644 --- a/.github/workflows/integration-tests.yml +++ b/.github/workflows/integration-tests.yml @@ -59,6 +59,7 @@ on: # yamllint disable-line rule:truthy jobs: tests-integration: timeout-minutes: 130 + if: inputs.testable-integrations != '[]' name: "Integration Tests: ${{ matrix.integration }}" runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} strategy: diff --git a/.github/workflows/k8s-tests.yml b/.github/workflows/k8s-tests.yml index 9a764e88c4e99..3b3e067038db9 100644 --- a/.github/workflows/k8s-tests.yml +++ b/.github/workflows/k8s-tests.yml @@ -101,7 +101,7 @@ jobs: k8s-env-${{ steps.breeze.outputs.host-python-version }}-\ ${{ hashFiles('scripts/ci/kubernetes/k8s_requirements.txt','hatch_build.py') }}" - name: "Switch breeze to use uv" - run: breeze setup-config --use-uv + run: breeze setup config --use-uv if: inputs.use-uv == 'true' - name: Run complete K8S tests ${{ inputs.kubernetes-combos-list-as-string }} run: breeze k8s run-complete-tests --run-in-parallel --upgrade --no-copy-local-sources diff --git a/.github/workflows/static-checks-mypy-docs.yml b/.github/workflows/static-checks-mypy-docs.yml index 9a1e4ac4ac7f9..be2c4f8e28645 100644 --- a/.github/workflows/static-checks-mypy-docs.yml +++ b/.github/workflows/static-checks-mypy-docs.yml @@ -36,7 +36,7 @@ on: # yamllint disable-line rule:truthy description: "Whether to run mypy checks (true/false)" required: true type: string - mypy-folders: + mypy-checks: description: 
"List of folders to run mypy checks on" required: false type: string @@ -92,6 +92,10 @@ on: # yamllint disable-line rule:truthy description: "Whether to debug resources (true/false)" required: true type: string + docs-build: + description: "Whether to build docs (true/false)" + required: true + type: string jobs: static-checks: timeout-minutes: 45 @@ -122,14 +126,11 @@ jobs: - name: "Prepare breeze & CI image: ${{ inputs.default-python-version}}:${{ inputs.image-tag }}" uses: ./.github/actions/prepare_breeze_and_image id: breeze - - name: Cache pre-commit envs - uses: actions/cache@v4 + - name: "Install pre-commit" + uses: ./.github/actions/install-pre-commit + id: pre-commit with: - path: ~/.cache/pre-commit - # yamllint disable-line rule:line-length - key: "pre-commit-${{steps.breeze.outputs.host-python-version}}-${{ hashFiles('.pre-commit-config.yaml') }}" - restore-keys: | - pre-commit-${{steps.breeze.outputs.host-python-version}}- + python-version: ${{steps.breeze.outputs.host-python-version}} - name: "Static checks" run: breeze static-checks --all-files --show-diff-on-failure --color always --initialize-environment env: @@ -148,7 +149,7 @@ jobs: strategy: fail-fast: false matrix: - mypy-folder: ${{ fromJSON(inputs.mypy-folders) }} + mypy-check: ${{ fromJSON(inputs.mypy-checks) }} env: PYTHON_MAJOR_MINOR_VERSION: "${{inputs.default-python-version}}" IMAGE_TAG: "${{ inputs.image-tag }}" @@ -166,10 +167,13 @@ jobs: - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}:${{ inputs.image-tag }}" uses: ./.github/actions/prepare_breeze_and_image id: breeze - - name: "MyPy checks for ${{ matrix.mypy-folder }}" - run: | - pip install pre-commit - pre-commit run --color always --verbose --hook-stage manual mypy-${{matrix.mypy-folder}} --all-files + - name: "Install pre-commit" + uses: ./.github/actions/install-pre-commit + id: pre-commit + with: + python-version: ${{steps.breeze.outputs.host-python-version}} + - name: "MyPy checks for ${{ 
matrix.mypy-check }}" + run: pre-commit run --color always --verbose --hook-stage manual ${{matrix.mypy-check}} --all-files env: VERBOSE: "false" COLUMNS: "250" @@ -182,6 +186,7 @@ jobs: timeout-minutes: 150 name: "Build documentation" runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} + if: inputs.docs-build == 'true' strategy: fail-fast: false matrix: @@ -231,8 +236,6 @@ jobs: timeout-minutes: 150 name: "Publish documentation" needs: build-docs - # For canary runs we need to push documentation to AWS S3 and preparing it takes a lot of space - # So we should use self-hosted ASF runners for this runs-on: ${{ fromJSON(inputs.runs-on-as-json-docs-build) }} env: GITHUB_REPOSITORY: ${{ github.repository }} @@ -259,16 +262,22 @@ jobs: with: name: airflow-docs path: './docs/_build' + - name: Check disk space available + run: df -h + - name: Create /mnt/airflow-site directory + run: sudo mkdir -p /mnt/airflow-site && sudo chown -R "${USER}" /mnt/airflow-site - name: "Clone airflow-site" run: > - git clone https://github.com/apache/airflow-site.git ${GITHUB_WORKSPACE}/airflow-site && - echo "AIRFLOW_SITE_DIRECTORY=${GITHUB_WORKSPACE}/airflow-site" >> "$GITHUB_ENV" + git clone https://github.com/apache/airflow-site.git /mnt/airflow-site/airflow-site && + echo "AIRFLOW_SITE_DIRECTORY=/mnt/airflow-site/airflow-site" >> "$GITHUB_ENV" - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}:${{ inputs.image-tag }}" uses: ./.github/actions/prepare_breeze_and_image - name: "Publish docs" run: > breeze release-management publish-docs --override-versioned --run-in-parallel ${{ inputs.docs-list-as-string }} + - name: Check disk space available + run: df -h - name: "Generate back references for providers" run: breeze release-management add-back-references all-providers - name: "Generate back references for apache-airflow" diff --git a/.gitignore b/.gitignore index 0c94718749f57..5a1a0446b74d6 100644 --- a/.gitignore +++ b/.gitignore @@ -247,7 +247,5 @@ 
licenses/LICENSES-ui.txt # airflow-build-dockerfile and correconding ignore file airflow-build-dockerfile* -# Airflow 3 files -# These directories are ignored so someone can develop on both of them without deleting files manually -airflow/ui -task_sdk +# Temporary ignore uv.lock until we integrate it fully in our constraint preparation mechanism +/uv.lock diff --git a/contributing-docs/03_contributors_quick_start.rst b/contributing-docs/03_contributors_quick_start.rst index eb84bb668a78b..e663cf2ed19f1 100644 --- a/contributing-docs/03_contributors_quick_start.rst +++ b/contributing-docs/03_contributors_quick_start.rst @@ -451,6 +451,38 @@ tests are applied when you commit your code. To avoid burden on CI infrastructure and to save time, Pre-commit hooks can be run locally before committing changes. +.. note:: + + We have recently started to recommend ``uv`` for our local development. Currently (October 2024) ``uv`` + speeds up installation more than 10x compared to ``pip``. While we still describe ``pip`` and ``pipx`` + below, we also show the ``uv`` alternatives. + +.. note:: + + Remember to have global python set to Python >= 3.9 - Python 3.8 is end-of-life already and we've + started to use Python 3.9+ features in Airflow and accompanying scripts. + + +Installing pre-commit is best done with ``pipx``: + +.. code-block:: bash + + pipx install pre-commit + +You can still add ``uv`` support for pre-commit if you use ``pipx``, using these commands: + +.. code-block:: bash + + pipx install pre-commit + pipx inject pre-commit pre-commit-uv + +Also, if you already use ``uvx`` instead of ``pipx``, use this command: + +.. code-block:: bash + + uv tool install pre-commit --with pre-commit-uv --force-reinstall + + 1.
Installing required packages on Debian / Ubuntu, install via diff --git a/dev/breeze/doc/01_installation.rst b/dev/breeze/doc/01_installation.rst index 6ff68d2bb6455..aad8640e7f60c 100644 --- a/dev/breeze/doc/01_installation.rst +++ b/dev/breeze/doc/01_installation.rst @@ -151,13 +151,28 @@ Docker in WSL 2 If VS Code is installed on the Windows host system then in the WSL Linux Distro you can run ``code .`` in the root directory of you Airflow repo to launch VS Code. -The pipx tool -------------- +The uv tool +----------- + +We recommend using the ``uv`` tool to manage your virtual environments and generally as a Swiss-army knife +for your Python environment (it supports installing various versions of Python, creating virtual environments, +installing packages, managing workspaces and running development tools). + +Installing ``uv`` is described in the `uv documentation `_. +We highly recommend using ``uv`` to manage your Python environments, as it is very comprehensive, +easy to use, faster than any of the other tools available (way faster!) and has a lot of features +that make it easier to work with Python. + +Alternative: pipx tool +---------------------- -We are using ``pipx`` tool to install and manage Breeze. The ``pipx`` tool is created by the creators +However, we do not want to be entirely dependent on ``uv``, as it is software governed by a VC-backed vendor, +so we always want to provide open-source-governed alternatives for our tools. If you can't or do not want to +use ``uv``, we've got you covered. Another tool you can use to manage development tools (and the ``breeze`` +development environment) is the Python-Software-Foundation-managed ``pipx``. The ``pipx`` tool is created by the creators of ``pip`` from `Python Packaging Authority `_ -Note that ``pipx`` >= 1.4.1 is used. +Note that ``pipx`` >= 1.4.1 should be used. Install pipx @@ -172,7 +187,7 @@ environments. This can be done automatically by the following command (follow in
This can be done automatically by the following command (follow in pipx ensurepath -In Mac +In case ``pipx`` is not in your PATH, you can run it with Python module: .. code-block:: bash diff --git a/dev/breeze/doc/ci/04_selective_checks.md b/dev/breeze/doc/ci/04_selective_checks.md index 819633d4c59ee..3f8d8a97fae03 100644 --- a/dev/breeze/doc/ci/04_selective_checks.md +++ b/dev/breeze/doc/ci/04_selective_checks.md @@ -201,7 +201,7 @@ Github Actions to pass the list of parameters to a command to execute | kubernetes-combos-list-as-string | All combinations of Python version and Kubernetes version to use for tests as space-separated string | 3.8-v1.25.2 3.9-v1.26.4 | * | | kubernetes-versions | All Kubernetes versions to use for tests as JSON array | ['v1.25.2'] | | | kubernetes-versions-list-as-string | All Kubernetes versions to use for tests as space-separated string | v1.25.2 | * | -| mypy-folders | List of folders to be considered for mypy | [] | | +| mypy-checks | List of folders to be considered for mypy | [] | | | mysql-exclude | Which versions of MySQL to exclude for tests as JSON array | [] | | | mysql-versions | Which versions of MySQL to use for tests as JSON array | ['5.7'] | | | needs-api-codegen | Whether "api-codegen" are needed to run ("true"/"false") | true | | diff --git a/dev/breeze/src/airflow_breeze/commands/developer_commands.py b/dev/breeze/src/airflow_breeze/commands/developer_commands.py index 91f5e08060eb7..a32faa7c9ad15 100644 --- a/dev/breeze/src/airflow_breeze/commands/developer_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/developer_commands.py @@ -836,7 +836,7 @@ def static_checks( for attempt in range(1, 1 + max_initialization_attempts): get_console().print(f"[info]Attempt number {attempt} to install pre-commit environments") initialization_result = run_command( - [sys.executable, "-m", "pre_commit", "install", "--install-hooks"], + ["pre-commit", "install", "--install-hooks"], check=False, 
no_output_dump_on_exception=True, text=True, @@ -849,7 +849,7 @@ def static_checks( get_console().print("[error]Could not install pre-commit environments[/]") sys.exit(return_code) - command_to_execute = [sys.executable, "-m", "pre_commit", "run"] + command_to_execute = ["pre-commit", "run"] if not one_or_none_set([last_commit, commit_ref, only_my_changes, all_files]): get_console().print( "\n[error]You can only specify " diff --git a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py index 648a13d424658..ce9ec44e57960 100644 --- a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py @@ -229,13 +229,14 @@ class VersionedFile(NamedTuple): AIRFLOW_PIP_VERSION = "24.3.1" AIRFLOW_UV_VERSION = "0.4.29" AIRFLOW_USE_UV = False -WHEEL_VERSION = "0.36.2" -GITPYTHON_VERSION = "3.1.40" -RICH_VERSION = "13.7.0" -NODE_VERSION = "21.2.0" +# TODO: automate these as well +WHEEL_VERSION = "0.44.0" +GITPYTHON_VERSION = "3.1.43" +RICH_VERSION = "13.9.4" +NODE_VERSION = "22.2.0" PRE_COMMIT_VERSION = "3.5.0" -HATCH_VERSION = "1.9.1" -PYYAML_VERSION = "6.0.1" +HATCH_VERSION = "1.13.0" +PYYAML_VERSION = "6.0.2" AIRFLOW_BUILD_DOCKERFILE = f""" FROM python:{DEFAULT_PYTHON_MAJOR_MINOR_VERSION}-slim-{ALLOWED_DEBIAN_VERSIONS[0]} diff --git a/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py b/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py index 3aca9d51c130c..b9bdc5302bdfc 100644 --- a/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py @@ -391,10 +391,7 @@ def create_virtualenv(force_venv_setup: bool) -> RunCommandResult: "[info]You can uninstall breeze and install it again with earlier Python " "version.
For example:[/]\n" ) - get_console().print("pipx reinstall --python PYTHON_PATH apache-airflow-breeze\n") - get_console().print( - f"[info]PYTHON_PATH - path to your Python binary(< {higher_python_version_tuple})[/]\n" - ) + get_console().print("[info]Then recreate your k8s virtualenv with:[/]\n") get_console().print("breeze k8s setup-env --force-venv-setup\n") sys.exit(1) diff --git a/dev/breeze/src/airflow_breeze/utils/path_utils.py b/dev/breeze/src/airflow_breeze/utils/path_utils.py index 8c3c7814bb351..b86cb837cbe46 100644 --- a/dev/breeze/src/airflow_breeze/utils/path_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/path_utils.py @@ -167,8 +167,9 @@ def reinstall_if_setup_changed() -> bool: return False if "apache-airflow-breeze" in e.msg: print( - """Missing Package `apache-airflow-breeze`. - Use `pipx install -e ./dev/breeze` to install the package.""" + """Missing Package `apache-airflow-breeze`. Please install it.\n + Use `uv tool install -e ./dev/breeze` or `pipx install -e ./dev/breeze` + to install the package.""" ) return False sources_hash = get_installation_sources_config_metadata_hash() @@ -224,10 +225,10 @@ def get_used_airflow_sources() -> Path: @lru_cache(maxsize=None) def find_airflow_sources_root_to_operate_on() -> Path: """ - Find the root of airflow sources we operate on. Handle the case when Breeze is installed via `pipx` from - a different source tree, so it searches upwards of the current directory to find the right root of - airflow directory we are actually in. This **might** be different than the sources of Airflow Breeze - was installed from. + Find the root of airflow sources we operate on. Handle the case when Breeze is installed via + `pipx` or `uv tool` from a different source tree, so it searches upwards of the current directory + to find the right root of airflow directory we are actually in. This **might** be different + than the sources Airflow Breeze was installed from.
If not found, we operate on Airflow sources that we were installed it. This handles the case when we run Breeze from a "random" directory. diff --git a/dev/breeze/src/airflow_breeze/utils/python_versions.py b/dev/breeze/src/airflow_breeze/utils/python_versions.py index 3571bebb245dc..b8807e66bf87c 100644 --- a/dev/breeze/src/airflow_breeze/utils/python_versions.py +++ b/dev/breeze/src/airflow_breeze/utils/python_versions.py @@ -56,8 +56,9 @@ def check_python_version(): if error: get_console().print( "[warning]Please reinstall Breeze using Python 3.9 - 3.11 environment.[/]\n\n" - "For example:\n\n" - "pipx uninstall apache-airflow-breeze\n" - "pipx install --python $(which python3.9) -e ./dev/breeze --force\n" + "If you are using uv:\n\n" + " uv tool install --force --reinstall --python 3.9 -e ./dev/breeze\n\n" + "If you are using pipx:\n\n" + " pipx install --python $(which python3.9) --force -e ./dev/breeze\n" ) sys.exit(1) diff --git a/dev/breeze/src/airflow_breeze/utils/reinstall.py b/dev/breeze/src/airflow_breeze/utils/reinstall.py index de3da92855430..6165c8a307201 100644 --- a/dev/breeze/src/airflow_breeze/utils/reinstall.py +++ b/dev/breeze/src/airflow_breeze/utils/reinstall.py @@ -27,15 +27,24 @@ def reinstall_breeze(breeze_sources: Path, re_run: bool = True): """ - Reinstalls Breeze from specified sources. + Re-installs Breeze from specified sources. :param breeze_sources: Sources where to install Breeze from. :param re_run: whether to re-run the original command that breeze was run with. """ + # First check if `breeze` is installed with uv and if it is, reinstall it using uv + # If not - we assume pipx is used and we reinstall it using pipx # Note that we cannot use `pipx upgrade` here because we sometimes install # Breeze from different sources than originally installed (i.e. when we reinstall airflow # From the current directory. 
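The comments added to `reinstall_breeze` above describe a uv-first decision: reinstall with `uv` only when Breeze shows up in `uv tool list`, otherwise fall back to `pipx`. As a rough illustration (not Breeze's actual code), that decision can be isolated into a pure function over the command output, where `None` stands for `uv` being unavailable; the name `pick_installer` is made up for this sketch:

```python
def pick_installer(uv_tool_list_output):
    """Decide which tool should reinstall Breeze.

    `uv_tool_list_output` is the stdout of a successful `uv tool list`
    run, or None when `uv` is missing or the command failed.
    """
    if uv_tool_list_output is not None and "apache-airflow-breeze" in uv_tool_list_output:
        return "uv"  # Breeze was installed as a uv tool, so reinstall with uv
    return "pipx"  # otherwise keep the previous behaviour and use pipx


# Breeze listed among uv tools -> reinstall via uv
assert pick_installer("apache-airflow-breeze v0.1.0\n- breeze\n") == "uv"
# uv present but Breeze not installed with it -> fall back to pipx
assert pick_installer("some-other-tool v1.0.0\n") == "pipx"
# uv not available at all -> pipx
assert pick_installer(None) == "pipx"
```

Keeping the check as a substring test on the tool list (rather than `pipx upgrade`) matches the comment in the diff: the sources being reinstalled may differ from the ones Breeze was originally installed from.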
get_console().print(f"\n[info]Reinstalling Breeze from {breeze_sources}\n") - subprocess.check_call(["pipx", "install", "-e", str(breeze_sources), "--force"]) + result = subprocess.run(["uv", "tool", "list"], text=True, capture_output=True, check=False) + if result.returncode == 0: + if "apache-airflow-breeze" in result.stdout: + subprocess.check_call( + ["uv", "tool", "install", "--force", "--reinstall", "-e", breeze_sources.as_posix()] + ) + else: + subprocess.check_call(["pipx", "install", "-e", breeze_sources.as_posix(), "--force"]) if re_run: # Make sure we don't loop forever if the metadata hash hasn't been updated yet (else it is tricky to # run pre-commit checks via breeze!) diff --git a/dev/breeze/src/airflow_breeze/utils/run_utils.py b/dev/breeze/src/airflow_breeze/utils/run_utils.py index 6c72f85671cf5..f98eedad937a3 100644 --- a/dev/breeze/src/airflow_breeze/utils/run_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/run_utils.py @@ -212,14 +212,14 @@ def assert_pre_commit_installed(): python_executable = sys.executable get_console().print(f"[info]Checking pre-commit installed for {python_executable}[/]") command_result = run_command( - [python_executable, "-m", "pre_commit", "--version"], + ["pre-commit", "--version"], capture_output=True, text=True, check=False, ) if command_result.returncode == 0: if command_result.stdout: - pre_commit_version = command_result.stdout.split(" ")[-1].strip() + pre_commit_version = command_result.stdout.split(" ")[1].strip() if Version(pre_commit_version) >= Version(min_pre_commit_version): get_console().print( f"\n[success]Package pre_commit is installed. 
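The `--version` parsing change in `run_utils.py` above switches from the last whitespace token (`[-1]`) to the second one (`[1]`). As I read the diff, the reason is that with `pre-commit-uv` injected, `pre-commit --version` prints extra text after the version number, so the last token stops being the version. A small sketch; the exact output strings below are assumptions for illustration, not captured from a real run:

```python
def parse_pre_commit_version(stdout):
    # "pre-commit <version>" always starts the line, so the version is the
    # second whitespace-separated token, regardless of whether
    # pre-commit-uv appends anything after it.
    return stdout.split(" ")[1].strip()


# Plain pre-commit output (assumed shape): both indices would work here.
assert parse_pre_commit_version("pre-commit 3.5.0\n") == "3.5.0"
# With pre-commit-uv installed, the trailing token is no longer the
# version (assumed shape), which is why `[-1]` was wrong:
output = "pre-commit 3.5.0 (pre-commit-uv=4.1.3, uv=0.4.29)\n"
assert parse_pre_commit_version(output) == "3.5.0"
assert output.split(" ")[-1].strip() != "3.5.0"
```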
" @@ -231,6 +231,19 @@ f"aat least {min_pre_commit_version} and is {pre_commit_version}.[/]\n\n" ) sys.exit(1) + if "pre-commit-uv" not in command_result.stdout: + get_console().print( + "\n[warning]You can significantly improve speed of installing your pre-commit envs " + "by installing `pre-commit-uv` with it.[/]\n" + ) + get_console().print( + "\n[warning]With uv you can install it with:[/]\n\n" + " uv tool install pre-commit --with pre-commit-uv --force-reinstall\n" + ) + get_console().print( + "\n[warning]With pipx you can install it with:[/]\n\n" + " pipx inject pre-commit pre-commit-uv\n" + ) else: get_console().print( "\n[warning]Could not determine version of pre-commit. You might need to update it![/]\n" @@ -450,9 +463,7 @@ def run_compile_www_assets( "[info]However, it requires you to have local yarn installation.\n" ) command_to_execute = [ - sys.executable, - "-m", - "pre_commit", + "pre-commit", "run", "--hook-stage", "manual", diff --git a/dev/breeze/src/airflow_breeze/utils/selective_checks.py b/dev/breeze/src/airflow_breeze/utils/selective_checks.py index ee077aefe24d2..f09c74579191f 100644 --- a/dev/breeze/src/airflow_breeze/utils/selective_checks.py +++ b/dev/breeze/src/airflow_breeze/utils/selective_checks.py @@ -73,6 +73,7 @@ DEBUG_CI_RESOURCES_LABEL = "debug ci resources" DEFAULT_VERSIONS_ONLY_LABEL = "default versions only" DISABLE_IMAGE_CACHE_LABEL = "disable image cache" +FORCE_PIP_LABEL = "force pip" FULL_TESTS_NEEDED_LABEL = "full tests needed" INCLUDE_SUCCESS_OUTPUTS_LABEL = "include success outputs" LATEST_VERSIONS_ONLY_LABEL = "latest versions only" @@ -614,41 +615,41 @@ def _should_be_run(self, source_area: FileGroupForCi) -> bool: return False @cached_property - def mypy_folders(self) -> list[str]: - folders_to_check: list[str] = [] + def mypy_checks(self) -> list[str]: + checks_to_run: list[str] = [] if ( self._matching_files( FileGroupForCi.ALL_AIRFLOW_PYTHON_FILES,
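The `mypy_folders` → `mypy_checks` rename that begins above changes the property to return complete pre-commit hook ids (`mypy-airflow`, `mypy-providers`, ...) instead of bare folder names. A much-simplified sketch of that mapping follows; the prefix matching is a hypothetical stand-in for Breeze's real `FileGroupForCi` matcher, and it ignores the default-branch check applied to providers:

```python
def mypy_checks(changed_files, full_tests_needed=False):
    # Each (prefix, hook-id) pair stands in for one of the file-group
    # matches in the real implementation.
    groups = [
        ("airflow/", "mypy-airflow"),
        ("providers/", "mypy-providers"),
        ("docs/", "mypy-docs"),
        ("dev/", "mypy-dev"),
    ]
    return [
        hook_id
        for prefix, hook_id in groups
        if full_tests_needed or any(f.startswith(prefix) for f in changed_files)
    ]


assert mypy_checks(["airflow/models/dag.py"]) == ["mypy-airflow"]
assert mypy_checks(["dev/breeze/setup.py", "docs/conf.py"]) == ["mypy-docs", "mypy-dev"]
assert mypy_checks([], full_tests_needed=True) == [
    "mypy-airflow", "mypy-providers", "mypy-docs", "mypy-dev",
]
# needs_mypy then reduces to "is the list non-empty"
assert mypy_checks([]) == []
```

Returning full hook ids lets the CI matrix run `pre-commit run ... mypy-<name>` directly, without reassembling the hook id from a folder name in the workflow.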
CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES ) or self.full_tests_needed ): - folders_to_check.append("airflow") + checks_to_run.append("mypy-airflow") if ( self._matching_files( FileGroupForCi.ALL_PROVIDERS_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES ) or self._are_all_providers_affected() ) and self._default_branch == "main": - folders_to_check.append("providers") + checks_to_run.append("mypy-providers") if ( self._matching_files( FileGroupForCi.ALL_DOCS_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES ) or self.full_tests_needed ): - folders_to_check.append("docs") + checks_to_run.append("mypy-docs") if ( self._matching_files( FileGroupForCi.ALL_DEV_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES ) or self.full_tests_needed ): - folders_to_check.append("dev") - return folders_to_check + checks_to_run.append("mypy-dev") + return checks_to_run @cached_property def needs_mypy(self) -> bool: - return self.mypy_folders != [] + return self.mypy_checks != [] @cached_property def needs_python_scans(self) -> bool: @@ -697,7 +698,16 @@ def run_tests(self) -> bool: @cached_property def ci_image_build(self) -> bool: - return self.run_tests or self.docs_build or self.run_kubernetes_tests or self.needs_helm_tests + # in case pyproject.toml changed, CI image should be built - even if no build dependencies + # changed, because some of our tests - those that need the CI image - might need to be run depending on + # changed rules for static checks that are part of the pyproject.toml file + return ( + self.run_tests + or self.docs_build + or self.run_kubernetes_tests + or self.needs_helm_tests + or self.pyproject_toml_changed + ) @cached_property def prod_image_build(self) -> bool: @@ -861,7 +871,9 @@ def separate_test_types_list_as_string(self) -> str | None: current_test_types = set(self._get_test_types_to_run(split_to_individual_providers=True)) if "Providers" in current_test_types: current_test_types.remove("Providers") -
current_test_types.update({f"Providers[{provider}]" for provider in get_available_packages()}) + current_test_types.update( + {f"Providers[{provider}]" for provider in get_available_packages(include_not_ready=True)} + ) if self.skip_provider_tests: current_test_types = { test_type for test_type in current_test_types if not test_type.startswith("Providers") @@ -896,6 +908,8 @@ def pyproject_toml_changed(self) -> bool: if not self._commit_ref: get_console().print("[warning]Cannot determine pyproject.toml changes as commit is missing[/]") return False + if "pyproject.toml" not in self._files: + return False new_result = run_command( ["git", "show", f"{self._commit_ref}:pyproject.toml"], capture_output=True, @@ -1154,10 +1168,11 @@ def runs_on_as_json_self_hosted_asf(self) -> str: @cached_property def runs_on_as_json_docs_build(self) -> str: - if self._is_canary_run(): - return RUNS_ON_SELF_HOSTED_ASF_RUNNER - else: - return RUNS_ON_PUBLIC_RUNNER + # We used to run the docs build on self-hosted runners because they had more space, but + # it turned out that public runners have a lot of space in the /mnt folder that we can utilise; + # in the future we might want to switch back to self-hosted runners, so we have this + # separate property to determine that, and a place to implement different logic if needed + return RUNS_ON_PUBLIC_RUNNER @cached_property def runs_on_as_json_public(self) -> str: diff --git a/dev/breeze/tests/test_selective_checks.py b/dev/breeze/tests/test_selective_checks.py index 6bee6bbc7e308..551374bd35502 100644 --- a/dev/breeze/tests/test_selective_checks.py +++ b/dev/breeze/tests/test_selective_checks.py @@ -125,7 +125,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "providers-test-types-list-as-string": None, "separate-test-types-list-as-string": None, "needs-mypy": "false", - "mypy-folders": "[]", + "mypy-checks": "[]", }, id="No tests on simple change", ) @@ -152,7 +152,7 @@ def
assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "providers-test-types-list-as-string": "Providers[fab]", "separate-test-types-list-as-string": "API Always Providers[fab]", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Only API tests and DOCS and FAB provider should run", ) @@ -177,7 +177,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "parallel-test-types-list-as-string": "API Always", "separate-test-types-list-as-string": "API Always", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Only API tests and DOCS should run (no provider tests) when only internal_api changed", ) @@ -202,7 +202,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "parallel-test-types-list-as-string": "API Always", "separate-test-types-list-as-string": "API Always", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Only API tests should run (no provider tests) and no DOCs build when only test API files changed", ) @@ -229,7 +229,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "providers-test-types-list-as-string": "", "separate-test-types-list-as-string": "Always Operators", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Only Operator tests and DOCS should run", ) @@ -258,7 +258,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "separate-test-types-list-as-string": "Always BranchExternalPython BranchPythonVenv " "ExternalPython Operators PythonVenv", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Only Python tests", ) @@ -285,7 +285,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "providers-test-types-list-as-string": "", 
"separate-test-types-list-as-string": "Always Serialization", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Only Serialization tests", ) @@ -320,7 +320,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "Providers[fab] Providers[google] Providers[openlineage] Providers[pgvector] " "Providers[postgres]", "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers']", + "mypy-checks": "['mypy-airflow', 'mypy-providers']", }, id="API and providers tests and docs should run", ) @@ -348,7 +348,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "providers-test-types-list-as-string": "Providers[apache.beam] Providers[google]", "separate-test-types-list-as-string": "Always Providers[apache.beam] Providers[google]", "needs-mypy": "true", - "mypy-folders": "['providers']", + "mypy-checks": "['mypy-providers']", }, id="Selected Providers and docs should run", ) @@ -375,7 +375,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "parallel-test-types-list-as-string": None, "providers-test-types-list-as-string": None, "needs-mypy": "false", - "mypy-folders": "[]", + "mypy-checks": "[]", }, id="Only docs builds should run - no tests needed", ) @@ -407,7 +407,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "providers-test-types-list-as-string": "Providers[amazon] " "Providers[common.sql,openlineage,pgvector,postgres] Providers[google]", "needs-mypy": "true", - "mypy-folders": "['providers']", + "mypy-checks": "['mypy-providers']", }, id="Helm tests, providers (both upstream and downstream)," "kubernetes tests and docs should run", @@ -443,7 +443,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "Providers[apache.livy] Providers[dbt.cloud] " "Providers[dingding] Providers[discord] Providers[http]", "needs-mypy": "true", - "mypy-folders": "['providers']", + 
"mypy-checks": "['mypy-providers']", }, id="Helm tests, http and all relevant providers, kubernetes tests and " "docs should run even if unimportant files were added", @@ -474,7 +474,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "parallel-test-types-list-as-string": "Always Providers[airbyte,http]", "providers-test-types-list-as-string": "Providers[airbyte,http]", "needs-mypy": "true", - "mypy-folders": "['providers']", + "mypy-checks": "['mypy-providers']", }, id="Helm tests, airbyte/http providers, kubernetes tests and " "docs should run even if unimportant files were added", @@ -506,7 +506,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "parallel-test-types-list-as-string": "Always", "providers-test-types-list-as-string": "", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Docs should run even if unimportant files were added and prod image " "should be build for chart changes", @@ -533,7 +533,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="Everything should run - including all providers and upgrading to " "newer requirements as pyproject.toml changed and all Python versions", @@ -560,7 +560,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="Everything should 
run and upgrading to newer requirements as dependencies change", ) @@ -588,7 +588,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "Providers[apache.hive,cncf.kubernetes,common.compat,common.sql,exasol,ftp,http," "imap,microsoft.azure,mongo,mysql,openlineage,postgres,salesforce,ssh,teradata] Providers[google]", "needs-mypy": "true", - "mypy-folders": "['providers']", + "mypy-checks": "['mypy-providers']", }, id="Providers tests run including amazon tests if amazon provider files changed", ), @@ -611,7 +611,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "upgrade-to-newer-dependencies": "false", "parallel-test-types-list-as-string": "Always Providers[airbyte,http]", "needs-mypy": "true", - "mypy-folders": "['providers']", + "mypy-checks": "['mypy-providers']", }, id="Providers tests run without amazon tests if no amazon file changed", ), @@ -638,7 +638,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "Providers[apache.hive,cncf.kubernetes,common.compat,common.sql,exasol,ftp,http," "imap,microsoft.azure,mongo,mysql,openlineage,postgres,salesforce,ssh,teradata] Providers[google]", "needs-mypy": "true", - "mypy-folders": "['providers']", + "mypy-checks": "['mypy-providers']", }, id="Providers tests run including amazon tests if amazon provider files changed", ), @@ -665,7 +665,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "upgrade-to-newer-dependencies": "false", "parallel-test-types-list-as-string": "Always Providers[common.compat,common.io,openlineage]", "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers']", + "mypy-checks": "['mypy-airflow', 'mypy-providers']", }, id="Only Always and common providers tests should run when only common.io and tests/always changed", ), @@ -688,7 +688,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "upgrade-to-newer-dependencies": "false", 
"parallel-test-types-list-as-string": "Always Core Operators Serialization", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Force Core and Serialization tests to run when airflow bash.py changed", ), @@ -711,7 +711,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "upgrade-to-newer-dependencies": "false", "parallel-test-types-list-as-string": "Always Core Operators Serialization", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Force Core and Serialization tests to run when tests bash changed", ), @@ -736,7 +736,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="All tests should be run when tests/utils/ change", ) @@ -893,7 +893,7 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="Everything should run including all providers when full tests are needed, " "and all versions are required.", @@ -927,7 +927,7 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 
'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="Everything should run including all providers when full tests are needed " "but with single python and kubernetes if `default versions only` label is set", @@ -961,7 +961,7 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="Everything should run including all providers when full tests are needed " "but with single python and kubernetes if no version label is set", @@ -996,7 +996,7 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="Everything should run including all providers when full tests are needed " "but with single python and kubernetes if `latest versions only` label is set", @@ -1031,7 +1031,7 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="Everything should run including full providers when full " "tests are needed even with different label set as well", @@ -1067,7 +1067,7 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ + LIST_OF_ALL_PROVIDER_TESTS + " 
PythonVenv Serialization WWW", "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="Everything should run including full providers when" "full tests are needed even if no files are changed", @@ -1101,7 +1101,7 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ "BranchPythonVenv CLI Core ExternalPython Operators Other PlainAsserts " "PythonVenv Serialization WWW", "needs-mypy": "true", - "mypy-folders": "['airflow', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-docs', 'mypy-dev']", }, id="Everything should run except Providers and lint pre-commit " "when full tests are needed for non-main branch", @@ -1144,7 +1144,7 @@ def test_expected_output_full_tests_needed( "skip-provider-tests": "true", "parallel-test-types-list-as-string": None, "needs-mypy": "false", - "mypy-folders": "[]", + "mypy-checks": "[]", }, id="Nothing should run if only non-important files changed", ), @@ -1171,7 +1171,7 @@ def test_expected_output_full_tests_needed( "skip-provider-tests": "true", "parallel-test-types-list-as-string": "Always", "needs-mypy": "false", - "mypy-folders": "[]", + "mypy-checks": "[]", }, id="No Helm tests, No providers no lint charts, should run if " "only chart/providers changed in non-main but PROD image should be built", @@ -1201,7 +1201,7 @@ def test_expected_output_full_tests_needed( "skip-provider-tests": "true", "parallel-test-types-list-as-string": "Always CLI", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Only CLI tests and Kubernetes tests should run if cli/chart files changed in non-main branch", ), @@ -1227,7 +1227,7 @@ def test_expected_output_full_tests_needed( "parallel-test-types-list-as-string": "API Always BranchExternalPython BranchPythonVenv " "CLI Core ExternalPython Operators Other PlainAsserts PythonVenv Serialization WWW", 
"needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="All tests except Providers and helm lint pre-commit " "should run if core file changed in non-main branch", @@ -1268,7 +1268,7 @@ def test_expected_output_pull_request_v2_7( "skip-provider-tests": "true", "parallel-test-types-list-as-string": None, "needs-mypy": "false", - "mypy-folders": "[]", + "mypy-checks": "[]", }, id="Nothing should run if only non-important files changed", ), @@ -1289,7 +1289,7 @@ def test_expected_output_pull_request_v2_7( "skip-provider-tests": "true", "parallel-test-types-list-as-string": "Always", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Only Always and docs build should run if only system tests changed", ), @@ -1324,7 +1324,7 @@ def test_expected_output_pull_request_v2_7( "hashicorp,microsoft.azure,microsoft.mssql,mysql,openlineage,oracle,postgres,presto," "salesforce,samba,sftp,ssh,trino] Providers[google]", "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers']", + "mypy-checks": "['mypy-airflow', 'mypy-providers']", }, id="CLI tests and Google-related provider tests should run if cli/chart files changed but " "prod image should be build too and k8s tests too", @@ -1352,7 +1352,7 @@ def test_expected_output_pull_request_v2_7( "skip-provider-tests": "false", "parallel-test-types-list-as-string": "API Always CLI Operators Providers[fab] WWW", "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="No providers tests except fab should run if only CLI/API/Operators/WWW file changed", ), @@ -1373,7 +1373,7 @@ def test_expected_output_pull_request_v2_7( "skip-provider-tests": "true", "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES_WITHOUT_PROVIDERS, "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Tests for all airflow core types except providers should run if model 
file changed", ), @@ -1394,7 +1394,7 @@ def test_expected_output_pull_request_v2_7( "skip-provider-tests": "true", "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES_WITHOUT_PROVIDERS, "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, id="Tests for all airflow core types except providers should run if " "any other than API/WWW/CLI/Operators file changed.", @@ -1436,7 +1436,7 @@ def test_expected_output_pull_request_target( "upgrade-to-newer-dependencies": "true", "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="All tests run on push even if unimportant file changed", ), @@ -1459,7 +1459,7 @@ def test_expected_output_pull_request_target( "parallel-test-types-list-as-string": "API Always BranchExternalPython BranchPythonVenv " "CLI Core ExternalPython Operators Other PlainAsserts PythonVenv Serialization WWW", "needs-mypy": "true", - "mypy-folders": "['airflow', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-docs', 'mypy-dev']", }, id="All tests except Providers and Helm run on push" " even if unimportant file changed in non-main branch", @@ -1482,7 +1482,7 @@ def test_expected_output_pull_request_target( "upgrade-to-newer-dependencies": "true", "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, id="All tests run on push if core file changed", ), @@ -1537,7 +1537,7 @@ def test_no_commit_provided_trigger_full_build_for_any_event_type(github_event): else "false", "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": 
"['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, str(stderr), ) @@ -1763,7 +1763,7 @@ def test_helm_tests_trigger_ci_build(files: tuple[str, ...], expected_outputs: d dict(), # TODO: revert it when we fix self-hosted runners '["ubuntu-22.04"]', - '["self-hosted", "asf-runner"]', + '["ubuntu-22.04"]', # '["self-hosted", "Linux", "X64"]', # TODO: revert it when we fix self-hosted runners "false", @@ -2115,7 +2115,7 @@ def test_provider_compatibility_checks(labels: tuple[str, ...], expected_outputs ("README.md",), { "needs-mypy": "false", - "mypy-folders": "[]", + "mypy-checks": "[]", }, "main", (), @@ -2125,7 +2125,7 @@ def test_provider_compatibility_checks(labels: tuple[str, ...], expected_outputs ("airflow/cli/file.py",), { "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, "main", (), @@ -2135,7 +2135,7 @@ def test_provider_compatibility_checks(labels: tuple[str, ...], expected_outputs ("airflow/models/file.py",), { "needs-mypy": "true", - "mypy-folders": "['airflow']", + "mypy-checks": "['mypy-airflow']", }, "main", (), @@ -2145,7 +2145,7 @@ def test_provider_compatibility_checks(labels: tuple[str, ...], expected_outputs ("airflow/providers/a_file.py",), { "needs-mypy": "true", - "mypy-folders": "['providers']", + "mypy-checks": "['mypy-providers']", }, "main", (), @@ -2155,7 +2155,7 @@ def test_provider_compatibility_checks(labels: tuple[str, ...], expected_outputs ("docs/a_file.py",), { "needs-mypy": "true", - "mypy-folders": "['docs']", + "mypy-checks": "['mypy-docs']", }, "main", (), @@ -2165,7 +2165,7 @@ def test_provider_compatibility_checks(labels: tuple[str, ...], expected_outputs ("dev/a_package/a_file.py",), { "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, "main", (), @@ -2175,7 +2175,7 @@ def test_provider_compatibility_checks(labels: tuple[str, ...], expected_outputs 
("readme.md",), { "needs-mypy": "true", - "mypy-folders": "['airflow', 'providers', 'docs', 'dev']", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, "main", ("full tests needed",), diff --git a/dev/breeze/uv.lock b/dev/breeze/uv.lock new file mode 100644 index 0000000000000..666cb37805254 --- /dev/null +++ b/dev/breeze/uv.lock @@ -0,0 +1,1731 @@ +version = 1 +requires-python = ">=3.8, <4" + +[[package]] +name = "anyio" +version = "4.5.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "idna" }, + { name = "sniffio" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/4d/f9/9a7ce600ebe7804daf90d4d48b1c0510a4561ddce43a596be46676f82343/anyio-4.5.2.tar.gz", hash = "sha256:23009af4ed04ce05991845451e11ef02fc7c5ed29179ac9a420e5ad0ac7ddc5b", size = 171293 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1b/b4/f7e396030e3b11394436358ca258a81d6010106582422f23443c16ca1873/anyio-4.5.2-py3-none-any.whl", hash = "sha256:c011ee36bc1e8ba40e5a81cb9df91925c218fe9b778554e0b56a21e1b5d4716f", size = 89766 }, +] + +[[package]] +name = "apache-airflow-breeze" +version = "0.0.1" +source = { editable = "." 
} +dependencies = [ + { name = "black" }, + { name = "click" }, + { name = "filelock" }, + { name = "flit" }, + { name = "gitpython" }, + { name = "hatch" }, + { name = "importlib-resources", marker = "python_full_version < '3.9'" }, + { name = "inputimeout" }, + { name = "jinja2" }, + { name = "jsonschema" }, + { name = "packaging" }, + { name = "pipx" }, + { name = "pre-commit" }, + { name = "psutil" }, + { name = "pygithub" }, + { name = "pytest" }, + { name = "pytest-xdist" }, + { name = "pyyaml" }, + { name = "requests" }, + { name = "rich" }, + { name = "rich-click" }, + { name = "semver" }, + { name = "tabulate" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, + { name = "twine" }, +] + +[package.metadata] +requires-dist = [ + { name = "black", specifier = ">=23.11.0" }, + { name = "click", specifier = ">=8.1.7" }, + { name = "filelock", specifier = ">=3.13.0" }, + { name = "flit", specifier = ">=3.5.0" }, + { name = "gitpython", specifier = ">=3.1.40" }, + { name = "hatch", specifier = "==1.9.4" }, + { name = "importlib-resources", marker = "python_full_version < '3.9'", specifier = ">=5.2,!=6.2.0,!=6.3.0,!=6.3.1" }, + { name = "inputimeout", specifier = ">=1.0.4" }, + { name = "jinja2", specifier = ">=3.1.0" }, + { name = "jsonschema", specifier = ">=4.19.1" }, + { name = "packaging", specifier = ">=23.2" }, + { name = "pipx", specifier = ">=1.4.1" }, + { name = "pre-commit", specifier = ">=3.5.0" }, + { name = "psutil", specifier = ">=5.9.6" }, + { name = "pygithub", specifier = ">=2.1.1" }, + { name = "pytest", specifier = ">=8.2,<9" }, + { name = "pytest-xdist", specifier = ">=3.3.1" }, + { name = "pyyaml", specifier = ">=6.0.1" }, + { name = "requests", specifier = ">=2.31.0" }, + { name = "rich", specifier = ">=13.6.0" }, + { name = "rich-click", specifier = ">=1.7.1" }, + { name = "semver", specifier = ">=3.0.2" }, + { name = "tabulate", specifier = ">=0.9.0" }, + { name = "tomli", marker = "python_full_version < '3.11'", specifier = 
">=2.0.1" }, + { name = "twine", specifier = ">=4.0.2" }, +] + +[[package]] +name = "argcomplete" +version = "3.5.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/5f/39/27605e133e7f4bb0c8e48c9a6b87101515e3446003e0442761f6a02ac35e/argcomplete-3.5.1.tar.gz", hash = "sha256:eb1ee355aa2557bd3d0145de7b06b2a45b0ce461e1e7813f5d066039ab4177b4", size = 82280 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f7/be/a606a6701d491cfae75583c80a6583f8abe9c36c0b9666e867e7cdd62fe8/argcomplete-3.5.1-py3-none-any.whl", hash = "sha256:1a1d148bdaa3e3b93454900163403df41448a248af01b6e849edc5ac08e6c363", size = 43498 }, +] + +[[package]] +name = "attrs" +version = "24.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fc/0f/aafca9af9315aee06a89ffde799a10a582fe8de76c563ee80bbcdc08b3fb/attrs-24.2.0.tar.gz", hash = "sha256:5cfb1b9148b5b086569baec03f20d7b6bf3bcacc9a42bebf87ffaaca362f6346", size = 792678 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6a/21/5b6702a7f963e95456c0de2d495f67bf5fd62840ac655dc451586d23d39a/attrs-24.2.0-py3-none-any.whl", hash = "sha256:81921eb96de3191c8258c199618104dd27ac608d9366f5e35d011eae1867ede2", size = 63001 }, +] + +[[package]] +name = "backports-tarfile" +version = "1.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/86/72/cd9b395f25e290e633655a100af28cb253e4393396264a98bd5f5951d50f/backports_tarfile-1.2.0.tar.gz", hash = "sha256:d75e02c268746e1b8144c278978b6e98e85de6ad16f8e4b0844a154557eca991", size = 86406 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b9/fa/123043af240e49752f1c4bd24da5053b6bd00cad78c2be53c0d1e8b975bc/backports.tarfile-1.2.0-py3-none-any.whl", hash = "sha256:77e284d754527b01fb1e6fa8a1afe577858ebe4e9dad8919e34c862cb399bc34", size = 30181 }, +] + +[[package]] +name = "black" +version = "24.8.0" +source = { 
registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "mypy-extensions" }, + { name = "packaging" }, + { name = "pathspec" }, + { name = "platformdirs" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/04/b0/46fb0d4e00372f4a86a6f8efa3cb193c9f64863615e39010b1477e010578/black-24.8.0.tar.gz", hash = "sha256:2500945420b6784c38b9ee885af039f5e7471ef284ab03fa35ecdde4688cd83f", size = 644810 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/47/6e/74e29edf1fba3887ed7066930a87f698ffdcd52c5dbc263eabb06061672d/black-24.8.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:09cdeb74d494ec023ded657f7092ba518e8cf78fa8386155e4a03fdcc44679e6", size = 1632092 }, + { url = "https://files.pythonhosted.org/packages/ab/49/575cb6c3faee690b05c9d11ee2e8dba8fbd6d6c134496e644c1feb1b47da/black-24.8.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:81c6742da39f33b08e791da38410f32e27d632260e599df7245cccee2064afeb", size = 1457529 }, + { url = "https://files.pythonhosted.org/packages/7a/b4/d34099e95c437b53d01c4aa37cf93944b233066eb034ccf7897fa4e5f286/black-24.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:707a1ca89221bc8a1a64fb5e15ef39cd755633daa672a9db7498d1c19de66a42", size = 1757443 }, + { url = "https://files.pythonhosted.org/packages/87/a0/6d2e4175ef364b8c4b64f8441ba041ed65c63ea1db2720d61494ac711c15/black-24.8.0-cp310-cp310-win_amd64.whl", hash = "sha256:d6417535d99c37cee4091a2f24eb2b6d5ec42b144d50f1f2e436d9fe1916fe1a", size = 1418012 }, + { url = "https://files.pythonhosted.org/packages/08/a6/0a3aa89de9c283556146dc6dbda20cd63a9c94160a6fbdebaf0918e4a3e1/black-24.8.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:fb6e2c0b86bbd43dee042e48059c9ad7830abd5c94b0bc518c0eeec57c3eddc1", size = 1615080 }, + { url = 
"https://files.pythonhosted.org/packages/db/94/b803d810e14588bb297e565821a947c108390a079e21dbdcb9ab6956cd7a/black-24.8.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:837fd281f1908d0076844bc2b801ad2d369c78c45cf800cad7b61686051041af", size = 1438143 }, + { url = "https://files.pythonhosted.org/packages/a5/b5/f485e1bbe31f768e2e5210f52ea3f432256201289fd1a3c0afda693776b0/black-24.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:62e8730977f0b77998029da7971fa896ceefa2c4c4933fcd593fa599ecbf97a4", size = 1738774 }, + { url = "https://files.pythonhosted.org/packages/a8/69/a000fc3736f89d1bdc7f4a879f8aaf516fb03613bb51a0154070383d95d9/black-24.8.0-cp311-cp311-win_amd64.whl", hash = "sha256:72901b4913cbac8972ad911dc4098d5753704d1f3c56e44ae8dce99eecb0e3af", size = 1427503 }, + { url = "https://files.pythonhosted.org/packages/a2/a8/05fb14195cfef32b7c8d4585a44b7499c2a4b205e1662c427b941ed87054/black-24.8.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:7c046c1d1eeb7aea9335da62472481d3bbf3fd986e093cffd35f4385c94ae368", size = 1646132 }, + { url = "https://files.pythonhosted.org/packages/41/77/8d9ce42673e5cb9988f6df73c1c5c1d4e9e788053cccd7f5fb14ef100982/black-24.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:649f6d84ccbae73ab767e206772cc2d7a393a001070a4c814a546afd0d423aed", size = 1448665 }, + { url = "https://files.pythonhosted.org/packages/cc/94/eff1ddad2ce1d3cc26c162b3693043c6b6b575f538f602f26fe846dfdc75/black-24.8.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2b59b250fdba5f9a9cd9d0ece6e6d993d91ce877d121d161e4698af3eb9c1018", size = 1762458 }, + { url = "https://files.pythonhosted.org/packages/28/ea/18b8d86a9ca19a6942e4e16759b2fa5fc02bbc0eb33c1b866fcd387640ab/black-24.8.0-cp312-cp312-win_amd64.whl", hash = "sha256:6e55d30d44bed36593c3163b9bc63bf58b3b30e4611e4d88a0c3c239930ed5b2", size = 1436109 }, + { url = 
"https://files.pythonhosted.org/packages/9f/d4/ae03761ddecc1a37d7e743b89cccbcf3317479ff4b88cfd8818079f890d0/black-24.8.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:505289f17ceda596658ae81b61ebbe2d9b25aa78067035184ed0a9d855d18afd", size = 1617322 }, + { url = "https://files.pythonhosted.org/packages/14/4b/4dfe67eed7f9b1ddca2ec8e4418ea74f0d1dc84d36ea874d618ffa1af7d4/black-24.8.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b19c9ad992c7883ad84c9b22aaa73562a16b819c1d8db7a1a1a49fb7ec13c7d2", size = 1442108 }, + { url = "https://files.pythonhosted.org/packages/97/14/95b3f91f857034686cae0e73006b8391d76a8142d339b42970eaaf0416ea/black-24.8.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1f13f7f386f86f8121d76599114bb8c17b69d962137fc70efe56137727c7047e", size = 1745786 }, + { url = "https://files.pythonhosted.org/packages/95/54/68b8883c8aa258a6dde958cd5bdfada8382bec47c5162f4a01e66d839af1/black-24.8.0-cp38-cp38-win_amd64.whl", hash = "sha256:f490dbd59680d809ca31efdae20e634f3fae27fba3ce0ba3208333b713bc3920", size = 1426754 }, + { url = "https://files.pythonhosted.org/packages/13/b2/b3f24fdbb46f0e7ef6238e131f13572ee8279b70f237f221dd168a9dba1a/black-24.8.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:eab4dd44ce80dea27dc69db40dab62d4ca96112f87996bca68cd75639aeb2e4c", size = 1631706 }, + { url = "https://files.pythonhosted.org/packages/d9/35/31010981e4a05202a84a3116423970fd1a59d2eda4ac0b3570fbb7029ddc/black-24.8.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3c4285573d4897a7610054af5a890bde7c65cb466040c5f0c8b732812d7f0e5e", size = 1457429 }, + { url = "https://files.pythonhosted.org/packages/27/25/3f706b4f044dd569a20a4835c3b733dedea38d83d2ee0beb8178a6d44945/black-24.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e84e33b37be070ba135176c123ae52a51f82306def9f7d063ee302ecab2cf47", size = 1756488 }, + { url = 
"https://files.pythonhosted.org/packages/63/72/79375cd8277cbf1c5670914e6bd4c1b15dea2c8f8e906dc21c448d0535f0/black-24.8.0-cp39-cp39-win_amd64.whl", hash = "sha256:73bbf84ed136e45d451a260c6b73ed674652f90a2b3211d6a35e78054563a9bb", size = 1417721 }, + { url = "https://files.pythonhosted.org/packages/27/1e/83fa8a787180e1632c3d831f7e58994d7aaf23a0961320d21e84f922f919/black-24.8.0-py3-none-any.whl", hash = "sha256:972085c618ee94f402da1af548a4f218c754ea7e5dc70acb168bfaca4c2542ed", size = 206504 }, +] + +[[package]] +name = "certifi" +version = "2024.8.30" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b0/ee/9b19140fe824b367c04c5e1b369942dd754c4c5462d5674002f75c4dedc1/certifi-2024.8.30.tar.gz", hash = "sha256:bec941d2aa8195e248a60b31ff9f0558284cf01a52591ceda73ea9afffd69fd9", size = 168507 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl", hash = "sha256:922820b53db7a7257ffbda3f597266d435245903d80737e34f8a45ff3e3230d8", size = 167321 }, +] + +[[package]] +name = "cffi" +version = "1.17.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pycparser" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/90/07/f44ca684db4e4f08a3fdc6eeb9a0d15dc6883efc7b8c90357fdbf74e186c/cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14", size = 182191 }, + { url = "https://files.pythonhosted.org/packages/08/fd/cc2fedbd887223f9f5d170c96e57cbf655df9831a6546c1727ae13fa977a/cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = 
"sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67", size = 178592 }, + { url = "https://files.pythonhosted.org/packages/de/cc/4635c320081c78d6ffc2cab0a76025b691a91204f4aa317d568ff9280a2d/cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382", size = 426024 }, + { url = "https://files.pythonhosted.org/packages/b6/7b/3b2b250f3aab91abe5f8a51ada1b717935fdaec53f790ad4100fe2ec64d1/cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702", size = 448188 }, + { url = "https://files.pythonhosted.org/packages/d3/48/1b9283ebbf0ec065148d8de05d647a986c5f22586b18120020452fff8f5d/cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3", size = 455571 }, + { url = "https://files.pythonhosted.org/packages/40/87/3b8452525437b40f39ca7ff70276679772ee7e8b394934ff60e63b7b090c/cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6", size = 436687 }, + { url = "https://files.pythonhosted.org/packages/8d/fb/4da72871d177d63649ac449aec2e8a29efe0274035880c7af59101ca2232/cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17", size = 446211 }, + { url = "https://files.pythonhosted.org/packages/ab/a0/62f00bcb411332106c02b663b26f3545a9ef136f80d5df746c05878f8c4b/cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8", size = 461325 }, + { url = 
"https://files.pythonhosted.org/packages/36/83/76127035ed2e7e27b0787604d99da630ac3123bfb02d8e80c633f218a11d/cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e", size = 438784 }, + { url = "https://files.pythonhosted.org/packages/21/81/a6cd025db2f08ac88b901b745c163d884641909641f9b826e8cb87645942/cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be", size = 461564 }, + { url = "https://files.pythonhosted.org/packages/f8/fe/4d41c2f200c4a457933dbd98d3cf4e911870877bd94d9656cc0fcb390681/cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c", size = 171804 }, + { url = "https://files.pythonhosted.org/packages/d1/b6/0b0f5ab93b0df4acc49cae758c81fe4e5ef26c3ae2e10cc69249dfd8b3ab/cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15", size = 181299 }, + { url = "https://files.pythonhosted.org/packages/6b/f4/927e3a8899e52a27fa57a48607ff7dc91a9ebe97399b357b85a0c7892e00/cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401", size = 182264 }, + { url = "https://files.pythonhosted.org/packages/6c/f5/6c3a8efe5f503175aaddcbea6ad0d2c96dad6f5abb205750d1b3df44ef29/cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf", size = 178651 }, + { url = "https://files.pythonhosted.org/packages/94/dd/a3f0118e688d1b1a57553da23b16bdade96d2f9bcda4d32e7d2838047ff7/cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4", size = 445259 }, + { url = 
"https://files.pythonhosted.org/packages/2e/ea/70ce63780f096e16ce8588efe039d3c4f91deb1dc01e9c73a287939c79a6/cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41", size = 469200 }, + { url = "https://files.pythonhosted.org/packages/1c/a0/a4fa9f4f781bda074c3ddd57a572b060fa0df7655d2a4247bbe277200146/cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1", size = 477235 }, + { url = "https://files.pythonhosted.org/packages/62/12/ce8710b5b8affbcdd5c6e367217c242524ad17a02fe5beec3ee339f69f85/cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6", size = 459721 }, + { url = "https://files.pythonhosted.org/packages/ff/6b/d45873c5e0242196f042d555526f92aa9e0c32355a1be1ff8c27f077fd37/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d", size = 467242 }, + { url = "https://files.pythonhosted.org/packages/1a/52/d9a0e523a572fbccf2955f5abe883cfa8bcc570d7faeee06336fbd50c9fc/cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6", size = 477999 }, + { url = "https://files.pythonhosted.org/packages/44/74/f2a2460684a1a2d00ca799ad880d54652841a780c4c97b87754f660c7603/cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f", size = 454242 }, + { url = "https://files.pythonhosted.org/packages/f8/4a/34599cac7dfcd888ff54e801afe06a19c17787dfd94495ab0c8d35fe99fb/cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b", size = 478604 }, + { url = 
"https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727 }, + { url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400 }, + { url = "https://files.pythonhosted.org/packages/5a/84/e94227139ee5fb4d600a7a4927f322e1d4aea6fdc50bd3fca8493caba23f/cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4", size = 183178 }, + { url = "https://files.pythonhosted.org/packages/da/ee/fb72c2b48656111c4ef27f0f91da355e130a923473bf5ee75c5643d00cca/cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c", size = 178840 }, + { url = "https://files.pythonhosted.org/packages/cc/b6/db007700f67d151abadf508cbfd6a1884f57eab90b1bb985c4c8c02b0f28/cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36", size = 454803 }, + { url = "https://files.pythonhosted.org/packages/1a/df/f8d151540d8c200eb1c6fba8cd0dfd40904f1b0682ea705c36e6c2e97ab3/cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5", size = 478850 }, + { url = "https://files.pythonhosted.org/packages/28/c0/b31116332a547fd2677ae5b78a2ef662dfc8023d67f41b2a83f7c2aa78b1/cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff", size = 485729 }, + { url = 
"https://files.pythonhosted.org/packages/91/2b/9a1ddfa5c7f13cab007a2c9cc295b70fbbda7cb10a286aa6810338e60ea1/cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99", size = 471256 }, + { url = "https://files.pythonhosted.org/packages/b2/d5/da47df7004cb17e4955df6a43d14b3b4ae77737dff8bf7f8f333196717bf/cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93", size = 479424 }, + { url = "https://files.pythonhosted.org/packages/0b/ac/2a28bcf513e93a219c8a4e8e125534f4f6db03e3179ba1c45e949b76212c/cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3", size = 484568 }, + { url = "https://files.pythonhosted.org/packages/d4/38/ca8a4f639065f14ae0f1d9751e70447a261f1a30fa7547a828ae08142465/cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8", size = 488736 }, + { url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448 }, + { url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976 }, + { url = "https://files.pythonhosted.org/packages/8d/f8/dd6c246b148639254dad4d6803eb6a54e8c85c6e11ec9df2cffa87571dbe/cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e", size = 182989 }, + { url = 
"https://files.pythonhosted.org/packages/8b/f1/672d303ddf17c24fc83afd712316fda78dc6fce1cd53011b839483e1ecc8/cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2", size = 178802 }, + { url = "https://files.pythonhosted.org/packages/0e/2d/eab2e858a91fdff70533cab61dcff4a1f55ec60425832ddfdc9cd36bc8af/cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3", size = 454792 }, + { url = "https://files.pythonhosted.org/packages/75/b2/fbaec7c4455c604e29388d55599b99ebcc250a60050610fadde58932b7ee/cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683", size = 478893 }, + { url = "https://files.pythonhosted.org/packages/4f/b7/6e4a2162178bf1935c336d4da8a9352cccab4d3a5d7914065490f08c0690/cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5", size = 485810 }, + { url = "https://files.pythonhosted.org/packages/c7/8a/1d0e4a9c26e54746dc08c2c6c037889124d4f59dffd853a659fa545f1b40/cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4", size = 471200 }, + { url = "https://files.pythonhosted.org/packages/26/9f/1aab65a6c0db35f43c4d1b4f580e8df53914310afc10ae0397d29d697af4/cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd", size = 479447 }, + { url = "https://files.pythonhosted.org/packages/5f/e4/fb8b3dd8dc0e98edf1135ff067ae070bb32ef9d509d6cb0f538cd6f7483f/cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed", size 
= 484358 }, + { url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469 }, + { url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 }, + { url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 }, + { url = "https://files.pythonhosted.org/packages/48/08/15bf6b43ae9bd06f6b00ad8a91f5a8fe1069d4c9fab550a866755402724e/cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b", size = 182457 }, + { url = "https://files.pythonhosted.org/packages/c2/5b/f1523dd545f92f7df468e5f653ffa4df30ac222f3c884e51e139878f1cb5/cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964", size = 425932 }, + { url = "https://files.pythonhosted.org/packages/53/93/7e547ab4105969cc8c93b38a667b82a835dd2cc78f3a7dad6130cfd41e1d/cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9", size = 448585 }, + { url = "https://files.pythonhosted.org/packages/56/c4/a308f2c332006206bb511de219efeff090e9d63529ba0a77aae72e82248b/cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc", size = 456268 }, + { url = 
"https://files.pythonhosted.org/packages/ca/5b/b63681518265f2f4060d2b60755c1c77ec89e5e045fc3773b72735ddaad5/cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c", size = 436592 }, + { url = "https://files.pythonhosted.org/packages/bb/19/b51af9f4a4faa4a8ac5a0e5d5c2522dcd9703d07fac69da34a36c4d960d3/cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1", size = 446512 }, + { url = "https://files.pythonhosted.org/packages/e2/63/2bed8323890cb613bbecda807688a31ed11a7fe7afe31f8faaae0206a9a3/cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8", size = 171576 }, + { url = "https://files.pythonhosted.org/packages/2f/70/80c33b044ebc79527447fd4fbc5455d514c3bb840dede4455de97da39b4d/cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1", size = 181229 }, + { url = "https://files.pythonhosted.org/packages/b9/ea/8bb50596b8ffbc49ddd7a1ad305035daa770202a6b782fc164647c2673ad/cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16", size = 182220 }, + { url = "https://files.pythonhosted.org/packages/ae/11/e77c8cd24f58285a82c23af484cf5b124a376b32644e445960d1a4654c3a/cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36", size = 178605 }, + { url = "https://files.pythonhosted.org/packages/ed/65/25a8dc32c53bf5b7b6c2686b42ae2ad58743f7ff644844af7cdb29b49361/cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8", size = 424910 }, + { url = 
"https://files.pythonhosted.org/packages/42/7a/9d086fab7c66bd7c4d0f27c57a1b6b068ced810afc498cc8c49e0088661c/cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576", size = 447200 }, + { url = "https://files.pythonhosted.org/packages/da/63/1785ced118ce92a993b0ec9e0d0ac8dc3e5dbfbcaa81135be56c69cabbb6/cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87", size = 454565 }, + { url = "https://files.pythonhosted.org/packages/74/06/90b8a44abf3556599cdec107f7290277ae8901a58f75e6fe8f970cd72418/cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0", size = 435635 }, + { url = "https://files.pythonhosted.org/packages/bd/62/a1f468e5708a70b1d86ead5bab5520861d9c7eacce4a885ded9faa7729c3/cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3", size = 445218 }, + { url = "https://files.pythonhosted.org/packages/5b/95/b34462f3ccb09c2594aa782d90a90b045de4ff1f70148ee79c69d37a0a5a/cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595", size = 460486 }, + { url = "https://files.pythonhosted.org/packages/fc/fc/a1e4bebd8d680febd29cf6c8a40067182b64f00c7d105f8f26b5bc54317b/cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a", size = 437911 }, + { url = "https://files.pythonhosted.org/packages/e6/c3/21cab7a6154b6a5ea330ae80de386e7665254835b9e98ecc1340b3a7de9a/cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e", size = 460632 }, + { url = 
"https://files.pythonhosted.org/packages/cb/b5/fd9f8b5a84010ca169ee49f4e4ad6f8c05f4e3545b72ee041dbbcb159882/cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7", size = 171820 }, + { url = "https://files.pythonhosted.org/packages/8c/52/b08750ce0bce45c143e1b5d7357ee8c55341b52bdef4b0f081af1eb248c2/cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662", size = 181290 }, +] + +[[package]] +name = "cfgv" +version = "3.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/11/74/539e56497d9bd1d484fd863dd69cbbfa653cd2aa27abfe35653494d85e94/cfgv-3.4.0.tar.gz", hash = "sha256:e52591d4c5f5dead8e0f673fb16db7949d2cfb3f7da4582893288f0ded8fe560", size = 7114 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c5/55/51844dd50c4fc7a33b653bfaba4c2456f06955289ca770a5dbd5fd267374/cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9", size = 7249 }, +] + +[[package]] +name = "charset-normalizer" +version = "3.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f2/4f/e1808dc01273379acc506d18f1504eb2d299bd4131743b9fc54d7be4df1e/charset_normalizer-3.4.0.tar.gz", hash = "sha256:223217c3d4f82c3ac5e29032b3f1c2eb0fb591b72161f86d93f5719079dae93e", size = 106620 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/69/8b/825cc84cf13a28bfbcba7c416ec22bf85a9584971be15b21dd8300c65b7f/charset_normalizer-3.4.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:4f9fc98dad6c2eaa32fc3af1417d95b5e3d08aff968df0cd320066def971f9a6", size = 196363 }, + { url = "https://files.pythonhosted.org/packages/23/81/d7eef6a99e42c77f444fdd7bc894b0ceca6c3a95c51239e74a722039521c/charset_normalizer-3.4.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = 
"sha256:0de7b687289d3c1b3e8660d0741874abe7888100efe14bd0f9fd7141bcbda92b", size = 125639 }, + { url = "https://files.pythonhosted.org/packages/21/67/b4564d81f48042f520c948abac7079356e94b30cb8ffb22e747532cf469d/charset_normalizer-3.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5ed2e36c3e9b4f21dd9422f6893dec0abf2cca553af509b10cd630f878d3eb99", size = 120451 }, + { url = "https://files.pythonhosted.org/packages/c2/72/12a7f0943dd71fb5b4e7b55c41327ac0a1663046a868ee4d0d8e9c369b85/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40d3ff7fc90b98c637bda91c89d51264a3dcf210cade3a2c6f838c7268d7a4ca", size = 140041 }, + { url = "https://files.pythonhosted.org/packages/67/56/fa28c2c3e31217c4c52158537a2cf5d98a6c1e89d31faf476c89391cd16b/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1110e22af8ca26b90bd6364fe4c763329b0ebf1ee213ba32b68c73de5752323d", size = 150333 }, + { url = "https://files.pythonhosted.org/packages/f9/d2/466a9be1f32d89eb1554cf84073a5ed9262047acee1ab39cbaefc19635d2/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:86f4e8cca779080f66ff4f191a685ced73d2f72d50216f7112185dc02b90b9b7", size = 142921 }, + { url = "https://files.pythonhosted.org/packages/f8/01/344ec40cf5d85c1da3c1f57566c59e0c9b56bcc5566c08804a95a6cc8257/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f683ddc7eedd742e2889d2bfb96d69573fde1d92fcb811979cdb7165bb9c7d3", size = 144785 }, + { url = "https://files.pythonhosted.org/packages/73/8b/2102692cb6d7e9f03b9a33a710e0164cadfce312872e3efc7cfe22ed26b4/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:27623ba66c183eca01bf9ff833875b459cad267aeeb044477fedac35e19ba907", size = 146631 }, + { url = 
"https://files.pythonhosted.org/packages/d8/96/cc2c1b5d994119ce9f088a9a0c3ebd489d360a2eb058e2c8049f27092847/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:f606a1881d2663630ea5b8ce2efe2111740df4b687bd78b34a8131baa007f79b", size = 140867 }, + { url = "https://files.pythonhosted.org/packages/c9/27/cde291783715b8ec30a61c810d0120411844bc4c23b50189b81188b273db/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0b309d1747110feb25d7ed6b01afdec269c647d382c857ef4663bbe6ad95a912", size = 149273 }, + { url = "https://files.pythonhosted.org/packages/3a/a4/8633b0fc1a2d1834d5393dafecce4a1cc56727bfd82b4dc18fc92f0d3cc3/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:136815f06a3ae311fae551c3df1f998a1ebd01ddd424aa5603a4336997629e95", size = 152437 }, + { url = "https://files.pythonhosted.org/packages/64/ea/69af161062166b5975ccbb0961fd2384853190c70786f288684490913bf5/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:14215b71a762336254351b00ec720a8e85cada43b987da5a042e4ce3e82bd68e", size = 150087 }, + { url = "https://files.pythonhosted.org/packages/3b/fd/e60a9d9fd967f4ad5a92810138192f825d77b4fa2a557990fd575a47695b/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:79983512b108e4a164b9c8d34de3992f76d48cadc9554c9e60b43f308988aabe", size = 145142 }, + { url = "https://files.pythonhosted.org/packages/6d/02/8cb0988a1e49ac9ce2eed1e07b77ff118f2923e9ebd0ede41ba85f2dcb04/charset_normalizer-3.4.0-cp310-cp310-win32.whl", hash = "sha256:c94057af19bc953643a33581844649a7fdab902624d2eb739738a30e2b3e60fc", size = 94701 }, + { url = "https://files.pythonhosted.org/packages/d6/20/f1d4670a8a723c46be695dff449d86d6092916f9e99c53051954ee33a1bc/charset_normalizer-3.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:55f56e2ebd4e3bc50442fbc0888c9d8c94e4e06a933804e2af3e89e2f9c1c749", size = 102191 }, + { url = 
"https://files.pythonhosted.org/packages/9c/61/73589dcc7a719582bf56aae309b6103d2762b526bffe189d635a7fcfd998/charset_normalizer-3.4.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0d99dd8ff461990f12d6e42c7347fd9ab2532fb70e9621ba520f9e8637161d7c", size = 193339 }, + { url = "https://files.pythonhosted.org/packages/77/d5/8c982d58144de49f59571f940e329ad6e8615e1e82ef84584c5eeb5e1d72/charset_normalizer-3.4.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c57516e58fd17d03ebe67e181a4e4e2ccab1168f8c2976c6a334d4f819fe5944", size = 124366 }, + { url = "https://files.pythonhosted.org/packages/bf/19/411a64f01ee971bed3231111b69eb56f9331a769072de479eae7de52296d/charset_normalizer-3.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6dba5d19c4dfab08e58d5b36304b3f92f3bd5d42c1a3fa37b5ba5cdf6dfcbcee", size = 118874 }, + { url = "https://files.pythonhosted.org/packages/4c/92/97509850f0d00e9f14a46bc751daabd0ad7765cff29cdfb66c68b6dad57f/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bf4475b82be41b07cc5e5ff94810e6a01f276e37c2d55571e3fe175e467a1a1c", size = 138243 }, + { url = "https://files.pythonhosted.org/packages/e2/29/d227805bff72ed6d6cb1ce08eec707f7cfbd9868044893617eb331f16295/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce031db0408e487fd2775d745ce30a7cd2923667cf3b69d48d219f1d8f5ddeb6", size = 148676 }, + { url = "https://files.pythonhosted.org/packages/13/bc/87c2c9f2c144bedfa62f894c3007cd4530ba4b5351acb10dc786428a50f0/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ff4e7cdfdb1ab5698e675ca622e72d58a6fa2a8aa58195de0c0061288e6e3ea", size = 141289 }, + { url = "https://files.pythonhosted.org/packages/eb/5b/6f10bad0f6461fa272bfbbdf5d0023b5fb9bc6217c92bf068fa5a99820f5/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:3710a9751938947e6327ea9f3ea6332a09bf0ba0c09cae9cb1f250bd1f1549bc", size = 142585 }, + { url = "https://files.pythonhosted.org/packages/3b/a0/a68980ab8a1f45a36d9745d35049c1af57d27255eff8c907e3add84cf68f/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:82357d85de703176b5587dbe6ade8ff67f9f69a41c0733cf2425378b49954de5", size = 144408 }, + { url = "https://files.pythonhosted.org/packages/d7/a1/493919799446464ed0299c8eef3c3fad0daf1c3cd48bff9263c731b0d9e2/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:47334db71978b23ebcf3c0f9f5ee98b8d65992b65c9c4f2d34c2eaf5bcaf0594", size = 139076 }, + { url = "https://files.pythonhosted.org/packages/fb/9d/9c13753a5a6e0db4a0a6edb1cef7aee39859177b64e1a1e748a6e3ba62c2/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:8ce7fd6767a1cc5a92a639b391891bf1c268b03ec7e021c7d6d902285259685c", size = 146874 }, + { url = "https://files.pythonhosted.org/packages/75/d2/0ab54463d3410709c09266dfb416d032a08f97fd7d60e94b8c6ef54ae14b/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f1a2f519ae173b5b6a2c9d5fa3116ce16e48b3462c8b96dfdded11055e3d6365", size = 150871 }, + { url = "https://files.pythonhosted.org/packages/8d/c9/27e41d481557be53d51e60750b85aa40eaf52b841946b3cdeff363105737/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:63bc5c4ae26e4bc6be6469943b8253c0fd4e4186c43ad46e713ea61a0ba49129", size = 148546 }, + { url = "https://files.pythonhosted.org/packages/ee/44/4f62042ca8cdc0cabf87c0fc00ae27cd8b53ab68be3605ba6d071f742ad3/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:bcb4f8ea87d03bc51ad04add8ceaf9b0f085ac045ab4d74e73bbc2dc033f0236", size = 143048 }, + { url = "https://files.pythonhosted.org/packages/01/f8/38842422988b795220eb8038745d27a675ce066e2ada79516c118f291f07/charset_normalizer-3.4.0-cp311-cp311-win32.whl", 
hash = "sha256:9ae4ef0b3f6b41bad6366fb0ea4fc1d7ed051528e113a60fa2a65a9abb5b1d99", size = 94389 }, + { url = "https://files.pythonhosted.org/packages/0b/6e/b13bd47fa9023b3699e94abf565b5a2f0b0be6e9ddac9812182596ee62e4/charset_normalizer-3.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cee4373f4d3ad28f1ab6290684d8e2ebdb9e7a1b74fdc39e4c211995f77bec27", size = 101752 }, + { url = "https://files.pythonhosted.org/packages/d3/0b/4b7a70987abf9b8196845806198975b6aab4ce016632f817ad758a5aa056/charset_normalizer-3.4.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0713f3adb9d03d49d365b70b84775d0a0d18e4ab08d12bc46baa6132ba78aaf6", size = 194445 }, + { url = "https://files.pythonhosted.org/packages/50/89/354cc56cf4dd2449715bc9a0f54f3aef3dc700d2d62d1fa5bbea53b13426/charset_normalizer-3.4.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:de7376c29d95d6719048c194a9cf1a1b0393fbe8488a22008610b0361d834ecf", size = 125275 }, + { url = "https://files.pythonhosted.org/packages/fa/44/b730e2a2580110ced837ac083d8ad222343c96bb6b66e9e4e706e4d0b6df/charset_normalizer-3.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4a51b48f42d9358460b78725283f04bddaf44a9358197b889657deba38f329db", size = 119020 }, + { url = "https://files.pythonhosted.org/packages/9d/e4/9263b8240ed9472a2ae7ddc3e516e71ef46617fe40eaa51221ccd4ad9a27/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b295729485b06c1a0683af02a9e42d2caa9db04a373dc38a6a58cdd1e8abddf1", size = 139128 }, + { url = "https://files.pythonhosted.org/packages/6b/e3/9f73e779315a54334240353eaea75854a9a690f3f580e4bd85d977cb2204/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ee803480535c44e7f5ad00788526da7d85525cfefaf8acf8ab9a310000be4b03", size = 149277 }, + { url = 
"https://files.pythonhosted.org/packages/1a/cf/f1f50c2f295312edb8a548d3fa56a5c923b146cd3f24114d5adb7e7be558/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3d59d125ffbd6d552765510e3f31ed75ebac2c7470c7274195b9161a32350284", size = 142174 }, + { url = "https://files.pythonhosted.org/packages/16/92/92a76dc2ff3a12e69ba94e7e05168d37d0345fa08c87e1fe24d0c2a42223/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8cda06946eac330cbe6598f77bb54e690b4ca93f593dee1568ad22b04f347c15", size = 143838 }, + { url = "https://files.pythonhosted.org/packages/a4/01/2117ff2b1dfc61695daf2babe4a874bca328489afa85952440b59819e9d7/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:07afec21bbbbf8a5cc3651aa96b980afe2526e7f048fdfb7f1014d84acc8b6d8", size = 146149 }, + { url = "https://files.pythonhosted.org/packages/f6/9b/93a332b8d25b347f6839ca0a61b7f0287b0930216994e8bf67a75d050255/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6b40e8d38afe634559e398cc32b1472f376a4099c75fe6299ae607e404c033b2", size = 140043 }, + { url = "https://files.pythonhosted.org/packages/ab/f6/7ac4a01adcdecbc7a7587767c776d53d369b8b971382b91211489535acf0/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b8dcd239c743aa2f9c22ce674a145e0a25cb1566c495928440a181ca1ccf6719", size = 148229 }, + { url = "https://files.pythonhosted.org/packages/9d/be/5708ad18161dee7dc6a0f7e6cf3a88ea6279c3e8484844c0590e50e803ef/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:84450ba661fb96e9fd67629b93d2941c871ca86fc38d835d19d4225ff946a631", size = 151556 }, + { url = "https://files.pythonhosted.org/packages/5a/bb/3d8bc22bacb9eb89785e83e6723f9888265f3a0de3b9ce724d66bd49884e/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = 
"sha256:44aeb140295a2f0659e113b31cfe92c9061622cadbc9e2a2f7b8ef6b1e29ef4b", size = 149772 }, + { url = "https://files.pythonhosted.org/packages/f7/fa/d3fc622de05a86f30beea5fc4e9ac46aead4731e73fd9055496732bcc0a4/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1db4e7fefefd0f548d73e2e2e041f9df5c59e178b4c72fbac4cc6f535cfb1565", size = 144800 }, + { url = "https://files.pythonhosted.org/packages/9a/65/bdb9bc496d7d190d725e96816e20e2ae3a6fa42a5cac99c3c3d6ff884118/charset_normalizer-3.4.0-cp312-cp312-win32.whl", hash = "sha256:5726cf76c982532c1863fb64d8c6dd0e4c90b6ece9feb06c9f202417a31f7dd7", size = 94836 }, + { url = "https://files.pythonhosted.org/packages/3e/67/7b72b69d25b89c0b3cea583ee372c43aa24df15f0e0f8d3982c57804984b/charset_normalizer-3.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:b197e7094f232959f8f20541ead1d9862ac5ebea1d58e9849c1bf979255dfac9", size = 102187 }, + { url = "https://files.pythonhosted.org/packages/f3/89/68a4c86f1a0002810a27f12e9a7b22feb198c59b2f05231349fbce5c06f4/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:dd4eda173a9fcccb5f2e2bd2a9f423d180194b1bf17cf59e3269899235b2a114", size = 194617 }, + { url = "https://files.pythonhosted.org/packages/4f/cd/8947fe425e2ab0aa57aceb7807af13a0e4162cd21eee42ef5b053447edf5/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e9e3c4c9e1ed40ea53acf11e2a386383c3304212c965773704e4603d589343ed", size = 125310 }, + { url = "https://files.pythonhosted.org/packages/5b/f0/b5263e8668a4ee9becc2b451ed909e9c27058337fda5b8c49588183c267a/charset_normalizer-3.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:92a7e36b000bf022ef3dbb9c46bfe2d52c047d5e3f3343f43204263c5addc250", size = 119126 }, + { url = "https://files.pythonhosted.org/packages/ff/6e/e445afe4f7fda27a533f3234b627b3e515a1b9429bc981c9a5e2aa5d97b6/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:54b6a92d009cbe2fb11054ba694bc9e284dad30a26757b1e372a1fdddaf21920", size = 139342 }, + { url = "https://files.pythonhosted.org/packages/a1/b2/4af9993b532d93270538ad4926c8e37dc29f2111c36f9c629840c57cd9b3/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ffd9493de4c922f2a38c2bf62b831dcec90ac673ed1ca182fe11b4d8e9f2a64", size = 149383 }, + { url = "https://files.pythonhosted.org/packages/fb/6f/4e78c3b97686b871db9be6f31d64e9264e889f8c9d7ab33c771f847f79b7/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:35c404d74c2926d0287fbd63ed5d27eb911eb9e4a3bb2c6d294f3cfd4a9e0c23", size = 142214 }, + { url = "https://files.pythonhosted.org/packages/2b/c9/1c8fe3ce05d30c87eff498592c89015b19fade13df42850aafae09e94f35/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4796efc4faf6b53a18e3d46343535caed491776a22af773f366534056c4e1fbc", size = 144104 }, + { url = "https://files.pythonhosted.org/packages/ee/68/efad5dcb306bf37db7db338338e7bb8ebd8cf38ee5bbd5ceaaaa46f257e6/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e7fdd52961feb4c96507aa649550ec2a0d527c086d284749b2f582f2d40a2e0d", size = 146255 }, + { url = "https://files.pythonhosted.org/packages/0c/75/1ed813c3ffd200b1f3e71121c95da3f79e6d2a96120163443b3ad1057505/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:92db3c28b5b2a273346bebb24857fda45601aef6ae1c011c0a997106581e8a88", size = 140251 }, + { url = "https://files.pythonhosted.org/packages/7d/0d/6f32255c1979653b448d3c709583557a4d24ff97ac4f3a5be156b2e6a210/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ab973df98fc99ab39080bfb0eb3a925181454d7c3ac8a1e695fddfae696d9e90", size = 148474 }, + { url = 
"https://files.pythonhosted.org/packages/ac/a0/c1b5298de4670d997101fef95b97ac440e8c8d8b4efa5a4d1ef44af82f0d/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4b67fdab07fdd3c10bb21edab3cbfe8cf5696f453afce75d815d9d7223fbe88b", size = 151849 }, + { url = "https://files.pythonhosted.org/packages/04/4f/b3961ba0c664989ba63e30595a3ed0875d6790ff26671e2aae2fdc28a399/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:aa41e526a5d4a9dfcfbab0716c7e8a1b215abd3f3df5a45cf18a12721d31cb5d", size = 149781 }, + { url = "https://files.pythonhosted.org/packages/d8/90/6af4cd042066a4adad58ae25648a12c09c879efa4849c705719ba1b23d8c/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ffc519621dce0c767e96b9c53f09c5d215578e10b02c285809f76509a3931482", size = 144970 }, + { url = "https://files.pythonhosted.org/packages/cc/67/e5e7e0cbfefc4ca79025238b43cdf8a2037854195b37d6417f3d0895c4c2/charset_normalizer-3.4.0-cp313-cp313-win32.whl", hash = "sha256:f19c1585933c82098c2a520f8ec1227f20e339e33aca8fa6f956f6691b784e67", size = 94973 }, + { url = "https://files.pythonhosted.org/packages/65/97/fc9bbc54ee13d33dc54a7fcf17b26368b18505500fc01e228c27b5222d80/charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:707b82d19e65c9bd28b81dde95249b07bf9f5b90ebe1ef17d9b57473f8a64b7b", size = 102308 }, + { url = "https://files.pythonhosted.org/packages/86/f4/ccab93e631e7293cca82f9f7ba39783c967f823a0000df2d8dd743cad74f/charset_normalizer-3.4.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:af73657b7a68211996527dbfeffbb0864e043d270580c5aef06dc4b659a4b578", size = 193961 }, + { url = "https://files.pythonhosted.org/packages/94/d4/2b21cb277bac9605026d2d91a4a8872bc82199ed11072d035dc674c27223/charset_normalizer-3.4.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:cab5d0b79d987c67f3b9e9c53f54a61360422a5a0bc075f43cab5621d530c3b6", size = 124507 }, + { url = 
"https://files.pythonhosted.org/packages/9a/e0/a7c1fcdff20d9c667342e0391cfeb33ab01468d7d276b2c7914b371667cc/charset_normalizer-3.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:9289fd5dddcf57bab41d044f1756550f9e7cf0c8e373b8cdf0ce8773dc4bd417", size = 119298 }, + { url = "https://files.pythonhosted.org/packages/70/de/1538bb2f84ac9940f7fa39945a5dd1d22b295a89c98240b262fc4b9fcfe0/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b493a043635eb376e50eedf7818f2f322eabbaa974e948bd8bdd29eb7ef2a51", size = 139328 }, + { url = "https://files.pythonhosted.org/packages/e9/ca/288bb1a6bc2b74fb3990bdc515012b47c4bc5925c8304fc915d03f94b027/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9fa2566ca27d67c86569e8c85297aaf413ffab85a8960500f12ea34ff98e4c41", size = 149368 }, + { url = "https://files.pythonhosted.org/packages/aa/75/58374fdaaf8406f373e508dab3486a31091f760f99f832d3951ee93313e8/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8e538f46104c815be19c975572d74afb53f29650ea2025bbfaef359d2de2f7f", size = 141944 }, + { url = "https://files.pythonhosted.org/packages/32/c8/0bc558f7260db6ffca991ed7166494a7da4fda5983ee0b0bfc8ed2ac6ff9/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fd30dc99682dc2c603c2b315bded2799019cea829f8bf57dc6b61efde6611c8", size = 143326 }, + { url = "https://files.pythonhosted.org/packages/0e/dd/7f6fec09a1686446cee713f38cf7d5e0669e0bcc8288c8e2924e998cf87d/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2006769bd1640bdf4d5641c69a3d63b71b81445473cac5ded39740a226fa88ab", size = 146171 }, + { url = "https://files.pythonhosted.org/packages/4c/a8/440f1926d6d8740c34d3ca388fbd718191ec97d3d457a0677eb3aa718fce/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = 
"sha256:dc15e99b2d8a656f8e666854404f1ba54765871104e50c8e9813af8a7db07f12", size = 139711 }, + { url = "https://files.pythonhosted.org/packages/e9/7f/4b71e350a3377ddd70b980bea1e2cc0983faf45ba43032b24b2578c14314/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:ab2e5bef076f5a235c3774b4f4028a680432cded7cad37bba0fd90d64b187d19", size = 148348 }, + { url = "https://files.pythonhosted.org/packages/1e/70/17b1b9202531a33ed7ef41885f0d2575ae42a1e330c67fddda5d99ad1208/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:4ec9dd88a5b71abfc74e9df5ebe7921c35cbb3b641181a531ca65cdb5e8e4dea", size = 151290 }, + { url = "https://files.pythonhosted.org/packages/44/30/574b5b5933d77ecb015550aafe1c7d14a8cd41e7e6c4dcea5ae9e8d496c3/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:43193c5cda5d612f247172016c4bb71251c784d7a4d9314677186a838ad34858", size = 149114 }, + { url = "https://files.pythonhosted.org/packages/0b/11/ca7786f7e13708687443082af20d8341c02e01024275a28bc75032c5ce5d/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:aa693779a8b50cd97570e5a0f343538a8dbd3e496fa5dcb87e29406ad0299654", size = 143856 }, + { url = "https://files.pythonhosted.org/packages/f9/c2/1727c1438256c71ed32753b23ec2e6fe7b6dff66a598f6566cfe8139305e/charset_normalizer-3.4.0-cp38-cp38-win32.whl", hash = "sha256:7706f5850360ac01d80c89bcef1640683cc12ed87f42579dab6c5d3ed6888613", size = 94333 }, + { url = "https://files.pythonhosted.org/packages/09/c8/0e17270496a05839f8b500c1166e3261d1226e39b698a735805ec206967b/charset_normalizer-3.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:c3e446d253bd88f6377260d07c895816ebf33ffffd56c1c792b13bff9c3e1ade", size = 101454 }, + { url = "https://files.pythonhosted.org/packages/54/2f/28659eee7f5d003e0f5a3b572765bf76d6e0fe6601ab1f1b1dd4cba7e4f1/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_universal2.whl", hash = 
"sha256:980b4f289d1d90ca5efcf07958d3eb38ed9c0b7676bf2831a54d4f66f9c27dfa", size = 196326 }, + { url = "https://files.pythonhosted.org/packages/d1/18/92869d5c0057baa973a3ee2af71573be7b084b3c3d428fe6463ce71167f8/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f28f891ccd15c514a0981f3b9db9aa23d62fe1a99997512b0491d2ed323d229a", size = 125614 }, + { url = "https://files.pythonhosted.org/packages/d6/27/327904c5a54a7796bb9f36810ec4173d2df5d88b401d2b95ef53111d214e/charset_normalizer-3.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8aacce6e2e1edcb6ac625fb0f8c3a9570ccc7bfba1f63419b3769ccf6a00ed0", size = 120450 }, + { url = "https://files.pythonhosted.org/packages/a4/23/65af317914a0308495133b2d654cf67b11bbd6ca16637c4e8a38f80a5a69/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bd7af3717683bea4c87acd8c0d3d5b44d56120b26fd3f8a692bdd2d5260c620a", size = 140135 }, + { url = "https://files.pythonhosted.org/packages/f2/41/6190102ad521a8aa888519bb014a74251ac4586cde9b38e790901684f9ab/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5ff2ed8194587faf56555927b3aa10e6fb69d931e33953943bc4f837dfee2242", size = 150413 }, + { url = "https://files.pythonhosted.org/packages/7b/ab/f47b0159a69eab9bd915591106859f49670c75f9a19082505ff16f50efc0/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e91f541a85298cf35433bf66f3fab2a4a2cff05c127eeca4af174f6d497f0d4b", size = 142992 }, + { url = "https://files.pythonhosted.org/packages/28/89/60f51ad71f63aaaa7e51a2a2ad37919985a341a1d267070f212cdf6c2d22/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:309a7de0a0ff3040acaebb35ec45d18db4b28232f21998851cfa709eeff49d62", size = 144871 }, + { url = 
"https://files.pythonhosted.org/packages/0c/48/0050550275fea585a6e24460b42465020b53375017d8596c96be57bfabca/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:285e96d9d53422efc0d7a17c60e59f37fbf3dfa942073f666db4ac71e8d726d0", size = 146756 }, + { url = "https://files.pythonhosted.org/packages/dc/b5/47f8ee91455946f745e6c9ddbb0f8f50314d2416dd922b213e7d5551ad09/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:5d447056e2ca60382d460a604b6302d8db69476fd2015c81e7c35417cfabe4cd", size = 141034 }, + { url = "https://files.pythonhosted.org/packages/84/79/5c731059ebab43e80bf61fa51666b9b18167974b82004f18c76378ed31a3/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:20587d20f557fe189b7947d8e7ec5afa110ccf72a3128d61a2a387c3313f46be", size = 149434 }, + { url = "https://files.pythonhosted.org/packages/ca/f3/0719cd09fc4dc42066f239cb3c48ced17fc3316afca3e2a30a4756fe49ab/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:130272c698667a982a5d0e626851ceff662565379baf0ff2cc58067b81d4f11d", size = 152443 }, + { url = "https://files.pythonhosted.org/packages/f7/0e/c6357297f1157c8e8227ff337e93fd0a90e498e3d6ab96b2782204ecae48/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:ab22fbd9765e6954bc0bcff24c25ff71dcbfdb185fcdaca49e81bac68fe724d3", size = 150294 }, + { url = "https://files.pythonhosted.org/packages/54/9a/acfa96dc4ea8c928040b15822b59d0863d6e1757fba8bd7de3dc4f761c13/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:7782afc9b6b42200f7362858f9e73b1f8316afb276d316336c0ec3bd73312742", size = 145314 }, + { url = "https://files.pythonhosted.org/packages/73/1c/b10a63032eaebb8d7bcb8544f12f063f41f5f463778ac61da15d9985e8b6/charset_normalizer-3.4.0-cp39-cp39-win32.whl", hash = "sha256:2de62e8801ddfff069cd5c504ce3bc9672b23266597d4e4f50eda28846c322f2", size = 94724 }, + { url = 
"https://files.pythonhosted.org/packages/c5/77/3a78bf28bfaa0863f9cfef278dbeadf55efe064eafff8c7c424ae3c4c1bf/charset_normalizer-3.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:95c3c157765b031331dd4db3c775e58deaee050a3042fcad72cbc4189d7c8dca", size = 102159 }, + { url = "https://files.pythonhosted.org/packages/bf/9b/08c0432272d77b04803958a4598a51e2a4b51c06640af8b8f0f908c18bf2/charset_normalizer-3.4.0-py3-none-any.whl", hash = "sha256:fe9f97feb71aa9896b81973a7bbada8c49501dc73e58a10fcef6663af95e5079", size = 49446 }, +] + +[[package]] +name = "click" +version = "8.1.7" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "platform_system == 'Windows'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/96/d3/f04c7bfcf5c1862a2a5b845c6b2b360488cf47af55dfa79c98f6a6bf98b5/click-8.1.7.tar.gz", hash = "sha256:ca9853ad459e787e2192211578cc907e7594e294c7ccc834310722b41b9ca6de", size = 336121 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/00/2e/d53fa4befbf2cfa713304affc7ca780ce4fc1fd8710527771b58311a3229/click-8.1.7-py3-none-any.whl", hash = "sha256:ae74fb96c20a0277a1d615f1e4d73c8414f5a98db8b799a7931d1582f3390c28", size = 97941 }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 }, +] + +[[package]] +name = "cryptography" +version = "43.0.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi", marker = 
"platform_python_implementation != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/0d/05/07b55d1fa21ac18c3a8c79f764e2514e6f6a9698f1be44994f5adf0d29db/cryptography-43.0.3.tar.gz", hash = "sha256:315b9001266a492a6ff443b61238f956b214dbec9910a081ba5b6646a055a805", size = 686989 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1f/f3/01fdf26701a26f4b4dbc337a26883ad5bccaa6f1bbbdd29cd89e22f18a1c/cryptography-43.0.3-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:bf7a1932ac4176486eab36a19ed4c0492da5d97123f1406cf15e41b05e787d2e", size = 6225303 }, + { url = "https://files.pythonhosted.org/packages/a3/01/4896f3d1b392025d4fcbecf40fdea92d3df8662123f6835d0af828d148fd/cryptography-43.0.3-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63efa177ff54aec6e1c0aefaa1a241232dcd37413835a9b674b6e3f0ae2bfd3e", size = 3760905 }, + { url = "https://files.pythonhosted.org/packages/0a/be/f9a1f673f0ed4b7f6c643164e513dbad28dd4f2dcdf5715004f172ef24b6/cryptography-43.0.3-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e1ce50266f4f70bf41a2c6dc4358afadae90e2a1e5342d3c08883df1675374f", size = 3977271 }, + { url = "https://files.pythonhosted.org/packages/4e/49/80c3a7b5514d1b416d7350830e8c422a4d667b6d9b16a9392ebfd4a5388a/cryptography-43.0.3-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:443c4a81bb10daed9a8f334365fe52542771f25aedaf889fd323a853ce7377d6", size = 3746606 }, + { url = "https://files.pythonhosted.org/packages/0e/16/a28ddf78ac6e7e3f25ebcef69ab15c2c6be5ff9743dd0709a69a4f968472/cryptography-43.0.3-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:74f57f24754fe349223792466a709f8e0c093205ff0dca557af51072ff47ab18", size = 3986484 }, + { url = "https://files.pythonhosted.org/packages/01/f5/69ae8da70c19864a32b0315049866c4d411cce423ec169993d0434218762/cryptography-43.0.3-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9762ea51a8fc2a88b70cf2995e5675b38d93bf36bd67d91721c309df184f49bd", 
size = 3852131 }, + { url = "https://files.pythonhosted.org/packages/fd/db/e74911d95c040f9afd3612b1f732e52b3e517cb80de8bf183be0b7d413c6/cryptography-43.0.3-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:81ef806b1fef6b06dcebad789f988d3b37ccaee225695cf3e07648eee0fc6b73", size = 4075647 }, + { url = "https://files.pythonhosted.org/packages/56/48/7b6b190f1462818b324e674fa20d1d5ef3e24f2328675b9b16189cbf0b3c/cryptography-43.0.3-cp37-abi3-win32.whl", hash = "sha256:cbeb489927bd7af4aa98d4b261af9a5bc025bd87f0e3547e11584be9e9427be2", size = 2623873 }, + { url = "https://files.pythonhosted.org/packages/eb/b1/0ebff61a004f7f89e7b65ca95f2f2375679d43d0290672f7713ee3162aff/cryptography-43.0.3-cp37-abi3-win_amd64.whl", hash = "sha256:f46304d6f0c6ab8e52770addfa2fc41e6629495548862279641972b6215451cd", size = 3068039 }, + { url = "https://files.pythonhosted.org/packages/30/d5/c8b32c047e2e81dd172138f772e81d852c51f0f2ad2ae8a24f1122e9e9a7/cryptography-43.0.3-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:8ac43ae87929a5982f5948ceda07001ee5e83227fd69cf55b109144938d96984", size = 6222984 }, + { url = "https://files.pythonhosted.org/packages/2f/78/55356eb9075d0be6e81b59f45c7b48df87f76a20e73893872170471f3ee8/cryptography-43.0.3-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:846da004a5804145a5f441b8530b4bf35afbf7da70f82409f151695b127213d5", size = 3762968 }, + { url = "https://files.pythonhosted.org/packages/2a/2c/488776a3dc843f95f86d2f957ca0fc3407d0242b50bede7fad1e339be03f/cryptography-43.0.3-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f996e7268af62598f2fc1204afa98a3b5712313a55c4c9d434aef49cadc91d4", size = 3977754 }, + { url = "https://files.pythonhosted.org/packages/7c/04/2345ca92f7a22f601a9c62961741ef7dd0127c39f7310dffa0041c80f16f/cryptography-43.0.3-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f7b178f11ed3664fd0e995a47ed2b5ff0a12d893e41dd0494f406d1cf555cab7", size = 3749458 }, + { url = 
"https://files.pythonhosted.org/packages/ac/25/e715fa0bc24ac2114ed69da33adf451a38abb6f3f24ec207908112e9ba53/cryptography-43.0.3-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:c2e6fc39c4ab499049df3bdf567f768a723a5e8464816e8f009f121a5a9f4405", size = 3988220 }, + { url = "https://files.pythonhosted.org/packages/21/ce/b9c9ff56c7164d8e2edfb6c9305045fbc0df4508ccfdb13ee66eb8c95b0e/cryptography-43.0.3-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:e1be4655c7ef6e1bbe6b5d0403526601323420bcf414598955968c9ef3eb7d16", size = 3853898 }, + { url = "https://files.pythonhosted.org/packages/2a/33/b3682992ab2e9476b9c81fff22f02c8b0a1e6e1d49ee1750a67d85fd7ed2/cryptography-43.0.3-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:df6b6c6d742395dd77a23ea3728ab62f98379eff8fb61be2744d4679ab678f73", size = 4076592 }, + { url = "https://files.pythonhosted.org/packages/81/1e/ffcc41b3cebd64ca90b28fd58141c5f68c83d48563c88333ab660e002cd3/cryptography-43.0.3-cp39-abi3-win32.whl", hash = "sha256:d56e96520b1020449bbace2b78b603442e7e378a9b3bd68de65c782db1507995", size = 2623145 }, + { url = "https://files.pythonhosted.org/packages/87/5c/3dab83cc4aba1f4b0e733e3f0c3e7d4386440d660ba5b1e3ff995feb734d/cryptography-43.0.3-cp39-abi3-win_amd64.whl", hash = "sha256:0c580952eef9bf68c4747774cde7ec1d85a6e61de97281f2dba83c7d2c806362", size = 3068026 }, + { url = "https://files.pythonhosted.org/packages/6f/db/d8b8a039483f25fc3b70c90bc8f3e1d4497a99358d610c5067bf3bd4f0af/cryptography-43.0.3-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:d03b5621a135bffecad2c73e9f4deb1a0f977b9a8ffe6f8e002bf6c9d07b918c", size = 3144545 }, + { url = "https://files.pythonhosted.org/packages/93/90/116edd5f8ec23b2dc879f7a42443e073cdad22950d3c8ee834e3b8124543/cryptography-43.0.3-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a2a431ee15799d6db9fe80c82b055bae5a752bef645bba795e8e52687c69efe3", size = 3679828 }, + { url = 
"https://files.pythonhosted.org/packages/d8/32/1e1d78b316aa22c0ba6493cc271c1c309969e5aa5c22c830a1d7ce3471e6/cryptography-43.0.3-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:281c945d0e28c92ca5e5930664c1cefd85efe80e5c0d2bc58dd63383fda29f83", size = 3908132 }, + { url = "https://files.pythonhosted.org/packages/91/bb/cd2c13be3332e7af3cdf16154147952d39075b9f61ea5e6b5241bf4bf436/cryptography-43.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:f18c716be16bc1fea8e95def49edf46b82fccaa88587a45f8dc0ff6ab5d8e0a7", size = 2988811 }, + { url = "https://files.pythonhosted.org/packages/cc/fc/ff7c76afdc4f5933b5e99092528d4783d3d1b131960fc8b31eb38e076ca8/cryptography-43.0.3-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4a02ded6cd4f0a5562a8887df8b3bd14e822a90f97ac5e544c162899bc467664", size = 3146844 }, + { url = "https://files.pythonhosted.org/packages/d7/29/a233efb3e98b13d9175dcb3c3146988ec990896c8fa07e8467cce27d5a80/cryptography-43.0.3-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:53a583b6637ab4c4e3591a15bc9db855b8d9dee9a669b550f311480acab6eb08", size = 3681997 }, + { url = "https://files.pythonhosted.org/packages/c0/cf/c9eea7791b961f279fb6db86c3355cfad29a73141f46427af71852b23b95/cryptography-43.0.3-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:1ec0bcf7e17c0c5669d881b1cd38c4972fade441b27bda1051665faaa89bdcaa", size = 3905208 }, + { url = "https://files.pythonhosted.org/packages/21/ea/6c38ca546d5b6dab3874c2b8fc6b1739baac29bacdea31a8c6c0513b3cfa/cryptography-43.0.3-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2ce6fae5bdad59577b44e4dfed356944fbf1d925269114c28be377692643b4ff", size = 2989787 }, +] + +[[package]] +name = "deprecated" +version = "1.2.14" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "wrapt" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/92/14/1e41f504a246fc224d2ac264c227975427a85caf37c3979979edb9b1b232/Deprecated-1.2.14.tar.gz", hash = 
"sha256:e5323eb936458dccc2582dc6f9c322c852a775a27065ff2b0c4970b9d53d01b3", size = 2974416 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/20/8d/778b7d51b981a96554f29136cd59ca7880bf58094338085bcf2a979a0e6a/Deprecated-1.2.14-py2.py3-none-any.whl", hash = "sha256:6fac8b097794a90302bdbb17b9b815e732d3c4720583ff1b198499d78470466c", size = 9561 }, +] + +[[package]] +name = "distlib" +version = "0.3.9" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/0d/dd/1bec4c5ddb504ca60fc29472f3d27e8d4da1257a854e1d96742f15c1d02d/distlib-0.3.9.tar.gz", hash = "sha256:a60f20dea646b8a33f3e7772f74dc0b2d0772d2837ee1342a00645c81edf9403", size = 613923 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/91/a1/cf2472db20f7ce4a6be1253a81cfdf85ad9c7885ffbed7047fb72c24cf87/distlib-0.3.9-py2.py3-none-any.whl", hash = "sha256:47f8c22fd27c27e25a65601af709b38e4f0a45ea4fc2e710f65755fa8caaaf87", size = 468973 }, +] + +[[package]] +name = "docutils" +version = "0.20.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/1f/53/a5da4f2c5739cf66290fac1431ee52aff6851c7c8ffd8264f13affd7bcdd/docutils-0.20.1.tar.gz", hash = "sha256:f08a4e276c3a1583a86dce3e34aba3fe04d02bba2dd51ed16106244e8a923e3b", size = 2058365 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/26/87/f238c0670b94533ac0353a4e2a1a771a0cc73277b88bff23d3ae35a256c1/docutils-0.20.1-py3-none-any.whl", hash = "sha256:96f387a2c5562db4476f09f13bbab2192e764cac08ebbf3a34a95d9b1e4a59d6", size = 572666 }, +] + +[[package]] +name = "editables" +version = "0.5" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/37/4a/986d35164e2033ddfb44515168a281a7986e260d344cf369c3f52d4c3275/editables-0.5.tar.gz", hash = "sha256:309627d9b5c4adc0e668d8c6fa7bac1ba7c8c5d415c2d27f60f081f8e80d1de2", size = 14744 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/6b/be/0f2f4a5e8adc114a02b63d92bf8edbfa24db6fc602fca83c885af2479e0e/editables-0.5-py3-none-any.whl", hash = "sha256:61e5ffa82629e0d8bfe09bc44a07db3c1ab8ed1ce78a6980732870f19b5e7d4c", size = 5098 }, +] + +[[package]] +name = "exceptiongroup" +version = "1.2.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453 }, +] + +[[package]] +name = "execnet" +version = "2.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/bb/ff/b4c0dc78fbe20c3e59c0c7334de0c27eb4001a2b2017999af398bf730817/execnet-2.1.1.tar.gz", hash = "sha256:5189b52c6121c24feae288166ab41b32549c7e2348652736540b9e6e7d4e72e3", size = 166524 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/43/09/2aea36ff60d16dd8879bdb2f5b3ee0ba8d08cbbdcdfe870e695ce3784385/execnet-2.1.1-py3-none-any.whl", hash = "sha256:26dee51f1b80cebd6d0ca8e74dd8745419761d3bef34163928cbebbdc4749fdc", size = 40612 }, +] + +[[package]] +name = "filelock" +version = "3.16.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/9d/db/3ef5bb276dae18d6ec2124224403d1d67bccdbefc17af4cc8f553e341ab1/filelock-3.16.1.tar.gz", hash = "sha256:c249fbfcd5db47e5e2d6d62198e565475ee65e4831e2561c8e313fa7eb961435", size = 18037 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/b9/f8/feced7779d755758a52d1f6635d990b8d98dc0a29fa568bbe0625f18fdf3/filelock-3.16.1-py3-none-any.whl", hash = "sha256:2082e5703d51fbf98ea75855d9d5527e33d8ff23099bec374a134febee6946b0", size = 16163 }, +] + +[[package]] +name = "flit" +version = "3.10.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "docutils" }, + { name = "flit-core" }, + { name = "pip" }, + { name = "requests" }, + { name = "tomli-w" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/03/46/f84b8815d161e7392d124d3de6e5880d1d36a74162a77a5e2839dc3c8c68/flit-3.10.1.tar.gz", hash = "sha256:9c6258ae76d218ce60f9e39a43ca42006a3abcc5c44ea6bb2a1daa13857a8f1a", size = 143162 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b6/ba/d88b8f3253b4af5a88559aede6345975cc2b18ed77bf8daf977bbb9df2c5/flit-3.10.1-py3-none-any.whl", hash = "sha256:d79c19c2caae73cc486d3d827af6a11c1a84b9efdfab8d9683b714ec8d1dc1f1", size = 50683 }, +] + +[[package]] +name = "flit-core" +version = "3.10.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d5/ae/09427bea9227a33ec834ed5461432752fd5d02b14f93dd68406c91684622/flit_core-3.10.1.tar.gz", hash = "sha256:66e5b87874a0d6e39691f0e22f09306736b633548670ad3c09ec9db03c5662f7", size = 42842 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/be/2d/293fe6a58e73df57cc2b5e5cf2b17c6bb4fb5b0c390bab8f1e87bdc62529/flit_core-3.10.1-py3-none-any.whl", hash = "sha256:cb31a76e8b31ad3351bb89e531f64ef2b05d1e65bd939183250bf81ddf4922a8", size = 36389 }, +] + +[[package]] +name = "gitdb" +version = "4.0.11" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "smmap" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/19/0d/bbb5b5ee188dec84647a4664f3e11b06ade2bde568dbd489d9d64adef8ed/gitdb-4.0.11.tar.gz", hash = "sha256:bf5421126136d6d0af55bc1e7c1af1c397a34f5b7bd79e776cd3e89785c2b04b", size = 
394469 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fd/5b/8f0c4a5bb9fd491c277c21eff7ccae71b47d43c4446c9d0c6cff2fe8c2c4/gitdb-4.0.11-py3-none-any.whl", hash = "sha256:81a3407ddd2ee8df444cbacea00e2d038e40150acfa3001696fe0dcf1d3adfa4", size = 62721 }, +] + +[[package]] +name = "gitpython" +version = "3.1.43" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "gitdb" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b6/a1/106fd9fa2dd989b6fb36e5893961f82992cf676381707253e0bf93eb1662/GitPython-3.1.43.tar.gz", hash = "sha256:35f314a9f878467f5453cc1fee295c3e18e52f1b99f10f6cf5b1682e968a9e7c", size = 214149 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e9/bd/cc3a402a6439c15c3d4294333e13042b915bbeab54edc457c723931fed3f/GitPython-3.1.43-py3-none-any.whl", hash = "sha256:eec7ec56b92aad751f9912a73404bc02ba212a23adb2c7098ee668417051a1ff", size = 207337 }, +] + +[[package]] +name = "h11" +version = "0.14.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259 }, +] + +[[package]] +name = "hatch" +version = "1.9.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "hatchling" }, + { name = "httpx" }, + { name = "hyperlink" }, + { name = "keyring" }, + { name = "packaging" }, + { name = "pexpect" }, + { name = "platformdirs" }, + { name = "rich" }, + { name = "shellingham" }, + { name = "tomli-w" }, + { name = "tomlkit" }, + { name = "userpath" }, + { name = 
"virtualenv" }, + { name = "zstandard" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/17/98/63bf6c592b65f67201db292489053b86310cfb107eb095d345398e00cbd3/hatch-1.9.4.tar.gz", hash = "sha256:9bb7d1c4a7a51cc1f9e16394875c940b45fa84b698f0291529316b27d74e7f32", size = 689598 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/05/38/ba8f90264d19ed39851f37a22f2a4be8e9644a1203f114b16647f954bb02/hatch-1.9.4-py3-none-any.whl", hash = "sha256:461eb86b4b46249e38a9a621c7239e61285fd8e14b5a1b5a727c394893a25300", size = 110812 }, +] + +[[package]] +name = "hatchling" +version = "1.21.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "editables" }, + { name = "packaging" }, + { name = "pathspec" }, + { name = "pluggy" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, + { name = "trove-classifiers" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d8/a1/7dd1caa87c0b15c04c6291e25112e5d082cce02ee87f221a8be1d594f857/hatchling-1.21.1.tar.gz", hash = "sha256:bba440453a224e7d4478457fa2e8d8c3633765bafa02975a6b53b9bf917980bc", size = 58059 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3a/bb/40528a09a33845bd7fd75c33b3be7faec3b5c8f15f68a58931da67420fb9/hatchling-1.21.1-py3-none-any.whl", hash = "sha256:21e8c13f8458b219a91cb84e5b61c15bf786695d1c4fabc29e91e78f94bfe892", size = 76740 }, +] + +[[package]] +name = "httpcore" +version = "1.0.6" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b6/44/ed0fa6a17845fb033bd885c03e842f08c1b9406c86a2e60ac1ae1b9206a6/httpcore-1.0.6.tar.gz", hash = "sha256:73f6dbd6eb8c21bbf7ef8efad555481853f5f6acdeaff1edb0694289269ee17f", size = 85180 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/06/89/b161908e2f51be56568184aeb4a880fd287178d176fd1c860d2217f41106/httpcore-1.0.6-py3-none-any.whl", hash = 
"sha256:27b59625743b85577a8c0e10e55b50b5368a4f2cfe8cc7bcfa9cf00829c2682f", size = 78011 }, +] + +[[package]] +name = "httpx" +version = "0.27.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, + { name = "sniffio" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/78/82/08f8c936781f67d9e6b9eeb8a0c8b4e406136ea4c3d1f89a5db71d42e0e6/httpx-0.27.2.tar.gz", hash = "sha256:f7c2be1d2f3c3c3160d441802406b206c2b76f5947b11115e6df10c6c65e66c2", size = 144189 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/56/95/9377bcb415797e44274b51d46e3249eba641711cf3348050f76ee7b15ffc/httpx-0.27.2-py3-none-any.whl", hash = "sha256:7bb2708e112d8fdd7829cd4243970f0c223274051cb35ee80c03301ee29a3df0", size = 76395 }, +] + +[[package]] +name = "hyperlink" +version = "21.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/3a/51/1947bd81d75af87e3bb9e34593a4cf118115a8feb451ce7a69044ef1412e/hyperlink-21.0.0.tar.gz", hash = "sha256:427af957daa58bc909471c6c40f74c5450fa123dd093fc53efd2e91d2705a56b", size = 140743 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6e/aa/8caf6a0a3e62863cbb9dab27135660acba46903b703e224f14f447e57934/hyperlink-21.0.0-py2.py3-none-any.whl", hash = "sha256:e6b14c37ecb73e89c77d78cdb4c2cc8f3fb59a885c5b3f819ff4ed80f25af1b4", size = 74638 }, +] + +[[package]] +name = "identify" +version = "2.6.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/29/bb/25024dbcc93516c492b75919e76f389bac754a3e4248682fba32b250c880/identify-2.6.1.tar.gz", hash = "sha256:91478c5fb7c3aac5ff7bf9b4344f803843dc586832d5f110d672b19aa1984c98", size = 99097 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/7d/0c/4ef72754c050979fdcc06c744715ae70ea37e734816bb6514f79df77a42f/identify-2.6.1-py2.py3-none-any.whl", hash = "sha256:53863bcac7caf8d2ed85bd20312ea5dcfc22226800f6d6881f232d861db5a8f0", size = 98972 }, +] + +[[package]] +name = "idna" +version = "3.10" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 }, +] + +[[package]] +name = "importlib-metadata" +version = "8.5.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "zipp" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/cd/12/33e59336dca5be0c398a7482335911a33aa0e20776128f038019f1a95f1b/importlib_metadata-8.5.0.tar.gz", hash = "sha256:71522656f0abace1d072b9e5481a48f07c138e00f079c38c8f883823f9c26bd7", size = 55304 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a0/d9/a1e041c5e7caa9a05c925f4bdbdfb7f006d1f74996af53467bc394c97be7/importlib_metadata-8.5.0-py3-none-any.whl", hash = "sha256:45e54197d28b7a7f1559e60b95e7c567032b602131fbd588f1497f47880aa68b", size = 26514 }, +] + +[[package]] +name = "importlib-resources" +version = "6.4.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "zipp", marker = "python_full_version < '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/98/be/f3e8c6081b684f176b761e6a2fef02a0be939740ed6f54109a2951d806f3/importlib_resources-6.4.5.tar.gz", hash = "sha256:980862a1d16c9e147a59603677fa2aa5fd82b87f223b6cb870695bcfce830065", size = 43372 } +wheels = 
[ + { url = "https://files.pythonhosted.org/packages/e1/6a/4604f9ae2fa62ef47b9de2fa5ad599589d28c9fd1d335f32759813dfa91e/importlib_resources-6.4.5-py3-none-any.whl", hash = "sha256:ac29d5f956f01d5e4bb63102a5a19957f1b9175e45649977264a1416783bb717", size = 36115 }, +] + +[[package]] +name = "iniconfig" +version = "2.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 }, +] + +[[package]] +name = "inputimeout" +version = "1.0.4" +source = { registry = "https://pypi.org/simple" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/97/9c/1646ca469bc2dc299ac393c8d31136c6c22a35ca1e373fa462ac01100d37/inputimeout-1.0.4-py3-none-any.whl", hash = "sha256:f4e23d27753cfc25268eefc8d52a3edc46280ad831d226617c51882423475a43", size = 4639 }, +] + +[[package]] +name = "jaraco-classes" +version = "3.4.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "more-itertools" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/06/c0/ed4a27bc5571b99e3cff68f8a9fa5b56ff7df1c2251cc715a652ddd26402/jaraco.classes-3.4.0.tar.gz", hash = "sha256:47a024b51d0239c0dd8c8540c6c7f484be3b8fcf0b2d85c13825780d3b3f3acd", size = 11780 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7f/66/b15ce62552d84bbfcec9a4873ab79d993a1dd4edb922cbfccae192bd5b5f/jaraco.classes-3.4.0-py3-none-any.whl", hash = "sha256:f662826b6bed8cace05e7ff873ce0f9283b5c924470fe664fff1c2f00f581790", size = 6777 }, +] + +[[package]] +name = "jaraco-context" +version = "6.0.1" 
+source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "backports-tarfile", marker = "python_full_version < '3.12'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/df/ad/f3777b81bf0b6e7bc7514a1656d3e637b2e8e15fab2ce3235730b3e7a4e6/jaraco_context-6.0.1.tar.gz", hash = "sha256:9bae4ea555cf0b14938dc0aee7c9f32ed303aa20a3b73e7dc80111628792d1b3", size = 13912 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ff/db/0c52c4cf5e4bd9f5d7135ec7669a3a767af21b3a308e1ed3674881e52b62/jaraco.context-6.0.1-py3-none-any.whl", hash = "sha256:f797fc481b490edb305122c9181830a3a5b76d84ef6d1aef2fb9b47ab956f9e4", size = 6825 }, +] + +[[package]] +name = "jaraco-functools" +version = "4.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "more-itertools" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ab/23/9894b3df5d0a6eb44611c36aec777823fc2e07740dabbd0b810e19594013/jaraco_functools-4.1.0.tar.gz", hash = "sha256:70f7e0e2ae076498e212562325e805204fc092d7b4c17e0e86c959e249701a9d", size = 19159 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9f/4f/24b319316142c44283d7540e76c7b5a6dbd5db623abd86bb7b3491c21018/jaraco.functools-4.1.0-py3-none-any.whl", hash = "sha256:ad159f13428bc4acbf5541ad6dec511f91573b90fba04df61dafa2a1231cf649", size = 10187 }, +] + +[[package]] +name = "jeepney" +version = "0.8.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d6/f4/154cf374c2daf2020e05c3c6a03c91348d59b23c5366e968feb198306fdf/jeepney-0.8.0.tar.gz", hash = "sha256:5efe48d255973902f6badc3ce55e2aa6c5c3b3bc642059ef3a91247bcfcc5806", size = 106005 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ae/72/2a1e2290f1ab1e06f71f3d0f1646c9e4634e70e1d37491535e19266e8dc9/jeepney-0.8.0-py3-none-any.whl", hash = "sha256:c0a454ad016ca575060802ee4d590dd912e35c122fa04e70306de3d076cce755", size = 48435 }, +] + +[[package]] +name = 
"jinja2" +version = "3.1.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ed/55/39036716d19cab0747a5020fc7e907f362fbf48c984b14e62127f7e68e5d/jinja2-3.1.4.tar.gz", hash = "sha256:4a3aee7acbbe7303aede8e9648d13b8bf88a429282aa6122a993f0ac800cb369", size = 240245 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/31/80/3a54838c3fb461f6fec263ebf3a3a41771bd05190238de3486aae8540c36/jinja2-3.1.4-py3-none-any.whl", hash = "sha256:bc5dd2abb727a5319567b7a813e6a2e7318c39f4f487cfe6c89c6f9c7d25197d", size = 133271 }, +] + +[[package]] +name = "jsonschema" +version = "4.23.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = "importlib-resources", marker = "python_full_version < '3.9'" }, + { name = "jsonschema-specifications" }, + { name = "pkgutil-resolve-name", marker = "python_full_version < '3.9'" }, + { name = "referencing" }, + { name = "rpds-py" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/38/2e/03362ee4034a4c917f697890ccd4aec0800ccf9ded7f511971c75451deec/jsonschema-4.23.0.tar.gz", hash = "sha256:d71497fef26351a33265337fa77ffeb82423f3ea21283cd9467bb03999266bc4", size = 325778 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/69/4a/4f9dbeb84e8850557c02365a0eee0649abe5eb1d84af92a25731c6c0f922/jsonschema-4.23.0-py3-none-any.whl", hash = "sha256:fbadb6f8b144a8f8cf9f0b89ba94501d143e50411a1278633f56a7acf7fd5566", size = 88462 }, +] + +[[package]] +name = "jsonschema-specifications" +version = "2023.12.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "importlib-resources", marker = "python_full_version < '3.9'" }, + { name = "referencing" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f8/b9/cc0cc592e7c195fb8a650c1d5990b10175cf13b4c97465c72ec841de9e4b/jsonschema_specifications-2023.12.1.tar.gz", hash = 
"sha256:48a76787b3e70f5ed53f1160d2b81f586e4ca6d1548c5de7085d1682674764cc", size = 13983 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ee/07/44bd408781594c4d0a027666ef27fab1e441b109dc3b76b4f836f8fd04fe/jsonschema_specifications-2023.12.1-py3-none-any.whl", hash = "sha256:87e4fdf3a94858b8a2ba2778d9ba57d8a9cafca7c7489c46ba0d30a8bc6a9c3c", size = 18482 }, +] + +[[package]] +name = "keyring" +version = "25.5.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "importlib-metadata", marker = "python_full_version < '3.12'" }, + { name = "importlib-resources", marker = "python_full_version < '3.9'" }, + { name = "jaraco-classes" }, + { name = "jaraco-context" }, + { name = "jaraco-functools" }, + { name = "jeepney", marker = "sys_platform == 'linux'" }, + { name = "pywin32-ctypes", marker = "sys_platform == 'win32'" }, + { name = "secretstorage", marker = "sys_platform == 'linux'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f6/24/64447b13df6a0e2797b586dad715766d756c932ce8ace7f67bd384d76ae0/keyring-25.5.0.tar.gz", hash = "sha256:4c753b3ec91717fe713c4edd522d625889d8973a349b0e582622f49766de58e6", size = 62675 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/32/c9/353c156fa2f057e669106e5d6bcdecf85ef8d3536ce68ca96f18dc7b6d6f/keyring-25.5.0-py3-none-any.whl", hash = "sha256:e67f8ac32b04be4714b42fe84ce7dad9c40985b9ca827c592cc303e7c26d9741", size = 39096 }, +] + +[[package]] +name = "markdown-it-py" +version = "3.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mdurl" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528 }, +] + +[[package]] +name = "markupsafe" +version = "2.1.5" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/87/5b/aae44c6655f3801e81aa3eef09dbbf012431987ba564d7231722f68df02d/MarkupSafe-2.1.5.tar.gz", hash = "sha256:d283d37a890ba4c1ae73ffadf8046435c76e7bc2247bbb63c00bd1a709c6544b", size = 19384 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e4/54/ad5eb37bf9d51800010a74e4665425831a9db4e7c4e0fde4352e391e808e/MarkupSafe-2.1.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a17a92de5231666cfbe003f0e4b9b3a7ae3afb1ec2845aadc2bacc93ff85febc", size = 18206 }, + { url = "https://files.pythonhosted.org/packages/6a/4a/a4d49415e600bacae038c67f9fecc1d5433b9d3c71a4de6f33537b89654c/MarkupSafe-2.1.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:72b6be590cc35924b02c78ef34b467da4ba07e4e0f0454a2c5907f473fc50ce5", size = 14079 }, + { url = "https://files.pythonhosted.org/packages/0a/7b/85681ae3c33c385b10ac0f8dd025c30af83c78cec1c37a6aa3b55e67f5ec/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e61659ba32cf2cf1481e575d0462554625196a1f2fc06a1c777d3f48e8865d46", size = 26620 }, + { url = "https://files.pythonhosted.org/packages/7c/52/2b1b570f6b8b803cef5ac28fdf78c0da318916c7d2fe9402a84d591b394c/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2174c595a0d73a3080ca3257b40096db99799265e1c27cc5a610743acd86d62f", size = 25818 }, + { url = "https://files.pythonhosted.org/packages/29/fe/a36ba8c7ca55621620b2d7c585313efd10729e63ef81e4e61f52330da781/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:ae2ad8ae6ebee9d2d94b17fb62763125f3f374c25618198f40cbb8b525411900", size = 25493 }, + { url = "https://files.pythonhosted.org/packages/60/ae/9c60231cdfda003434e8bd27282b1f4e197ad5a710c14bee8bea8a9ca4f0/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:075202fa5b72c86ad32dc7d0b56024ebdbcf2048c0ba09f1cde31bfdd57bcfff", size = 30630 }, + { url = "https://files.pythonhosted.org/packages/65/dc/1510be4d179869f5dafe071aecb3f1f41b45d37c02329dfba01ff59e5ac5/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:598e3276b64aff0e7b3451b72e94fa3c238d452e7ddcd893c3ab324717456bad", size = 29745 }, + { url = "https://files.pythonhosted.org/packages/30/39/8d845dd7d0b0613d86e0ef89549bfb5f61ed781f59af45fc96496e897f3a/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fce659a462a1be54d2ffcacea5e3ba2d74daa74f30f5f143fe0c58636e355fdd", size = 30021 }, + { url = "https://files.pythonhosted.org/packages/c7/5c/356a6f62e4f3c5fbf2602b4771376af22a3b16efa74eb8716fb4e328e01e/MarkupSafe-2.1.5-cp310-cp310-win32.whl", hash = "sha256:d9fad5155d72433c921b782e58892377c44bd6252b5af2f67f16b194987338a4", size = 16659 }, + { url = "https://files.pythonhosted.org/packages/69/48/acbf292615c65f0604a0c6fc402ce6d8c991276e16c80c46a8f758fbd30c/MarkupSafe-2.1.5-cp310-cp310-win_amd64.whl", hash = "sha256:bf50cd79a75d181c9181df03572cdce0fbb75cc353bc350712073108cba98de5", size = 17213 }, + { url = "https://files.pythonhosted.org/packages/11/e7/291e55127bb2ae67c64d66cef01432b5933859dfb7d6949daa721b89d0b3/MarkupSafe-2.1.5-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:629ddd2ca402ae6dbedfceeba9c46d5f7b2a61d9749597d4307f943ef198fc1f", size = 18219 }, + { url = "https://files.pythonhosted.org/packages/6b/cb/aed7a284c00dfa7c0682d14df85ad4955a350a21d2e3b06d8240497359bf/MarkupSafe-2.1.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5b7b716f97b52c5a14bffdf688f971b2d5ef4029127f1ad7a513973cfd818df2", size = 14098 }, + { url = 
"https://files.pythonhosted.org/packages/1c/cf/35fe557e53709e93feb65575c93927942087e9b97213eabc3fe9d5b25a55/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ec585f69cec0aa07d945b20805be741395e28ac1627333b1c5b0105962ffced", size = 29014 }, + { url = "https://files.pythonhosted.org/packages/97/18/c30da5e7a0e7f4603abfc6780574131221d9148f323752c2755d48abad30/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b91c037585eba9095565a3556f611e3cbfaa42ca1e865f7b8015fe5c7336d5a5", size = 28220 }, + { url = "https://files.pythonhosted.org/packages/0c/40/2e73e7d532d030b1e41180807a80d564eda53babaf04d65e15c1cf897e40/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7502934a33b54030eaf1194c21c692a534196063db72176b0c4028e140f8f32c", size = 27756 }, + { url = "https://files.pythonhosted.org/packages/18/46/5dca760547e8c59c5311b332f70605d24c99d1303dd9a6e1fc3ed0d73561/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0e397ac966fdf721b2c528cf028494e86172b4feba51d65f81ffd65c63798f3f", size = 33988 }, + { url = "https://files.pythonhosted.org/packages/6d/c5/27febe918ac36397919cd4a67d5579cbbfa8da027fa1238af6285bb368ea/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:c061bb86a71b42465156a3ee7bd58c8c2ceacdbeb95d05a99893e08b8467359a", size = 32718 }, + { url = "https://files.pythonhosted.org/packages/f8/81/56e567126a2c2bc2684d6391332e357589a96a76cb9f8e5052d85cb0ead8/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:3a57fdd7ce31c7ff06cdfbf31dafa96cc533c21e443d57f5b1ecc6cdc668ec7f", size = 33317 }, + { url = "https://files.pythonhosted.org/packages/00/0b/23f4b2470accb53285c613a3ab9ec19dc944eaf53592cb6d9e2af8aa24cc/MarkupSafe-2.1.5-cp311-cp311-win32.whl", hash = "sha256:397081c1a0bfb5124355710fe79478cdbeb39626492b15d399526ae53422b906", size = 16670 }, + { url = 
"https://files.pythonhosted.org/packages/b7/a2/c78a06a9ec6d04b3445a949615c4c7ed86a0b2eb68e44e7541b9d57067cc/MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl", hash = "sha256:2b7c57a4dfc4f16f7142221afe5ba4e093e09e728ca65c51f5620c9aaeb9a617", size = 17224 }, + { url = "https://files.pythonhosted.org/packages/53/bd/583bf3e4c8d6a321938c13f49d44024dbe5ed63e0a7ba127e454a66da974/MarkupSafe-2.1.5-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:8dec4936e9c3100156f8a2dc89c4b88d5c435175ff03413b443469c7c8c5f4d1", size = 18215 }, + { url = "https://files.pythonhosted.org/packages/48/d6/e7cd795fc710292c3af3a06d80868ce4b02bfbbf370b7cee11d282815a2a/MarkupSafe-2.1.5-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:3c6b973f22eb18a789b1460b4b91bf04ae3f0c4234a0a6aa6b0a92f6f7b951d4", size = 14069 }, + { url = "https://files.pythonhosted.org/packages/51/b5/5d8ec796e2a08fc814a2c7d2584b55f889a55cf17dd1a90f2beb70744e5c/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ac07bad82163452a6884fe8fa0963fb98c2346ba78d779ec06bd7a6262132aee", size = 29452 }, + { url = "https://files.pythonhosted.org/packages/0a/0d/2454f072fae3b5a137c119abf15465d1771319dfe9e4acbb31722a0fff91/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f5dfb42c4604dddc8e4305050aa6deb084540643ed5804d7455b5df8fe16f5e5", size = 28462 }, + { url = "https://files.pythonhosted.org/packages/2d/75/fd6cb2e68780f72d47e6671840ca517bda5ef663d30ada7616b0462ad1e3/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ea3d8a3d18833cf4304cd2fc9cbb1efe188ca9b5efef2bdac7adc20594a0e46b", size = 27869 }, + { url = "https://files.pythonhosted.org/packages/b0/81/147c477391c2750e8fc7705829f7351cf1cd3be64406edcf900dc633feb2/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:d050b3361367a06d752db6ead6e7edeb0009be66bc3bae0ee9d97fb326badc2a", size = 33906 }, + { url = 
"https://files.pythonhosted.org/packages/8b/ff/9a52b71839d7a256b563e85d11050e307121000dcebc97df120176b3ad93/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bec0a414d016ac1a18862a519e54b2fd0fc8bbfd6890376898a6c0891dd82e9f", size = 32296 }, + { url = "https://files.pythonhosted.org/packages/88/07/2dc76aa51b481eb96a4c3198894f38b480490e834479611a4053fbf08623/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:58c98fee265677f63a4385256a6d7683ab1832f3ddd1e66fe948d5880c21a169", size = 33038 }, + { url = "https://files.pythonhosted.org/packages/96/0c/620c1fb3661858c0e37eb3cbffd8c6f732a67cd97296f725789679801b31/MarkupSafe-2.1.5-cp312-cp312-win32.whl", hash = "sha256:8590b4ae07a35970728874632fed7bd57b26b0102df2d2b233b6d9d82f6c62ad", size = 16572 }, + { url = "https://files.pythonhosted.org/packages/3f/14/c3554d512d5f9100a95e737502f4a2323a1959f6d0d01e0d0997b35f7b10/MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl", hash = "sha256:823b65d8706e32ad2df51ed89496147a42a2a6e01c13cfb6ffb8b1e92bc910bb", size = 17127 }, + { url = "https://files.pythonhosted.org/packages/f8/ff/2c942a82c35a49df5de3a630ce0a8456ac2969691b230e530ac12314364c/MarkupSafe-2.1.5-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:656f7526c69fac7f600bd1f400991cc282b417d17539a1b228617081106feb4a", size = 18192 }, + { url = "https://files.pythonhosted.org/packages/4f/14/6f294b9c4f969d0c801a4615e221c1e084722ea6114ab2114189c5b8cbe0/MarkupSafe-2.1.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:97cafb1f3cbcd3fd2b6fbfb99ae11cdb14deea0736fc2b0952ee177f2b813a46", size = 14072 }, + { url = "https://files.pythonhosted.org/packages/81/d4/fd74714ed30a1dedd0b82427c02fa4deec64f173831ec716da11c51a50aa/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f3fbcb7ef1f16e48246f704ab79d79da8a46891e2da03f8783a5b6fa41a9532", size = 26928 }, + { url = 
"https://files.pythonhosted.org/packages/c7/bd/50319665ce81bb10e90d1cf76f9e1aa269ea6f7fa30ab4521f14d122a3df/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa9db3f79de01457b03d4f01b34cf91bc0048eb2c3846ff26f66687c2f6d16ab", size = 26106 }, + { url = "https://files.pythonhosted.org/packages/4c/6f/f2b0f675635b05f6afd5ea03c094557bdb8622fa8e673387444fe8d8e787/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffee1f21e5ef0d712f9033568f8344d5da8cc2869dbd08d87c84656e6a2d2f68", size = 25781 }, + { url = "https://files.pythonhosted.org/packages/51/e0/393467cf899b34a9d3678e78961c2c8cdf49fb902a959ba54ece01273fb1/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:5dedb4db619ba5a2787a94d877bc8ffc0566f92a01c0ef214865e54ecc9ee5e0", size = 30518 }, + { url = "https://files.pythonhosted.org/packages/f6/02/5437e2ad33047290dafced9df741d9efc3e716b75583bbd73a9984f1b6f7/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:30b600cf0a7ac9234b2638fbc0fb6158ba5bdcdf46aeb631ead21248b9affbc4", size = 29669 }, + { url = "https://files.pythonhosted.org/packages/0e/7d/968284145ffd9d726183ed6237c77938c021abacde4e073020f920e060b2/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8dd717634f5a044f860435c1d8c16a270ddf0ef8588d4887037c5028b859b0c3", size = 29933 }, + { url = "https://files.pythonhosted.org/packages/bf/f3/ecb00fc8ab02b7beae8699f34db9357ae49d9f21d4d3de6f305f34fa949e/MarkupSafe-2.1.5-cp38-cp38-win32.whl", hash = "sha256:daa4ee5a243f0f20d528d939d06670a298dd39b1ad5f8a72a4275124a7819eff", size = 16656 }, + { url = "https://files.pythonhosted.org/packages/92/21/357205f03514a49b293e214ac39de01fadd0970a6e05e4bf1ddd0ffd0881/MarkupSafe-2.1.5-cp38-cp38-win_amd64.whl", hash = "sha256:619bc166c4f2de5caa5a633b8b7326fbe98e0ccbfacabd87268a2b15ff73a029", size = 17206 }, + { url = 
"https://files.pythonhosted.org/packages/0f/31/780bb297db036ba7b7bbede5e1d7f1e14d704ad4beb3ce53fb495d22bc62/MarkupSafe-2.1.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:7a68b554d356a91cce1236aa7682dc01df0edba8d043fd1ce607c49dd3c1edcf", size = 18193 }, + { url = "https://files.pythonhosted.org/packages/6c/77/d77701bbef72892affe060cdacb7a2ed7fd68dae3b477a8642f15ad3b132/MarkupSafe-2.1.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:db0b55e0f3cc0be60c1f19efdde9a637c32740486004f20d1cff53c3c0ece4d2", size = 14073 }, + { url = "https://files.pythonhosted.org/packages/d9/a7/1e558b4f78454c8a3a0199292d96159eb4d091f983bc35ef258314fe7269/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e53af139f8579a6d5f7b76549125f0d94d7e630761a2111bc431fd820e163b8", size = 26486 }, + { url = "https://files.pythonhosted.org/packages/5f/5a/360da85076688755ea0cceb92472923086993e86b5613bbae9fbc14136b0/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:17b950fccb810b3293638215058e432159d2b71005c74371d784862b7e4683f3", size = 25685 }, + { url = "https://files.pythonhosted.org/packages/6a/18/ae5a258e3401f9b8312f92b028c54d7026a97ec3ab20bfaddbdfa7d8cce8/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c31f53cdae6ecfa91a77820e8b151dba54ab528ba65dfd235c80b086d68a465", size = 25338 }, + { url = "https://files.pythonhosted.org/packages/0b/cc/48206bd61c5b9d0129f4d75243b156929b04c94c09041321456fd06a876d/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:bff1b4290a66b490a2f4719358c0cdcd9bafb6b8f061e45c7a2460866bf50c2e", size = 30439 }, + { url = "https://files.pythonhosted.org/packages/d1/06/a41c112ab9ffdeeb5f77bc3e331fdadf97fa65e52e44ba31880f4e7f983c/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:bc1667f8b83f48511b94671e0e441401371dfd0f0a795c7daa4a3cd1dde55bea", size = 29531 }, + { url = 
"https://files.pythonhosted.org/packages/02/8c/ab9a463301a50dab04d5472e998acbd4080597abc048166ded5c7aa768c8/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5049256f536511ee3f7e1b3f87d1d1209d327e818e6ae1365e8653d7e3abb6a6", size = 29823 }, + { url = "https://files.pythonhosted.org/packages/bc/29/9bc18da763496b055d8e98ce476c8e718dcfd78157e17f555ce6dd7d0895/MarkupSafe-2.1.5-cp39-cp39-win32.whl", hash = "sha256:00e046b6dd71aa03a41079792f8473dc494d564611a8f89bbbd7cb93295ebdcf", size = 16658 }, + { url = "https://files.pythonhosted.org/packages/f6/f8/4da07de16f10551ca1f640c92b5f316f9394088b183c6a57183df6de5ae4/MarkupSafe-2.1.5-cp39-cp39-win_amd64.whl", hash = "sha256:fa173ec60341d6bb97a89f5ea19c85c5643c1e7dedebc22f5181eb73573142c5", size = 17211 }, +] + +[[package]] +name = "mdurl" +version = "0.1.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979 }, +] + +[[package]] +name = "more-itertools" +version = "10.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/51/78/65922308c4248e0eb08ebcbe67c95d48615cc6f27854b6f2e57143e9178f/more-itertools-10.5.0.tar.gz", hash = "sha256:5482bfef7849c25dc3c6dd53a6173ae4795da2a41a80faea6700d9f5846c5da6", size = 121020 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/48/7e/3a64597054a70f7c86eb0a7d4fc315b8c1ab932f64883a297bdffeb5f967/more_itertools-10.5.0-py3-none-any.whl", hash = "sha256:037b0d3203ce90cca8ab1defbbdac29d5f993fc20131f3664dc8d6acfa872aef", 
size = 60952 }, +] + +[[package]] +name = "mypy-extensions" +version = "1.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/98/a4/1ab47638b92648243faf97a5aeb6ea83059cc3624972ab6b8d2316078d3f/mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782", size = 4433 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2a/e2/5d3f6ada4297caebe1a2add3b126fe800c96f56dbe5d1988a2cbe0b267aa/mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d", size = 4695 }, +] + +[[package]] +name = "nh3" +version = "0.2.18" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/62/73/10df50b42ddb547a907deeb2f3c9823022580a7a47281e8eae8e003a9639/nh3-0.2.18.tar.gz", hash = "sha256:94a166927e53972a9698af9542ace4e38b9de50c34352b962f4d9a7d4c927af4", size = 15028 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b3/89/1daff5d9ba5a95a157c092c7c5f39b8dd2b1ddb4559966f808d31cfb67e0/nh3-0.2.18-cp37-abi3-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:14c5a72e9fe82aea5fe3072116ad4661af5cf8e8ff8fc5ad3450f123e4925e86", size = 1374474 }, + { url = "https://files.pythonhosted.org/packages/2c/b6/42fc3c69cabf86b6b81e4c051a9b6e249c5ba9f8155590222c2622961f58/nh3-0.2.18-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:7b7c2a3c9eb1a827d42539aa64091640bd275b81e097cd1d8d82ef91ffa2e811", size = 694573 }, + { url = "https://files.pythonhosted.org/packages/45/b9/833f385403abaf0023c6547389ec7a7acf141ddd9d1f21573723a6eab39a/nh3-0.2.18-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42c64511469005058cd17cc1537578eac40ae9f7200bedcfd1fc1a05f4f8c200", size = 844082 }, + { url = 
"https://files.pythonhosted.org/packages/05/2b/85977d9e11713b5747595ee61f381bc820749daf83f07b90b6c9964cf932/nh3-0.2.18-cp37-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0411beb0589eacb6734f28d5497ca2ed379eafab8ad8c84b31bb5c34072b7164", size = 782460 }, + { url = "https://files.pythonhosted.org/packages/72/f2/5c894d5265ab80a97c68ca36f25c8f6f0308abac649aaf152b74e7e854a8/nh3-0.2.18-cp37-abi3-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:5f36b271dae35c465ef5e9090e1fdaba4a60a56f0bb0ba03e0932a66f28b9189", size = 879827 }, + { url = "https://files.pythonhosted.org/packages/ab/a7/375afcc710dbe2d64cfbd69e31f82f3e423d43737258af01f6a56d844085/nh3-0.2.18-cp37-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:34c03fa78e328c691f982b7c03d4423bdfd7da69cd707fe572f544cf74ac23ad", size = 841080 }, + { url = "https://files.pythonhosted.org/packages/c2/a8/3bb02d0c60a03ad3a112b76c46971e9480efa98a8946677b5a59f60130ca/nh3-0.2.18-cp37-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:19aaba96e0f795bd0a6c56291495ff59364f4300d4a39b29a0abc9cb3774a84b", size = 924144 }, + { url = "https://files.pythonhosted.org/packages/1b/63/6ab90d0e5225ab9780f6c9fb52254fa36b52bb7c188df9201d05b647e5e1/nh3-0.2.18-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de3ceed6e661954871d6cd78b410213bdcb136f79aafe22aa7182e028b8c7307", size = 769192 }, + { url = "https://files.pythonhosted.org/packages/a4/17/59391c28580e2c32272761629893e761442fc7666da0b1cdb479f3b67b88/nh3-0.2.18-cp37-abi3-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6955369e4d9f48f41e3f238a9e60f9410645db7e07435e62c6a9ea6135a4907f", size = 791042 }, + { url = "https://files.pythonhosted.org/packages/a3/da/0c4e282bc3cff4a0adf37005fa1fb42257673fbc1bbf7d1ff639ec3d255a/nh3-0.2.18-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:f0eca9ca8628dbb4e916ae2491d72957fdd35f7a5d326b7032a345f111ac07fe", size = 1010073 }, + { url = 
"https://files.pythonhosted.org/packages/de/81/c291231463d21da5f8bba82c8167a6d6893cc5419b0639801ee5d3aeb8a9/nh3-0.2.18-cp37-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:3a157ab149e591bb638a55c8c6bcb8cdb559c8b12c13a8affaba6cedfe51713a", size = 1029782 }, + { url = "https://files.pythonhosted.org/packages/63/1d/842fed85cf66c973be0aed8770093d6a04741f65e2c388ddd4c07fd3296e/nh3-0.2.18-cp37-abi3-musllinux_1_2_i686.whl", hash = "sha256:c8b3a1cebcba9b3669ed1a84cc65bf005728d2f0bc1ed2a6594a992e817f3a50", size = 942504 }, + { url = "https://files.pythonhosted.org/packages/eb/61/73a007c74c37895fdf66e0edcd881f5eaa17a348ff02f4bb4bc906d61085/nh3-0.2.18-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:36c95d4b70530b320b365659bb5034341316e6a9b30f0b25fa9c9eff4c27a204", size = 941541 }, + { url = "https://files.pythonhosted.org/packages/78/48/54a788fc9428e481b2f58e0cd8564f6c74ffb6e9ef73d39e8acbeae8c629/nh3-0.2.18-cp37-abi3-win32.whl", hash = "sha256:a7f1b5b2c15866f2db413a3649a8fe4fd7b428ae58be2c0f6bca5eefd53ca2be", size = 573750 }, + { url = "https://files.pythonhosted.org/packages/26/8d/53c5b19c4999bdc6ba95f246f4ef35ca83d7d7423e5e38be43ad66544e5d/nh3-0.2.18-cp37-abi3-win_amd64.whl", hash = "sha256:8ce0f819d2f1933953fca255db2471ad58184a60508f03e6285e5114b6254844", size = 579012 }, +] + +[[package]] +name = "nodeenv" +version = "1.9.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 }, +] + +[[package]] +name = "packaging" +version = "24.1" +source = { registry = 
"https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/51/65/50db4dda066951078f0a96cf12f4b9ada6e4b811516bf0262c0f4f7064d4/packaging-24.1.tar.gz", hash = "sha256:026ed72c8ed3fcce5bf8950572258698927fd1dbda10a5e981cdf0ac37f4f002", size = 148788 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/08/aa/cc0199a5f0ad350994d660967a8efb233fe0416e4639146c089643407ce6/packaging-24.1-py3-none-any.whl", hash = "sha256:5b8f2217dbdbd2f7f384c41c628544e6d52f2d0f53c6d0c3ea61aa5d1d7ff124", size = 53985 }, +] + +[[package]] +name = "pathspec" +version = "0.12.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191 }, +] + +[[package]] +name = "pexpect" +version = "4.9.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "ptyprocess" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/42/92/cc564bf6381ff43ce1f4d06852fc19a2f11d180f23dc32d9588bee2f149d/pexpect-4.9.0.tar.gz", hash = "sha256:ee7d41123f3c9911050ea2c2dac107568dc43b2d3b0c7557a33212c398ead30f", size = 166450 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9e/c3/059298687310d527a58bb01f3b1965787ee3b40dce76752eda8b44e9a2c5/pexpect-4.9.0-py2.py3-none-any.whl", hash = "sha256:7236d1e080e4936be2dc3e326cec0af72acf9212a7e1d060210e70a47e253523", size = 63772 }, +] + +[[package]] +name = "pip" +version = "24.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/f4/b1/b422acd212ad7eedddaf7981eee6e5de085154ff726459cf2da7c5a184c1/pip-24.3.1.tar.gz", hash = "sha256:ebcb60557f2aefabc2e0f918751cd24ea0d56d8ec5445fe1807f1d2109660b99", size = 1931073 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ef/7d/500c9ad20238fcfcb4cb9243eede163594d7020ce87bd9610c9e02771876/pip-24.3.1-py3-none-any.whl", hash = "sha256:3790624780082365f47549d032f3770eeb2b1e8bd1f7b2e02dace1afa361b4ed", size = 1822182 }, +] + +[[package]] +name = "pipx" +version = "1.7.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "argcomplete" }, + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "packaging" }, + { name = "platformdirs" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, + { name = "userpath" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/17/21/dd6b9a9c4f0cb659ce3dad991f0e8dde852b2c81922224ef77df4222ab7a/pipx-1.7.1.tar.gz", hash = "sha256:762de134e16a462be92645166d225ecef446afaef534917f5f70008d63584360", size = 291889 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/35/af/66db02a214590a841bcd1df1f02f7ef818dc3f43487acddab0b8c40b25d2/pipx-1.7.1-py3-none-any.whl", hash = "sha256:3933c43bb344e649cb28e10d357e0967ce8572f1c19caf90cf39ae95c2a0afaf", size = 78749 }, +] + +[[package]] +name = "pkginfo" +version = "1.10.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/2f/72/347ec5be4adc85c182ed2823d8d1c7b51e13b9a6b0c1aae59582eca652df/pkginfo-1.10.0.tar.gz", hash = "sha256:5df73835398d10db79f8eecd5cd86b1f6d29317589ea70796994d49399af6297", size = 378457 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/56/09/054aea9b7534a15ad38a363a2bd974c20646ab1582a387a95b8df1bfea1c/pkginfo-1.10.0-py3-none-any.whl", hash = "sha256:889a6da2ed7ffc58ab5b900d888ddce90bce912f2d2de1dc1c26f4cb9fe65097", size = 30392 }, +] + +[[package]] +name = 
"pkgutil-resolve-name" +version = "1.3.10" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/70/f2/f2891a9dc37398696ddd945012b90ef8d0a034f0012e3f83c3f7a70b0f79/pkgutil_resolve_name-1.3.10.tar.gz", hash = "sha256:357d6c9e6a755653cfd78893817c0853af365dd51ec97f3d358a819373bbd174", size = 5054 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c9/5c/3d4882ba113fd55bdba9326c1e4c62a15e674a2501de4869e6bd6301f87e/pkgutil_resolve_name-1.3.10-py3-none-any.whl", hash = "sha256:ca27cc078d25c5ad71a9de0a7a330146c4e014c2462d9af19c6b828280649c5e", size = 4734 }, +] + +[[package]] +name = "platformdirs" +version = "4.3.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/13/fc/128cc9cb8f03208bdbf93d3aa862e16d376844a14f9a0ce5cf4507372de4/platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907", size = 21302 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3c/a6/bc1012356d8ece4d66dd75c4b9fc6c1f6650ddd5991e421177d9f8f671be/platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb", size = 18439 }, +] + +[[package]] +name = "pluggy" +version = "1.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 }, +] + +[[package]] +name = "pre-commit" +version = "3.5.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = 
"cfgv" }, + { name = "identify" }, + { name = "nodeenv" }, + { name = "pyyaml" }, + { name = "virtualenv" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/04/b3/4ae08d21eb097162f5aad37f4585f8069a86402ed7f5362cc9ae097f9572/pre_commit-3.5.0.tar.gz", hash = "sha256:5804465c675b659b0862f07907f96295d490822a450c4c40e747d0b1c6ebcb32", size = 177079 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6c/75/526915fedf462e05eeb1c75ceaf7e3f9cde7b5ce6f62740fe5f7f19a0050/pre_commit-3.5.0-py2.py3-none-any.whl", hash = "sha256:841dc9aef25daba9a0238cd27984041fa0467b4199fc4852e27950664919f660", size = 203698 }, +] + +[[package]] +name = "psutil" +version = "6.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/26/10/2a30b13c61e7cf937f4adf90710776b7918ed0a9c434e2c38224732af310/psutil-6.1.0.tar.gz", hash = "sha256:353815f59a7f64cdaca1c0307ee13558a0512f6db064e92fe833784f08539c7a", size = 508565 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/01/9e/8be43078a171381953cfee33c07c0d628594b5dbfc5157847b85022c2c1b/psutil-6.1.0-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:6e2dcd475ce8b80522e51d923d10c7871e45f20918e027ab682f94f1c6351688", size = 247762 }, + { url = "https://files.pythonhosted.org/packages/1d/cb/313e80644ea407f04f6602a9e23096540d9dc1878755f3952ea8d3d104be/psutil-6.1.0-cp36-abi3-macosx_11_0_arm64.whl", hash = "sha256:0895b8414afafc526712c498bd9de2b063deaac4021a3b3c34566283464aff8e", size = 248777 }, + { url = "https://files.pythonhosted.org/packages/65/8e/bcbe2025c587b5d703369b6a75b65d41d1367553da6e3f788aff91eaf5bd/psutil-6.1.0-cp36-abi3-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9dcbfce5d89f1d1f2546a2090f4fcf87c7f669d1d90aacb7d7582addece9fb38", size = 284259 }, + { url = 
"https://files.pythonhosted.org/packages/58/4d/8245e6f76a93c98aab285a43ea71ff1b171bcd90c9d238bf81f7021fb233/psutil-6.1.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:498c6979f9c6637ebc3a73b3f87f9eb1ec24e1ce53a7c5173b8508981614a90b", size = 287255 }, + { url = "https://files.pythonhosted.org/packages/27/c2/d034856ac47e3b3cdfa9720d0e113902e615f4190d5d1bdb8df4b2015fb2/psutil-6.1.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d905186d647b16755a800e7263d43df08b790d709d575105d419f8b6ef65423a", size = 288804 }, + { url = "https://files.pythonhosted.org/packages/ea/55/5389ed243c878725feffc0d6a3bc5ef6764312b6fc7c081faaa2cfa7ef37/psutil-6.1.0-cp37-abi3-win32.whl", hash = "sha256:1ad45a1f5d0b608253b11508f80940985d1d0c8f6111b5cb637533a0e6ddc13e", size = 250386 }, + { url = "https://files.pythonhosted.org/packages/11/91/87fa6f060e649b1e1a7b19a4f5869709fbf750b7c8c262ee776ec32f3028/psutil-6.1.0-cp37-abi3-win_amd64.whl", hash = "sha256:a8fb3752b491d246034fa4d279ff076501588ce8cbcdbb62c32fd7a377d996be", size = 254228 }, +] + +[[package]] +name = "ptyprocess" +version = "0.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/20/e5/16ff212c1e452235a90aeb09066144d0c5a6a8c0834397e03f5224495c4e/ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220", size = 70762 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/22/a6/858897256d0deac81a172289110f31629fc4cee19b6f01283303e18c8db3/ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35", size = 13993 }, +] + +[[package]] +name = "pycparser" +version = "2.22" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552 }, +] + +[[package]] +name = "pygithub" +version = "2.4.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "deprecated" }, + { name = "pyjwt", extra = ["crypto"] }, + { name = "pynacl" }, + { name = "requests" }, + { name = "typing-extensions" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f1/a0/1e8b8ca88df9857836f5bf8e3ee15dfb810d19814ef700b12f99ce11f691/pygithub-2.4.0.tar.gz", hash = "sha256:6601e22627e87bac192f1e2e39c6e6f69a43152cfb8f307cee575879320b3051", size = 3476673 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0a/f3/e185613c411757c0c18b904ea2db173f2872397eddf444a3fe8cdde47077/PyGithub-2.4.0-py3-none-any.whl", hash = "sha256:81935aa4bdc939fba98fee1cb47422c09157c56a27966476ff92775602b9ee24", size = 362599 }, +] + +[[package]] +name = "pygments" +version = "2.18.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/8e/62/8336eff65bcbc8e4cb5d05b55faf041285951b6e80f33e2bff2024788f31/pygments-2.18.0.tar.gz", hash = "sha256:786ff802f32e91311bff3889f6e9a86e81505fe99f2735bb6d60ae0c5004f199", size = 4891905 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f7/3f/01c8b82017c199075f8f788d0d906b9ffbbc5a47dc9918a945e13d5a2bda/pygments-2.18.0-py3-none-any.whl", hash = "sha256:b8e6aca0523f3ab76fee51799c488e38782ac06eafcf95e7ba832985c8e7b13a", size = 1205513 }, +] + +[[package]] +name = "pyjwt" +version = "2.9.0" +source = { 
registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fb/68/ce067f09fca4abeca8771fe667d89cc347d1e99da3e093112ac329c6020e/pyjwt-2.9.0.tar.gz", hash = "sha256:7e1e5b56cc735432a7369cbfa0efe50fa113ebecdc04ae6922deba8b84582d0c", size = 78825 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/79/84/0fdf9b18ba31d69877bd39c9cd6052b47f3761e9910c15de788e519f079f/PyJWT-2.9.0-py3-none-any.whl", hash = "sha256:3b02fb0f44517787776cf48f2ae25d8e14f300e6d7545a4315cee571a415e850", size = 22344 }, +] + +[package.optional-dependencies] +crypto = [ + { name = "cryptography" }, +] + +[[package]] +name = "pynacl" +version = "1.5.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a7/22/27582568be639dfe22ddb3902225f91f2f17ceff88ce80e4db396c8986da/PyNaCl-1.5.0.tar.gz", hash = "sha256:8ac7448f09ab85811607bdd21ec2464495ac8b7c66d146bf545b0f08fb9220ba", size = 3392854 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ce/75/0b8ede18506041c0bf23ac4d8e2971b4161cd6ce630b177d0a08eb0d8857/PyNaCl-1.5.0-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:401002a4aaa07c9414132aaed7f6836ff98f59277a234704ff66878c2ee4a0d1", size = 349920 }, + { url = "https://files.pythonhosted.org/packages/59/bb/fddf10acd09637327a97ef89d2a9d621328850a72f1fdc8c08bdf72e385f/PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:52cb72a79269189d4e0dc537556f4740f7f0a9ec41c1322598799b0bdad4ef92", size = 601722 }, + { url = "https://files.pythonhosted.org/packages/5d/70/87a065c37cca41a75f2ce113a5a2c2aa7533be648b184ade58971b5f7ccc/PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a36d4a9dda1f19ce6e03c9a784a2921a4b726b02e1c736600ca9c22029474394", size = 680087 }, + { url = 
"https://files.pythonhosted.org/packages/ee/87/f1bb6a595f14a327e8285b9eb54d41fef76c585a0edef0a45f6fc95de125/PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:0c84947a22519e013607c9be43706dd42513f9e6ae5d39d3613ca1e142fba44d", size = 856678 }, + { url = "https://files.pythonhosted.org/packages/66/28/ca86676b69bf9f90e710571b67450508484388bfce09acf8a46f0b8c785f/PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06b8f6fa7f5de8d5d2f7573fe8c863c051225a27b61e6860fd047b1775807858", size = 1133660 }, + { url = "https://files.pythonhosted.org/packages/3d/85/c262db650e86812585e2bc59e497a8f59948a005325a11bbbc9ecd3fe26b/PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:a422368fc821589c228f4c49438a368831cb5bbc0eab5ebe1d7fac9dded6567b", size = 663824 }, + { url = "https://files.pythonhosted.org/packages/fd/1a/cc308a884bd299b651f1633acb978e8596c71c33ca85e9dc9fa33a5399b9/PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:61f642bf2378713e2c2e1de73444a3778e5f0a38be6fee0fe532fe30060282ff", size = 1117912 }, + { url = "https://files.pythonhosted.org/packages/25/2d/b7df6ddb0c2a33afdb358f8af6ea3b8c4d1196ca45497dd37a56f0c122be/PyNaCl-1.5.0-cp36-abi3-win32.whl", hash = "sha256:e46dae94e34b085175f8abb3b0aaa7da40767865ac82c928eeb9e57e1ea8a543", size = 204624 }, + { url = "https://files.pythonhosted.org/packages/5e/22/d3db169895faaf3e2eda892f005f433a62db2decbcfbc2f61e6517adfa87/PyNaCl-1.5.0-cp36-abi3-win_amd64.whl", hash = "sha256:20f42270d27e1b6a29f54032090b972d97f0a1b0948cc52392041ef7831fee93", size = 212141 }, +] + +[[package]] +name = "pytest" +version = "8.3.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "tomli", marker = 
"python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8b/6c/62bbd536103af674e227c41a8f3dcd022d591f6eed5facb5a0f31ee33bbc/pytest-8.3.3.tar.gz", hash = "sha256:70b98107bd648308a7952b06e6ca9a50bc660be218d53c257cc1fc94fda10181", size = 1442487 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6b/77/7440a06a8ead44c7757a64362dd22df5760f9b12dc5f11b6188cd2fc27a0/pytest-8.3.3-py3-none-any.whl", hash = "sha256:a6853c7375b2663155079443d2e45de913a911a11d669df02a50814944db57b2", size = 342341 }, +] + +[[package]] +name = "pytest-xdist" +version = "3.6.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "execnet" }, + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/41/c4/3c310a19bc1f1e9ef50075582652673ef2bfc8cd62afef9585683821902f/pytest_xdist-3.6.1.tar.gz", hash = "sha256:ead156a4db231eec769737f57668ef58a2084a34b2e55c4a8fa20d861107300d", size = 84060 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6d/82/1d96bf03ee4c0fdc3c0cbe61470070e659ca78dc0086fb88b66c185e2449/pytest_xdist-3.6.1-py3-none-any.whl", hash = "sha256:9ed4adfb68a016610848639bb7e02c9352d5d9f03d04809919e2dafc3be4cca7", size = 46108 }, +] + +[[package]] +name = "pywin32-ctypes" +version = "0.2.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/85/9f/01a1a99704853cb63f253eea009390c88e7131c67e66a0a02099a8c917cb/pywin32-ctypes-0.2.3.tar.gz", hash = "sha256:d162dc04946d704503b2edc4d55f3dba5c1d539ead017afa00142c38b9885755", size = 29471 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/de/3d/8161f7711c017e01ac9f008dfddd9410dff3674334c233bde66e7ba65bbf/pywin32_ctypes-0.2.3-py3-none-any.whl", hash = "sha256:8a1513379d709975552d202d942d9837758905c8d01eb82b8bcc30918929e7b8", size = 30756 }, +] + +[[package]] +name = "pyyaml" +version = "6.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199 }, + { url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758 }, + { url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463 }, + { url = "https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280 }, + { url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239 }, + { url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802 }, + { url = 
"https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527 }, + { url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052 }, + { url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774 }, + { url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612 }, + { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040 }, + { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829 }, + { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167 }, + { url = 
"https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952 }, + { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301 }, + { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638 }, + { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850 }, + { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980 }, + { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873 }, + { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302 }, + { url = 
"https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154 }, + { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223 }, + { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542 }, + { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164 }, + { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611 }, + { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591 }, + { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338 }, + { url = 
"https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309 }, + { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679 }, + { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428 }, + { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361 }, + { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523 }, + { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660 }, + { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597 }, + { url = 
"https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527 }, + { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446 }, + { url = "https://files.pythonhosted.org/packages/74/d9/323a59d506f12f498c2097488d80d16f4cf965cee1791eab58b56b19f47a/PyYAML-6.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a", size = 183218 }, + { url = "https://files.pythonhosted.org/packages/74/cc/20c34d00f04d785f2028737e2e2a8254e1425102e730fee1d6396f832577/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5", size = 728067 }, + { url = "https://files.pythonhosted.org/packages/20/52/551c69ca1501d21c0de51ddafa8c23a0191ef296ff098e98358f69080577/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d", size = 757812 }, + { url = "https://files.pythonhosted.org/packages/fd/7f/2c3697bba5d4aa5cc2afe81826d73dfae5f049458e44732c7a0938baa673/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083", size = 746531 }, + { url = "https://files.pythonhosted.org/packages/8c/ab/6226d3df99900e580091bb44258fde77a8433511a86883bd4681ea19a858/PyYAML-6.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706", size = 800820 }, + { url = 
"https://files.pythonhosted.org/packages/a0/99/a9eb0f3e710c06c5d922026f6736e920d431812ace24aae38228d0d64b04/PyYAML-6.0.2-cp38-cp38-win32.whl", hash = "sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a", size = 145514 }, + { url = "https://files.pythonhosted.org/packages/75/8a/ee831ad5fafa4431099aa4e078d4c8efd43cd5e48fbc774641d233b683a9/PyYAML-6.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff", size = 162702 }, + { url = "https://files.pythonhosted.org/packages/65/d8/b7a1db13636d7fb7d4ff431593c510c8b8fca920ade06ca8ef20015493c5/PyYAML-6.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d", size = 184777 }, + { url = "https://files.pythonhosted.org/packages/0a/02/6ec546cd45143fdf9840b2c6be8d875116a64076218b61d68e12548e5839/PyYAML-6.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f", size = 172318 }, + { url = "https://files.pythonhosted.org/packages/0e/9a/8cc68be846c972bda34f6c2a93abb644fb2476f4dcc924d52175786932c9/PyYAML-6.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290", size = 720891 }, + { url = "https://files.pythonhosted.org/packages/e9/6c/6e1b7f40181bc4805e2e07f4abc10a88ce4648e7e95ff1abe4ae4014a9b2/PyYAML-6.0.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f753120cb8181e736c57ef7636e83f31b9c0d1722c516f7e86cf15b7aa57ff12", size = 722614 }, + { url = "https://files.pythonhosted.org/packages/3d/32/e7bd8535d22ea2874cef6a81021ba019474ace0d13a4819c2a4bce79bd6a/PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b1fdb9dc17f5a7677423d508ab4f243a726dea51fa5e70992e59a7411c89d19", size = 737360 }, + { url = 
"https://files.pythonhosted.org/packages/d7/12/7322c1e30b9be969670b672573d45479edef72c9a0deac3bb2868f5d7469/PyYAML-6.0.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0b69e4ce7a131fe56b7e4d770c67429700908fc0752af059838b1cfb41960e4e", size = 699006 }, + { url = "https://files.pythonhosted.org/packages/82/72/04fcad41ca56491995076630c3ec1e834be241664c0c09a64c9a2589b507/PyYAML-6.0.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a9f8c2e67970f13b16084e04f134610fd1d374bf477b17ec1599185cf611d725", size = 723577 }, + { url = "https://files.pythonhosted.org/packages/ed/5e/46168b1f2757f1fcd442bc3029cd8767d88a98c9c05770d8b420948743bb/PyYAML-6.0.2-cp39-cp39-win32.whl", hash = "sha256:6395c297d42274772abc367baaa79683958044e5d3835486c16da75d2a694631", size = 144593 }, + { url = "https://files.pythonhosted.org/packages/19/87/5124b1c1f2412bb95c59ec481eaf936cd32f0fe2a7b16b97b81c4c017a6a/PyYAML-6.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:39693e1f8320ae4f43943590b49779ffb98acb81f788220ea932a6b6c51004d8", size = 162312 }, +] + +[[package]] +name = "readme-renderer" +version = "43.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "docutils" }, + { name = "nh3" }, + { name = "pygments" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/fe/b5/536c775084d239df6345dccf9b043419c7e3308bc31be4c7882196abc62e/readme_renderer-43.0.tar.gz", hash = "sha256:1818dd28140813509eeed8d62687f7cd4f7bad90d4db586001c5dc09d4fde311", size = 31768 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/45/be/3ea20dc38b9db08387cf97997a85a7d51527ea2057d71118feb0aa8afa55/readme_renderer-43.0-py3-none-any.whl", hash = "sha256:19db308d86ecd60e5affa3b2a98f017af384678c63c88e5d4556a380e674f3f9", size = 13301 }, +] + +[[package]] +name = "referencing" +version = "0.35.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = "rpds-py" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/99/5b/73ca1f8e72fff6fa52119dbd185f73a907b1989428917b24cff660129b6d/referencing-0.35.1.tar.gz", hash = "sha256:25b42124a6c8b632a425174f24087783efb348a6f1e0008e63cd4466fedf703c", size = 62991 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/59/2056f61236782a2c86b33906c025d4f4a0b17be0161b63b70fd9e8775d36/referencing-0.35.1-py3-none-any.whl", hash = "sha256:eda6d3234d62814d1c64e305c1331c9a3a6132da475ab6382eaa997b21ee75de", size = 26684 }, +] + +[[package]] +name = "requests" +version = "2.32.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "charset-normalizer" }, + { name = "idna" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 }, +] + +[[package]] +name = "requests-toolbelt" +version = "1.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f3/61/d7545dafb7ac2230c70d38d31cbfe4cc64f7144dc41f6e4e4b78ecd9f5bb/requests-toolbelt-1.0.0.tar.gz", hash = "sha256:7681a0a3d047012b5bdc0ee37d7f8f07ebe76ab08caeccfc3921ce23c88d5bc6", size = 206888 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3f/51/d4db610ef29373b879047326cbf6fa98b6c1969d6f6dc423279de2b1be2c/requests_toolbelt-1.0.0-py2.py3-none-any.whl", hash = "sha256:cccfdd665f0a24fcf4726e690f65639d272bb0637b9b92dfd91a5568ccf6bd06", size = 54481 }, +] + +[[package]] +name = "rfc3986" +version = "2.0.0" +source 
= { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/85/40/1520d68bfa07ab5a6f065a186815fb6610c86fe957bc065754e47f7b0840/rfc3986-2.0.0.tar.gz", hash = "sha256:97aacf9dbd4bfd829baad6e6309fa6573aaf1be3f6fa735c8ab05e46cecb261c", size = 49026 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ff/9a/9afaade874b2fa6c752c36f1548f718b5b83af81ed9b76628329dab81c1b/rfc3986-2.0.0-py2.py3-none-any.whl", hash = "sha256:50b1502b60e289cb37883f3dfd34532b8873c7de9f49bb546641ce9cbd256ebd", size = 31326 }, +] + +[[package]] +name = "rich" +version = "13.9.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markdown-it-py" }, + { name = "pygments" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ab/3a/0316b28d0761c6734d6bc14e770d85506c986c85ffb239e688eeaab2c2bc/rich-13.9.4.tar.gz", hash = "sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098", size = 223149 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = "sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424 }, +] + +[[package]] +name = "rich-click" +version = "1.8.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "rich" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/3a/a9/a1f1af87e83832d794342fbc09c96cc7cd6798b8dfb8adfbe6ccbef8d70c/rich_click-1.8.3.tar.gz", hash = "sha256:6d75bdfa7aa9ed2c467789a0688bc6da23fbe3a143e19aa6ad3f8bac113d2ab3", size = 38209 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c6/ea/5a0c5a8e6532e971983d1b0fc99268eb66a10f489da35d9022ce01044191/rich_click-1.8.3-py3-none-any.whl", hash = 
"sha256:636d9c040d31c5eee242201b5bf4f2d358bfae4db14bb22ec1cafa717cfd02cd", size = 35032 }, +] + +[[package]] +name = "rpds-py" +version = "0.20.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/25/cb/8e919951f55d109d658f81c9b49d0cc3b48637c50792c5d2e77032b8c5da/rpds_py-0.20.1.tar.gz", hash = "sha256:e1791c4aabd117653530dccd24108fa03cc6baf21f58b950d0a73c3b3b29a350", size = 25931 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ae/0e/d7e7e9280988a7bc56fd326042baca27f4f55fad27dc8aa64e5e0e894e5d/rpds_py-0.20.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:a649dfd735fff086e8a9d0503a9f0c7d01b7912a333c7ae77e1515c08c146dad", size = 327335 }, + { url = "https://files.pythonhosted.org/packages/4c/72/027185f213d53ae66765c575229829b202fbacf3d55fe2bd9ff4e29bb157/rpds_py-0.20.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f16bc1334853e91ddaaa1217045dd7be166170beec337576818461268a3de67f", size = 318250 }, + { url = "https://files.pythonhosted.org/packages/2b/e7/b4eb3e6ff541c83d3b46f45f855547e412ab60c45bef64520fafb00b9b42/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:14511a539afee6f9ab492b543060c7491c99924314977a55c98bfa2ee29ce78c", size = 361206 }, + { url = "https://files.pythonhosted.org/packages/e7/80/cb9a4b4cad31bcaa37f38dae7a8be861f767eb2ca4f07a146b5ffcfbee09/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3ccb8ac2d3c71cda472b75af42818981bdacf48d2e21c36331b50b4f16930163", size = 369921 }, + { url = "https://files.pythonhosted.org/packages/95/1b/463b11e7039e18f9e778568dbf7338c29bbc1f8996381115201c668eb8c8/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c142b88039b92e7e0cb2552e8967077e3179b22359e945574f5e2764c3953dcf", size = 403673 }, + { url = 
"https://files.pythonhosted.org/packages/86/98/1ef4028e9d5b76470bf7f8f2459be07ac5c9621270a2a5e093f8d8a8cc2c/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f19169781dddae7478a32301b499b2858bc52fc45a112955e798ee307e294977", size = 430267 }, + { url = "https://files.pythonhosted.org/packages/25/8e/41d7e3e6d3a4a6c94375020477705a3fbb6515717901ab8f94821cf0a0d9/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:13c56de6518e14b9bf6edde23c4c39dac5b48dcf04160ea7bce8fca8397cdf86", size = 360569 }, + { url = "https://files.pythonhosted.org/packages/4f/6a/8839340464d4e1bbfaf0482e9d9165a2309c2c17427e4dcb72ce3e5cc5d6/rpds_py-0.20.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:925d176a549f4832c6f69fa6026071294ab5910e82a0fe6c6228fce17b0706bd", size = 382584 }, + { url = "https://files.pythonhosted.org/packages/64/96/7a7f938d3796a6a3ec08ed0e8a5ecd436fbd516a3684ab1fa22d46d6f6cc/rpds_py-0.20.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:78f0b6877bfce7a3d1ff150391354a410c55d3cdce386f862926a4958ad5ab7e", size = 546560 }, + { url = "https://files.pythonhosted.org/packages/15/c7/19fb4f1247a3c90a99eca62909bf76ee988f9b663e47878a673d9854ec5c/rpds_py-0.20.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3dd645e2b0dcb0fd05bf58e2e54c13875847687d0b71941ad2e757e5d89d4356", size = 549359 }, + { url = "https://files.pythonhosted.org/packages/d2/4c/445eb597a39a883368ea2f341dd6e48a9d9681b12ebf32f38a827b30529b/rpds_py-0.20.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:4f676e21db2f8c72ff0936f895271e7a700aa1f8d31b40e4e43442ba94973899", size = 527567 }, + { url = "https://files.pythonhosted.org/packages/4f/71/4c44643bffbcb37311fc7fe221bcf139c8d660bc78f746dd3a05741372c8/rpds_py-0.20.1-cp310-none-win32.whl", hash = "sha256:648386ddd1e19b4a6abab69139b002bc49ebf065b596119f8f37c38e9ecee8ff", size = 200412 }, + { url = 
"https://files.pythonhosted.org/packages/f4/33/9d0529d74099e090ec9ab15eb0a049c56cca599eaaca71bfedbdbca656a9/rpds_py-0.20.1-cp310-none-win_amd64.whl", hash = "sha256:d9ecb51120de61e4604650666d1f2b68444d46ae18fd492245a08f53ad2b7711", size = 218563 }, + { url = "https://files.pythonhosted.org/packages/a0/2e/a6ded84019a05b8f23e0fe6a632f62ae438a8c5e5932d3dfc90c73418414/rpds_py-0.20.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:762703bdd2b30983c1d9e62b4c88664df4a8a4d5ec0e9253b0231171f18f6d75", size = 327194 }, + { url = "https://files.pythonhosted.org/packages/68/11/d3f84c69de2b2086be3d6bd5e9d172825c096b13842ab7e5f8f39f06035b/rpds_py-0.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0b581f47257a9fce535c4567782a8976002d6b8afa2c39ff616edf87cbeff712", size = 318126 }, + { url = "https://files.pythonhosted.org/packages/18/c0/13f1bce9c901511e5e4c0b77a99dbb946bb9a177ca88c6b480e9cb53e304/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:842c19a6ce894493563c3bd00d81d5100e8e57d70209e84d5491940fdb8b9e3a", size = 361119 }, + { url = "https://files.pythonhosted.org/packages/06/31/3bd721575671f22a37476c2d7b9e34bfa5185bdcee09f7fedde3b29f3adb/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42cbde7789f5c0bcd6816cb29808e36c01b960fb5d29f11e052215aa85497c93", size = 369532 }, + { url = "https://files.pythonhosted.org/packages/20/22/3eeb0385f33251b4fd0f728e6a3801dc8acc05e714eb7867cefe635bf4ab/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6c8e9340ce5a52f95fa7d3b552b35c7e8f3874d74a03a8a69279fd5fca5dc751", size = 403703 }, + { url = "https://files.pythonhosted.org/packages/10/e1/8dde6174e7ac5b9acd3269afca2e17719bc7e5088c68f44874d2ad9e4560/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ba6f89cac95c0900d932c9efb7f0fb6ca47f6687feec41abcb1bd5e2bd45535", size = 429868 }, + { url = 
"https://files.pythonhosted.org/packages/19/51/a3cc1a5238acfc2582033e8934d034301f9d4931b9bf7c7ccfabc4ca0880/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a916087371afd9648e1962e67403c53f9c49ca47b9680adbeef79da3a7811b0", size = 360539 }, + { url = "https://files.pythonhosted.org/packages/cd/8c/3c87471a44bd4114e2b0aec90f298f6caaac4e8db6af904d5dd2279f5c61/rpds_py-0.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:200a23239781f46149e6a415f1e870c5ef1e712939fe8fa63035cd053ac2638e", size = 382467 }, + { url = "https://files.pythonhosted.org/packages/d0/9b/95073fe3e0f130e6d561e106818b6568ef1f2df3352e7f162ab912da837c/rpds_py-0.20.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:58b1d5dd591973d426cbb2da5e27ba0339209832b2f3315928c9790e13f159e8", size = 546669 }, + { url = "https://files.pythonhosted.org/packages/de/4c/7ab3669e02bb06fedebcfd64d361b7168ba39dfdf385e4109440f2e7927b/rpds_py-0.20.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:6b73c67850ca7cae0f6c56f71e356d7e9fa25958d3e18a64927c2d930859b8e4", size = 549304 }, + { url = "https://files.pythonhosted.org/packages/f1/e8/ad5da336cd42adbdafe0ecd40dcecdae01fd3d703c621c7637615a008d3a/rpds_py-0.20.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d8761c3c891cc51e90bc9926d6d2f59b27beaf86c74622c8979380a29cc23ac3", size = 527637 }, + { url = "https://files.pythonhosted.org/packages/02/f1/1b47b9e5b941c2659c9b7e4ef41b6f07385a6500c638fa10c066e4616ecb/rpds_py-0.20.1-cp311-none-win32.whl", hash = "sha256:cd945871335a639275eee904caef90041568ce3b42f402c6959b460d25ae8732", size = 200488 }, + { url = "https://files.pythonhosted.org/packages/85/f6/c751c1adfa31610055acfa1cc667cf2c2d7011a73070679c448cf5856905/rpds_py-0.20.1-cp311-none-win_amd64.whl", hash = "sha256:7e21b7031e17c6b0e445f42ccc77f79a97e2687023c5746bfb7a9e45e0921b84", size = 218475 }, + { url = 
"https://files.pythonhosted.org/packages/e7/10/4e8dcc08b58a548098dbcee67a4888751a25be7a6dde0a83d4300df48bfa/rpds_py-0.20.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:36785be22066966a27348444b40389f8444671630063edfb1a2eb04318721e17", size = 329749 }, + { url = "https://files.pythonhosted.org/packages/d2/e4/61144f3790e12fd89e6153d77f7915ad26779735fef8ee9c099cba6dfb4a/rpds_py-0.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:142c0a5124d9bd0e2976089484af5c74f47bd3298f2ed651ef54ea728d2ea42c", size = 321032 }, + { url = "https://files.pythonhosted.org/packages/fa/e0/99205aabbf3be29ef6c58ef9b08feed51ba6532fdd47461245cb58dd9897/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dbddc10776ca7ebf2a299c41a4dde8ea0d8e3547bfd731cb87af2e8f5bf8962d", size = 363931 }, + { url = "https://files.pythonhosted.org/packages/ac/bd/bce2dddb518b13a7e77eed4be234c9af0c9c6d403d01c5e6ae8eb447ab62/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:15a842bb369e00295392e7ce192de9dcbf136954614124a667f9f9f17d6a216f", size = 373343 }, + { url = "https://files.pythonhosted.org/packages/43/15/112b7c553066cb91264691ba7fb119579c440a0ae889da222fa6fc0d411a/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:be5ef2f1fc586a7372bfc355986226484e06d1dc4f9402539872c8bb99e34b01", size = 406304 }, + { url = "https://files.pythonhosted.org/packages/af/8d/2da52aef8ae5494a382b0c0025ba5b68f2952db0f2a4c7534580e8ca83cc/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dbcf360c9e3399b056a238523146ea77eeb2a596ce263b8814c900263e46031a", size = 423022 }, + { url = "https://files.pythonhosted.org/packages/c8/1b/f23015cb293927c93bdb4b94a48bfe77ad9d57359c75db51f0ff0cf482ff/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ecd27a66740ffd621d20b9a2f2b5ee4129a56e27bfb9458a3bcc2e45794c96cb", size = 364937 }, + { 
url = "https://files.pythonhosted.org/packages/7b/8b/6da8636b2ea2e2f709e56656e663b6a71ecd9a9f9d9dc21488aade122026/rpds_py-0.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0b937b2a1988f184a3e9e577adaa8aede21ec0b38320d6009e02bd026db04fa", size = 386301 }, + { url = "https://files.pythonhosted.org/packages/20/af/2ae192797bffd0d6d558145b5a36e7245346ff3e44f6ddcb82f0eb8512d4/rpds_py-0.20.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6889469bfdc1eddf489729b471303739bf04555bb151fe8875931f8564309afc", size = 549452 }, + { url = "https://files.pythonhosted.org/packages/07/dd/9f6520712a5108cd7d407c9db44a3d59011b385c58e320d58ebf67757a9e/rpds_py-0.20.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:19b73643c802f4eaf13d97f7855d0fb527fbc92ab7013c4ad0e13a6ae0ed23bd", size = 554370 }, + { url = "https://files.pythonhosted.org/packages/5e/0e/b1bdc7ea0db0946d640ab8965146099093391bb5d265832994c47461e3c5/rpds_py-0.20.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3c6afcf2338e7f374e8edc765c79fbcb4061d02b15dd5f8f314a4af2bdc7feb5", size = 530940 }, + { url = "https://files.pythonhosted.org/packages/ae/d3/ffe907084299484fab60a7955f7c0e8a295c04249090218c59437010f9f4/rpds_py-0.20.1-cp312-none-win32.whl", hash = "sha256:dc73505153798c6f74854aba69cc75953888cf9866465196889c7cdd351e720c", size = 203164 }, + { url = "https://files.pythonhosted.org/packages/1f/ba/9cbb57423c4bfbd81c473913bebaed151ad4158ee2590a4e4b3e70238b48/rpds_py-0.20.1-cp312-none-win_amd64.whl", hash = "sha256:8bbe951244a838a51289ee53a6bae3a07f26d4e179b96fc7ddd3301caf0518eb", size = 220750 }, + { url = "https://files.pythonhosted.org/packages/b5/01/fee2e1d1274c92fff04aa47d805a28d62c2aa971d1f49f5baea1c6e670d9/rpds_py-0.20.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:6ca91093a4a8da4afae7fe6a222c3b53ee4eef433ebfee4d54978a103435159e", size = 329359 }, + { url = 
"https://files.pythonhosted.org/packages/b0/cf/4aeffb02b7090029d7aeecbffb9a10e1c80f6f56d7e9a30e15481dc4099c/rpds_py-0.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b9c2fe36d1f758b28121bef29ed1dee9b7a2453e997528e7d1ac99b94892527c", size = 320543 }, + { url = "https://files.pythonhosted.org/packages/17/69/85cf3429e9ccda684ba63ff36b5866d5f9451e921cc99819341e19880334/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f009c69bc8c53db5dfab72ac760895dc1f2bc1b62ab7408b253c8d1ec52459fc", size = 363107 }, + { url = "https://files.pythonhosted.org/packages/ef/de/7df88dea9c3eeb832196d23b41f0f6fc5f9a2ee9b2080bbb1db8731ead9c/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6740a3e8d43a32629bb9b009017ea5b9e713b7210ba48ac8d4cb6d99d86c8ee8", size = 372027 }, + { url = "https://files.pythonhosted.org/packages/d1/b8/88675399d2038580743c570a809c43a900e7090edc6553f8ffb66b23c965/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:32b922e13d4c0080d03e7b62991ad7f5007d9cd74e239c4b16bc85ae8b70252d", size = 405031 }, + { url = "https://files.pythonhosted.org/packages/e1/aa/cca639f6d17caf00bab51bdc70fcc0bdda3063e5662665c4fdf60443c474/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fe00a9057d100e69b4ae4a094203a708d65b0f345ed546fdef86498bf5390982", size = 422271 }, + { url = "https://files.pythonhosted.org/packages/c4/07/bf8a949d2ec4626c285579c9d6b356c692325f1a4126e947736b416e1fc4/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:49fe9b04b6fa685bd39237d45fad89ba19e9163a1ccaa16611a812e682913496", size = 363625 }, + { url = "https://files.pythonhosted.org/packages/11/f0/06675c6a58d6ce34547879138810eb9aab0c10e5607ea6c2e4dc56b703c8/rpds_py-0.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:aa7ac11e294304e615b43f8c441fee5d40094275ed7311f3420d805fde9b07b4", size = 
385906 }, + { url = "https://files.pythonhosted.org/packages/bf/ac/2d1f50374eb8e41030fad4e87f81751e1c39e3b5d4bee8c5618830d8a6ac/rpds_py-0.20.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6aa97af1558a9bef4025f8f5d8c60d712e0a3b13a2fe875511defc6ee77a1ab7", size = 549021 }, + { url = "https://files.pythonhosted.org/packages/f7/d4/a7d70a7cc71df772eeadf4bce05e32e780a9fe44a511a5b091c7a85cb767/rpds_py-0.20.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:483b29f6f7ffa6af845107d4efe2e3fa8fb2693de8657bc1849f674296ff6a5a", size = 553800 }, + { url = "https://files.pythonhosted.org/packages/87/81/dc30bc449ccba63ad23a0f6633486d4e0e6955f45f3715a130dacabd6ad0/rpds_py-0.20.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:37fe0f12aebb6a0e3e17bb4cd356b1286d2d18d2e93b2d39fe647138458b4bcb", size = 531076 }, + { url = "https://files.pythonhosted.org/packages/50/80/fb62ab48f3b5cfe704ead6ad372da1922ddaa76397055e02eb507054c979/rpds_py-0.20.1-cp313-none-win32.whl", hash = "sha256:a624cc00ef2158e04188df5e3016385b9353638139a06fb77057b3498f794782", size = 202804 }, + { url = "https://files.pythonhosted.org/packages/d9/30/a3391e76d0b3313f33bdedd394a519decae3a953d2943e3dabf80ae32447/rpds_py-0.20.1-cp313-none-win_amd64.whl", hash = "sha256:b71b8666eeea69d6363248822078c075bac6ed135faa9216aa85f295ff009b1e", size = 220502 }, + { url = "https://files.pythonhosted.org/packages/53/ef/b1883734ea0cd9996de793cdc38c32a28143b04911d1e570090acd8a9162/rpds_py-0.20.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:5b48e790e0355865197ad0aca8cde3d8ede347831e1959e158369eb3493d2191", size = 327757 }, + { url = "https://files.pythonhosted.org/packages/54/63/47d34dc4ddb3da73e78e10c9009dcf8edc42d355a221351c05c822c2a50b/rpds_py-0.20.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:3e310838a5801795207c66c73ea903deda321e6146d6f282e85fa7e3e4854804", size = 318785 }, + { url = 
"https://files.pythonhosted.org/packages/f7/e1/d6323be4afbe3013f28725553b7bfa80b3f013f91678af258f579f8ea8f9/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2249280b870e6a42c0d972339e9cc22ee98730a99cd7f2f727549af80dd5a963", size = 361511 }, + { url = "https://files.pythonhosted.org/packages/ab/d3/c40e4d9ecd571f0f50fe69bc53fe608d7b2c49b30738b480044990260838/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e79059d67bea28b53d255c1437b25391653263f0e69cd7dec170d778fdbca95e", size = 370201 }, + { url = "https://files.pythonhosted.org/packages/f1/b6/96a4a9977a8a06c2c49d90aa571346aff1642abf15066a39a0b4817bf049/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2b431c777c9653e569986ecf69ff4a5dba281cded16043d348bf9ba505486f36", size = 403866 }, + { url = "https://files.pythonhosted.org/packages/cd/8f/702b52287949314b498a311f92b5ee0ba30c702a27e0e6b560e2da43b8d5/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:da584ff96ec95e97925174eb8237e32f626e7a1a97888cdd27ee2f1f24dd0ad8", size = 430163 }, + { url = "https://files.pythonhosted.org/packages/c4/ce/af016c81fda833bf125b20d1677d816f230cad2ab189f46bcbfea3c7a375/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a0629ec053fc013808a85178524e3cb63a61dbc35b22499870194a63578fb9", size = 360776 }, + { url = "https://files.pythonhosted.org/packages/08/a7/988e179c9bef55821abe41762228d65077e0570ca75c9efbcd1bc6e263b4/rpds_py-0.20.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fbf15aff64a163db29a91ed0868af181d6f68ec1a3a7d5afcfe4501252840bad", size = 383008 }, + { url = "https://files.pythonhosted.org/packages/96/b0/e4077f7f1b9622112ae83254aedfb691490278793299bc06dcf54ec8c8e4/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:07924c1b938798797d60c6308fa8ad3b3f0201802f82e4a2c41bb3fafb44cc28", size = 546371 }, 
+ { url = "https://files.pythonhosted.org/packages/e4/5e/1d4dd08ec0352cfe516ea93ea1993c2f656f893c87dafcd9312bd07f65f7/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:4a5a844f68776a7715ecb30843b453f07ac89bad393431efbf7accca3ef599c1", size = 549809 }, + { url = "https://files.pythonhosted.org/packages/57/ac/a716b4729ff23ec034b7d2ff76a86e6f0753c4098401bdfdf55b2efe90e6/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:518d2ca43c358929bf08f9079b617f1c2ca6e8848f83c1225c88caeac46e6cbc", size = 528492 }, + { url = "https://files.pythonhosted.org/packages/e0/ed/a0b58a9ecef79918169eacdabd14eb4c5c86ce71184ed56b80c6eb425828/rpds_py-0.20.1-cp38-none-win32.whl", hash = "sha256:3aea7eed3e55119635a74bbeb80b35e776bafccb70d97e8ff838816c124539f1", size = 200512 }, + { url = "https://files.pythonhosted.org/packages/5f/c3/222e25124283afc76c473fcd2c547e82ec57683fa31cb4d6c6eb44e5d57a/rpds_py-0.20.1-cp38-none-win_amd64.whl", hash = "sha256:7dca7081e9a0c3b6490a145593f6fe3173a94197f2cb9891183ef75e9d64c425", size = 218627 }, + { url = "https://files.pythonhosted.org/packages/d6/87/e7e0fcbfdc0d0e261534bcc885f6ae6253095b972e32f8b8b1278c78a2a9/rpds_py-0.20.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b41b6321805c472f66990c2849e152aff7bc359eb92f781e3f606609eac877ad", size = 327867 }, + { url = "https://files.pythonhosted.org/packages/93/a0/17836b7961fc82586e9b818abdee2a27e2e605a602bb8c0d43f02092f8c2/rpds_py-0.20.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a90c373ea2975519b58dece25853dbcb9779b05cc46b4819cb1917e3b3215b6", size = 318893 }, + { url = "https://files.pythonhosted.org/packages/dc/03/deb81d8ea3a8b974e7b03cfe8c8c26616ef8f4980dd430d8dd0a2f1b4d8e/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16d4477bcb9fbbd7b5b0e4a5d9b493e42026c0bf1f06f723a9353f5153e75d30", size = 361664 }, + { url = 
"https://files.pythonhosted.org/packages/16/49/d9938603731745c7b6babff97ca61ff3eb4619e7128b5ab0111ad4e91d6d/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:84b8382a90539910b53a6307f7c35697bc7e6ffb25d9c1d4e998a13e842a5e83", size = 369796 }, + { url = "https://files.pythonhosted.org/packages/87/d2/480b36c69cdc373853401b6aab6a281cf60f6d72b1545d82c0d23d9dd77c/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4888e117dd41b9d34194d9e31631af70d3d526efc363085e3089ab1a62c32ed1", size = 403860 }, + { url = "https://files.pythonhosted.org/packages/31/7c/f6d909cb57761293308dbef14f1663d84376f2e231892a10aafc57b42037/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5265505b3d61a0f56618c9b941dc54dc334dc6e660f1592d112cd103d914a6db", size = 430793 }, + { url = "https://files.pythonhosted.org/packages/d4/62/c9bd294c4b5f84d9cc2c387b548ae53096ad7e71ac5b02b6310e9dc85aa4/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e75ba609dba23f2c95b776efb9dd3f0b78a76a151e96f96cc5b6b1b0004de66f", size = 360927 }, + { url = "https://files.pythonhosted.org/packages/c1/a7/15d927d83a44da8307a432b1cac06284b6657706d099a98cc99fec34ad51/rpds_py-0.20.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1791ff70bc975b098fe6ecf04356a10e9e2bd7dc21fa7351c1742fdeb9b4966f", size = 382660 }, + { url = "https://files.pythonhosted.org/packages/4c/28/0630719c18456238bb07d59c4302fed50a13aa8035ec23dbfa80d116f9bc/rpds_py-0.20.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:d126b52e4a473d40232ec2052a8b232270ed1f8c9571aaf33f73a14cc298c24f", size = 546888 }, + { url = "https://files.pythonhosted.org/packages/b9/75/3c9bda11b9c15d680b315f898af23825159314d4b56568f24b53ace8afcd/rpds_py-0.20.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:c14937af98c4cc362a1d4374806204dd51b1e12dded1ae30645c298e5a5c4cb1", size = 550088 }, + { url = 
"https://files.pythonhosted.org/packages/70/f1/8fe7d04c194218171220a412057429defa9e2da785de0777c4d39309337e/rpds_py-0.20.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:3d089d0b88996df627693639d123c8158cff41c0651f646cd8fd292c7da90eaf", size = 528270 }, + { url = "https://files.pythonhosted.org/packages/d6/62/41b0020f4b00af042b008e679dbe25a2f5bce655139a81f8b812f9068e52/rpds_py-0.20.1-cp39-none-win32.whl", hash = "sha256:653647b8838cf83b2e7e6a0364f49af96deec64d2a6578324db58380cff82aca", size = 200658 }, + { url = "https://files.pythonhosted.org/packages/05/01/e64bb8889f2dcc951e53de33d8b8070456397ae4e10edc35e6cb9908f5c8/rpds_py-0.20.1-cp39-none-win_amd64.whl", hash = "sha256:fa41a64ac5b08b292906e248549ab48b69c5428f3987b09689ab2441f267d04d", size = 218883 }, + { url = "https://files.pythonhosted.org/packages/b6/fa/7959429e69569d0f6e7d27f80451402da0409349dd2b07f6bcbdd5fad2d3/rpds_py-0.20.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:7a07ced2b22f0cf0b55a6a510078174c31b6d8544f3bc00c2bcee52b3d613f74", size = 328209 }, + { url = "https://files.pythonhosted.org/packages/25/97/5dfdb091c30267ff404d2fd9e70c7a6d6ffc65ca77fffe9456e13b719066/rpds_py-0.20.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:68cb0a499f2c4a088fd2f521453e22ed3527154136a855c62e148b7883b99f9a", size = 319499 }, + { url = "https://files.pythonhosted.org/packages/7c/98/cf2608722400f5f9bb4c82aa5ac09026f3ac2ebea9d4059d3533589ed0b6/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fa3060d885657abc549b2a0f8e1b79699290e5d83845141717c6c90c2df38311", size = 361795 }, + { url = "https://files.pythonhosted.org/packages/89/de/0e13dd43c785c60e63933e96fbddda0b019df6862f4d3019bb49c3861131/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:95f3b65d2392e1c5cec27cff08fdc0080270d5a1a4b2ea1d51d5f4a2620ff08d", size = 370604 }, + { url = 
"https://files.pythonhosted.org/packages/8a/fc/fe3c83c77f82b8059eeec4e998064913d66212b69b3653df48f58ad33d3d/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2cc3712a4b0b76a1d45a9302dd2f53ff339614b1c29603a911318f2357b04dd2", size = 404177 }, + { url = "https://files.pythonhosted.org/packages/94/30/5189518bfb80a41f664daf32b46645c7fbdcc89028a0f1bfa82e806e0fbb/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5d4eea0761e37485c9b81400437adb11c40e13ef513375bbd6973e34100aeb06", size = 430108 }, + { url = "https://files.pythonhosted.org/packages/67/0e/6f069feaff5c298375cd8c55e00ecd9bd79c792ce0893d39448dc0097857/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f5179583d7a6cdb981151dd349786cbc318bab54963a192692d945dd3f6435d", size = 361184 }, + { url = "https://files.pythonhosted.org/packages/27/9f/ce3e2ae36f392c3ef1988c06e9e0b4c74f64267dad7c223003c34da11adb/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2fbb0ffc754490aff6dabbf28064be47f0f9ca0b9755976f945214965b3ace7e", size = 384140 }, + { url = "https://files.pythonhosted.org/packages/5f/d5/89d44504d0bc7a1135062cb520a17903ff002f458371b8d9160af3b71e52/rpds_py-0.20.1-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:a94e52537a0e0a85429eda9e49f272ada715506d3b2431f64b8a3e34eb5f3e75", size = 546589 }, + { url = "https://files.pythonhosted.org/packages/8f/8f/e1c2db4fcca3947d9a28ec9553700b4dc8038f0eff575f579e75885b0661/rpds_py-0.20.1-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:92b68b79c0da2a980b1c4197e56ac3dd0c8a149b4603747c4378914a68706979", size = 550059 }, + { url = "https://files.pythonhosted.org/packages/67/29/00a9e986df36721b5def82fff60995c1ee8827a7d909a6ec8929fb4cc668/rpds_py-0.20.1-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:93da1d3db08a827eda74356f9f58884adb254e59b6664f64cc04cdff2cc19b0d", 
size = 529131 }, + { url = "https://files.pythonhosted.org/packages/a3/32/95364440560ec476b19c6a2704259e710c223bf767632ebaa72cc2a1760f/rpds_py-0.20.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:754bbed1a4ca48479e9d4182a561d001bbf81543876cdded6f695ec3d465846b", size = 219677 }, + { url = "https://files.pythonhosted.org/packages/ed/bf/ad8492e972c90a3d48a38e2b5095c51a8399d5b57e83f2d5d1649490f72b/rpds_py-0.20.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:ca449520e7484534a2a44faf629362cae62b660601432d04c482283c47eaebab", size = 328046 }, + { url = "https://files.pythonhosted.org/packages/75/fd/84f42386165d6d555acb76c6d39c90b10c9dcf25116daf4f48a0a9d6867a/rpds_py-0.20.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:9c4cb04a16b0f199a8c9bf807269b2f63b7b5b11425e4a6bd44bd6961d28282c", size = 319306 }, + { url = "https://files.pythonhosted.org/packages/6c/8a/abcd5119a0573f9588ad71a3fde3c07ddd1d1393cfee15a6ba7495c256f1/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb63804105143c7e24cee7db89e37cb3f3941f8e80c4379a0b355c52a52b6780", size = 362558 }, + { url = "https://files.pythonhosted.org/packages/9d/65/1c2bb10afd4bd32800227a658ae9097bc1d08a4e5048a57a9bd2efdf6306/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:55cd1fa4ecfa6d9f14fbd97ac24803e6f73e897c738f771a9fe038f2f11ff07c", size = 370811 }, + { url = "https://files.pythonhosted.org/packages/6c/ee/f4bab2b9e51ced30351cfd210647885391463ae682028c79760e7db28e4e/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0f8f741b6292c86059ed175d80eefa80997125b7c478fb8769fd9ac8943a16c0", size = 404660 }, + { url = "https://files.pythonhosted.org/packages/48/0f/9d04d0939682f0c97be827fc51ff986555ffb573e6781bd5132441f0ce25/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:0fc212779bf8411667234b3cdd34d53de6c2b8b8b958e1e12cb473a5f367c338", size = 430490 }, + { url = "https://files.pythonhosted.org/packages/0d/f2/e9b90fd8416d59941b6a12f2c2e1d898b63fd092f5a7a6f98236cb865764/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ad56edabcdb428c2e33bbf24f255fe2b43253b7d13a2cdbf05de955217313e6", size = 361448 }, + { url = "https://files.pythonhosted.org/packages/0b/83/1cc776dce7bedb17d6f4ea62eafccee8a57a4678f4fac414ab69fb9b6b0b/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0a3a1e9ee9728b2c1734f65d6a1d376c6f2f6fdcc13bb007a08cc4b1ff576dc5", size = 383681 }, + { url = "https://files.pythonhosted.org/packages/17/5c/e0cdd6b0a8373fdef3667af2778dd9ff3abf1bbb9c7bd92c603c91440eb0/rpds_py-0.20.1-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:e13de156137b7095442b288e72f33503a469aa1980ed856b43c353ac86390519", size = 546203 }, + { url = "https://files.pythonhosted.org/packages/1b/a8/81fc9cbc01e7ef6d10652aedc1de4e8473934773e2808ba49094e03575df/rpds_py-0.20.1-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:07f59760ef99f31422c49038964b31c4dfcfeb5d2384ebfc71058a7c9adae2d2", size = 549855 }, + { url = "https://files.pythonhosted.org/packages/b3/87/99648693d3c1bbce088119bc61ecaab62e5f9c713894edc604ffeca5ae88/rpds_py-0.20.1-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:59240685e7da61fb78f65a9f07f8108e36a83317c53f7b276b4175dc44151684", size = 528625 }, + { url = "https://files.pythonhosted.org/packages/05/c3/10c68a08849f1fa45d205e54141fa75d316013e3d701ef01770ee1220bb8/rpds_py-0.20.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:83cba698cfb3c2c5a7c3c6bac12fe6c6a51aae69513726be6411076185a8b24a", size = 219991 }, +] + +[[package]] +name = "secretstorage" +version = "3.3.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cryptography" }, + { name = "jeepney" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/53/a4/f48c9d79cb507ed1373477dbceaba7401fd8a23af63b837fa61f1dcd3691/SecretStorage-3.3.3.tar.gz", hash = "sha256:2403533ef369eca6d2ba81718576c5e0f564d5cca1b58f73a8b23e7d4eeebd77", size = 19739 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/54/24/b4293291fa1dd830f353d2cb163295742fa87f179fcc8a20a306a81978b7/SecretStorage-3.3.3-py3-none-any.whl", hash = "sha256:f356e6628222568e3af06f2eba8df495efa13b3b63081dafd4f7d9a7b7bc9f99", size = 15221 }, +] + +[[package]] +name = "semver" +version = "3.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/41/6c/a536cc008f38fd83b3c1b98ce19ead13b746b5588c9a0cb9dd9f6ea434bc/semver-3.0.2.tar.gz", hash = "sha256:6253adb39c70f6e51afed2fa7152bcd414c411286088fb4b9effb133885ab4cc", size = 214988 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9a/77/0cc7a8a3bc7e53d07e8f47f147b92b0960e902b8254859f4aee5c4d7866b/semver-3.0.2-py3-none-any.whl", hash = "sha256:b1ea4686fe70b981f85359eda33199d60c53964284e0cfb4977d243e37cf4bf4", size = 17099 }, +] + +[[package]] +name = "shellingham" +version = "1.5.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755 }, +] + +[[package]] +name = "smmap" +version = "5.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/88/04/b5bf6d21dc4041000ccba7eb17dd3055feb237e7ffc2c20d3fae3af62baa/smmap-5.0.1.tar.gz", hash = 
"sha256:dceeb6c0028fdb6734471eb07c0cd2aae706ccaecab45965ee83f11c8d3b1f62", size = 22291 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a7/a5/10f97f73544edcdef54409f1d839f6049a0d79df68adbc1ceb24d1aaca42/smmap-5.0.1-py3-none-any.whl", hash = "sha256:e6d8668fa5f93e706934a62d7b4db19c8d9eb8cf2adbb75ef1b675aa332b69da", size = 24282 }, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 }, +] + +[[package]] +name = "tabulate" +version = "0.9.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ec/fe/802052aecb21e3797b8f7902564ab6ea0d60ff8ca23952079064155d1ae1/tabulate-0.9.0.tar.gz", hash = "sha256:0095b12bf5966de529c0feb1fa08671671b3368eec77d7ef7ab114be2c068b3c", size = 81090 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl", hash = "sha256:024ca478df22e9340661486f85298cff5f6dcdba14f3813e8830015b9ed1948f", size = 35252 }, +] + +[[package]] +name = "tomli" +version = "2.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/35/b9/de2a5c0144d7d75a57ff355c0c24054f965b2dc3036456ae03a51ea6264b/tomli-2.0.2.tar.gz", hash = "sha256:d46d457a85337051c36524bc5349dd91b1877838e2979ac5ced3e710ed8a60ed", size = 16096 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/cf/db/ce8eda256fa131af12e0a76d481711abe4681b6923c27efb9a255c9e4594/tomli-2.0.2-py3-none-any.whl", hash = "sha256:2ebe24485c53d303f690b0ec092806a085f07af5a5aa1464f3931eec36caaa38", size = 13237 }, +] + +[[package]] +name = "tomli-w" +version = "1.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/49/05/6bf21838623186b91aedbda06248ad18f03487dc56fbc20e4db384abde6c/tomli_w-1.0.0.tar.gz", hash = "sha256:f463434305e0336248cac9c2dc8076b707d8a12d019dd349f5c1e382dd1ae1b9", size = 6531 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bb/01/1da9c66ecb20f31ed5aa5316a957e0b1a5e786a0d9689616ece4ceaf1321/tomli_w-1.0.0-py3-none-any.whl", hash = "sha256:9f2a07e8be30a0729e533ec968016807069991ae2fd921a78d42f429ae5f4463", size = 5984 }, +] + +[[package]] +name = "tomlkit" +version = "0.13.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b1/09/a439bec5888f00a54b8b9f05fa94d7f901d6735ef4e55dcec9bc37b5d8fa/tomlkit-0.13.2.tar.gz", hash = "sha256:fff5fe59a87295b278abd31bec92c15d9bc4a06885ab12bcea52c71119392e79", size = 192885 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f9/b6/a447b5e4ec71e13871be01ba81f5dfc9d0af7e473da256ff46bc0e24026f/tomlkit-0.13.2-py3-none-any.whl", hash = "sha256:7a974427f6e119197f670fbbbeae7bef749a6c14e793db934baefc1b5f03efde", size = 37955 }, +] + +[[package]] +name = "trove-classifiers" +version = "2024.10.21.16" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/99/85/92c2667cf221b37648041ce9319427f92fa76cbec634aad844e67e284706/trove_classifiers-2024.10.21.16.tar.gz", hash = "sha256:17cbd055d67d5e9d9de63293a8732943fabc21574e4c7b74edf112b4928cf5f3", size = 16153 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/35/35/5055ab8d215af853d07bbff1a74edf48f91ed308f037380a5ca52dd73348/trove_classifiers-2024.10.21.16-py3-none-any.whl", hash = "sha256:0fb11f1e995a757807a8ef1c03829fbd4998d817319abcef1f33165750f103be", size = 13546 }, +] + +[[package]] +name = "twine" +version = "5.1.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "importlib-metadata" }, + { name = "keyring" }, + { name = "pkginfo" }, + { name = "readme-renderer" }, + { name = "requests" }, + { name = "requests-toolbelt" }, + { name = "rfc3986" }, + { name = "rich" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/77/68/bd982e5e949ef8334e6f7dcf76ae40922a8750aa2e347291ae1477a4782b/twine-5.1.1.tar.gz", hash = "sha256:9aa0825139c02b3434d913545c7b847a21c835e11597f5255842d457da2322db", size = 225531 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5d/ec/00f9d5fd040ae29867355e559a94e9a8429225a0284a3f5f091a3878bfc0/twine-5.1.1-py3-none-any.whl", hash = "sha256:215dbe7b4b94c2c50a7315c0275d2258399280fbb7d04182c7e55e24b5f93997", size = 38650 }, +] + +[[package]] +name = "typing-extensions" +version = "4.12.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 }, +] + +[[package]] +name = "urllib3" +version = "2.2.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/ed/63/22ba4ebfe7430b76388e7cd448d5478814d3032121827c12a2cc287e2260/urllib3-2.2.3.tar.gz", hash = "sha256:e7d814a81dad81e6caf2ec9fdedb284ecc9c73076b62654547cc64ccdcae26e9", size = 300677 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl", hash = "sha256:ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac", size = 126338 }, +] + +[[package]] +name = "userpath" +version = "1.9.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d5/b7/30753098208505d7ff9be5b3a32112fb8a4cb3ddfccbbb7ba9973f2e29ff/userpath-1.9.2.tar.gz", hash = "sha256:6c52288dab069257cc831846d15d48133522455d4677ee69a9781f11dbefd815", size = 11140 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/43/99/3ec6335ded5b88c2f7ed25c56ffd952546f7ed007ffb1e1539dc3b57015a/userpath-1.9.2-py3-none-any.whl", hash = "sha256:2cbf01a23d655a1ff8fc166dfb78da1b641d1ceabf0fe5f970767d380b14e89d", size = 9065 }, +] + +[[package]] +name = "virtualenv" +version = "20.27.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "distlib" }, + { name = "filelock" }, + { name = "platformdirs" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8c/b3/7b6a79c5c8cf6d90ea681310e169cf2db2884f4d583d16c6e1d5a75a4e04/virtualenv-20.27.1.tar.gz", hash = "sha256:142c6be10212543b32c6c45d3d3893dff89112cc588b7d0879ae5a1ec03a47ba", size = 6491145 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ae/92/78324ff89391e00c8f4cf6b8526c41c6ef36b4ea2d2c132250b1a6fc2b8d/virtualenv-20.27.1-py3-none-any.whl", hash = "sha256:f11f1b8a29525562925f745563bfd48b189450f61fb34c4f9cc79dd5aa32a1f4", size = 3117838 }, +] + +[[package]] +name = "wrapt" +version = "1.16.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/95/4c/063a912e20bcef7124e0df97282a8af3ff3e4b603ce84c481d6d7346be0a/wrapt-1.16.0.tar.gz", hash = "sha256:5f370f952971e7d17c7d1ead40e49f32345a7f7a5373571ef44d800d06b1899d", size = 53972 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a8/c6/5375258add3777494671d8cec27cdf5402abd91016dee24aa2972c61fedf/wrapt-1.16.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ffa565331890b90056c01db69c0fe634a776f8019c143a5ae265f9c6bc4bd6d4", size = 37315 }, + { url = "https://files.pythonhosted.org/packages/32/12/e11adfde33444986135d8881b401e4de6cbb4cced046edc6b464e6ad7547/wrapt-1.16.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e4fdb9275308292e880dcbeb12546df7f3e0f96c6b41197e0cf37d2826359020", size = 38160 }, + { url = "https://files.pythonhosted.org/packages/70/7d/3dcc4a7e96f8d3e398450ec7703db384413f79bd6c0196e0e139055ce00f/wrapt-1.16.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb2dee3874a500de01c93d5c71415fcaef1d858370d405824783e7a8ef5db440", size = 80419 }, + { url = "https://files.pythonhosted.org/packages/d1/c4/8dfdc3c2f0b38be85c8d9fdf0011ebad2f54e40897f9549a356bebb63a97/wrapt-1.16.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2a88e6010048489cda82b1326889ec075a8c856c2e6a256072b28eaee3ccf487", size = 72669 }, + { url = "https://files.pythonhosted.org/packages/49/83/b40bc1ad04a868b5b5bcec86349f06c1ee1ea7afe51dc3e46131e4f39308/wrapt-1.16.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac83a914ebaf589b69f7d0a1277602ff494e21f4c2f743313414378f8f50a4cf", size = 80271 }, + { url = "https://files.pythonhosted.org/packages/19/d4/cd33d3a82df73a064c9b6401d14f346e1d2fb372885f0295516ec08ed2ee/wrapt-1.16.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:73aa7d98215d39b8455f103de64391cb79dfcad601701a3aa0dddacf74911d72", size = 84748 }, + { url = 
"https://files.pythonhosted.org/packages/ef/58/2fde309415b5fa98fd8f5f4a11886cbf276824c4c64d45a39da342fff6fe/wrapt-1.16.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:807cc8543a477ab7422f1120a217054f958a66ef7314f76dd9e77d3f02cdccd0", size = 77522 }, + { url = "https://files.pythonhosted.org/packages/07/44/359e4724a92369b88dbf09878a7cde7393cf3da885567ea898e5904049a3/wrapt-1.16.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:bf5703fdeb350e36885f2875d853ce13172ae281c56e509f4e6eca049bdfb136", size = 84780 }, + { url = "https://files.pythonhosted.org/packages/88/8f/706f2fee019360cc1da652353330350c76aa5746b4e191082e45d6838faf/wrapt-1.16.0-cp310-cp310-win32.whl", hash = "sha256:f6b2d0c6703c988d334f297aa5df18c45e97b0af3679bb75059e0e0bd8b1069d", size = 35335 }, + { url = "https://files.pythonhosted.org/packages/19/2b/548d23362e3002ebbfaefe649b833fa43f6ca37ac3e95472130c4b69e0b4/wrapt-1.16.0-cp310-cp310-win_amd64.whl", hash = "sha256:decbfa2f618fa8ed81c95ee18a387ff973143c656ef800c9f24fb7e9c16054e2", size = 37528 }, + { url = "https://files.pythonhosted.org/packages/fd/03/c188ac517f402775b90d6f312955a5e53b866c964b32119f2ed76315697e/wrapt-1.16.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1a5db485fe2de4403f13fafdc231b0dbae5eca4359232d2efc79025527375b09", size = 37313 }, + { url = "https://files.pythonhosted.org/packages/0f/16/ea627d7817394db04518f62934a5de59874b587b792300991b3c347ff5e0/wrapt-1.16.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:75ea7d0ee2a15733684badb16de6794894ed9c55aa5e9903260922f0482e687d", size = 38164 }, + { url = "https://files.pythonhosted.org/packages/7f/a7/f1212ba098f3de0fd244e2de0f8791ad2539c03bef6c05a9fcb03e45b089/wrapt-1.16.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a452f9ca3e3267cd4d0fcf2edd0d035b1934ac2bd7e0e57ac91ad6b95c0c6389", size = 80890 }, + { url = 
"https://files.pythonhosted.org/packages/b7/96/bb5e08b3d6db003c9ab219c487714c13a237ee7dcc572a555eaf1ce7dc82/wrapt-1.16.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:43aa59eadec7890d9958748db829df269f0368521ba6dc68cc172d5d03ed8060", size = 73118 }, + { url = "https://files.pythonhosted.org/packages/6e/52/2da48b35193e39ac53cfb141467d9f259851522d0e8c87153f0ba4205fb1/wrapt-1.16.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:72554a23c78a8e7aa02abbd699d129eead8b147a23c56e08d08dfc29cfdddca1", size = 80746 }, + { url = "https://files.pythonhosted.org/packages/11/fb/18ec40265ab81c0e82a934de04596b6ce972c27ba2592c8b53d5585e6bcd/wrapt-1.16.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d2efee35b4b0a347e0d99d28e884dfd82797852d62fcd7ebdeee26f3ceb72cf3", size = 85668 }, + { url = "https://files.pythonhosted.org/packages/0f/ef/0ecb1fa23145560431b970418dce575cfaec555ab08617d82eb92afc7ccf/wrapt-1.16.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:6dcfcffe73710be01d90cae08c3e548d90932d37b39ef83969ae135d36ef3956", size = 78556 }, + { url = "https://files.pythonhosted.org/packages/25/62/cd284b2b747f175b5a96cbd8092b32e7369edab0644c45784871528eb852/wrapt-1.16.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:eb6e651000a19c96f452c85132811d25e9264d836951022d6e81df2fff38337d", size = 85712 }, + { url = "https://files.pythonhosted.org/packages/e5/a7/47b7ff74fbadf81b696872d5ba504966591a3468f1bc86bca2f407baef68/wrapt-1.16.0-cp311-cp311-win32.whl", hash = "sha256:66027d667efe95cc4fa945af59f92c5a02c6f5bb6012bff9e60542c74c75c362", size = 35327 }, + { url = "https://files.pythonhosted.org/packages/cf/c3/0084351951d9579ae83a3d9e38c140371e4c6b038136909235079f2e6e78/wrapt-1.16.0-cp311-cp311-win_amd64.whl", hash = "sha256:aefbc4cb0a54f91af643660a0a150ce2c090d3652cf4052a5397fb2de549cd89", size = 37523 }, + { url = 
"https://files.pythonhosted.org/packages/92/17/224132494c1e23521868cdd57cd1e903f3b6a7ba6996b7b8f077ff8ac7fe/wrapt-1.16.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:5eb404d89131ec9b4f748fa5cfb5346802e5ee8836f57d516576e61f304f3b7b", size = 37614 }, + { url = "https://files.pythonhosted.org/packages/6a/d7/cfcd73e8f4858079ac59d9db1ec5a1349bc486ae8e9ba55698cc1f4a1dff/wrapt-1.16.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9090c9e676d5236a6948330e83cb89969f433b1943a558968f659ead07cb3b36", size = 38316 }, + { url = "https://files.pythonhosted.org/packages/7e/79/5ff0a5c54bda5aec75b36453d06be4f83d5cd4932cc84b7cb2b52cee23e2/wrapt-1.16.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94265b00870aa407bd0cbcfd536f17ecde43b94fb8d228560a1e9d3041462d73", size = 86322 }, + { url = "https://files.pythonhosted.org/packages/c4/81/e799bf5d419f422d8712108837c1d9bf6ebe3cb2a81ad94413449543a923/wrapt-1.16.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f2058f813d4f2b5e3a9eb2eb3faf8f1d99b81c3e51aeda4b168406443e8ba809", size = 79055 }, + { url = "https://files.pythonhosted.org/packages/62/62/30ca2405de6a20448ee557ab2cd61ab9c5900be7cbd18a2639db595f0b98/wrapt-1.16.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98b5e1f498a8ca1858a1cdbffb023bfd954da4e3fa2c0cb5853d40014557248b", size = 87291 }, + { url = "https://files.pythonhosted.org/packages/49/4e/5d2f6d7b57fc9956bf06e944eb00463551f7d52fc73ca35cfc4c2cdb7aed/wrapt-1.16.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:14d7dc606219cdd7405133c713f2c218d4252f2a469003f8c46bb92d5d095d81", size = 90374 }, + { url = "https://files.pythonhosted.org/packages/a6/9b/c2c21b44ff5b9bf14a83252a8b973fb84923764ff63db3e6dfc3895cf2e0/wrapt-1.16.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:49aac49dc4782cb04f58986e81ea0b4768e4ff197b57324dcbd7699c5dfb40b9", size = 83896 }, + 
{ url = "https://files.pythonhosted.org/packages/14/26/93a9fa02c6f257df54d7570dfe8011995138118d11939a4ecd82cb849613/wrapt-1.16.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:418abb18146475c310d7a6dc71143d6f7adec5b004ac9ce08dc7a34e2babdc5c", size = 91738 }, + { url = "https://files.pythonhosted.org/packages/a2/5b/4660897233eb2c8c4de3dc7cefed114c61bacb3c28327e64150dc44ee2f6/wrapt-1.16.0-cp312-cp312-win32.whl", hash = "sha256:685f568fa5e627e93f3b52fda002c7ed2fa1800b50ce51f6ed1d572d8ab3e7fc", size = 35568 }, + { url = "https://files.pythonhosted.org/packages/5c/cc/8297f9658506b224aa4bd71906447dea6bb0ba629861a758c28f67428b91/wrapt-1.16.0-cp312-cp312-win_amd64.whl", hash = "sha256:dcdba5c86e368442528f7060039eda390cc4091bfd1dca41e8046af7c910dda8", size = 37653 }, + { url = "https://files.pythonhosted.org/packages/fe/9e/d3bc95e75670ba15c5b25ecf07fc49941843e2678d777ca59339348d1c96/wrapt-1.16.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1dd50a2696ff89f57bd8847647a1c363b687d3d796dc30d4dd4a9d1689a706f0", size = 37320 }, + { url = "https://files.pythonhosted.org/packages/72/b5/0c9be75f826c8e8d583a4ab312552d63d9f7c0768710146a22ac59bda4a9/wrapt-1.16.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:44a2754372e32ab315734c6c73b24351d06e77ffff6ae27d2ecf14cf3d229202", size = 38163 }, + { url = "https://files.pythonhosted.org/packages/69/21/b2ba809bafc9b6265e359f9c259c6d9a52a16cf6be20c72d95e76da609dd/wrapt-1.16.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e9723528b9f787dc59168369e42ae1c3b0d3fadb2f1a71de14531d321ee05b0", size = 83535 }, + { url = "https://files.pythonhosted.org/packages/58/43/d72e625edb5926483c9868214d25b5e7d5858ace6a80c9dfddfbadf4d8f9/wrapt-1.16.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dbed418ba5c3dce92619656802cc5355cb679e58d0d89b50f116e4a9d5a9603e", size = 75975 }, + { url = 
"https://files.pythonhosted.org/packages/ef/c6/56e718e2c58a4078518c14d97e531ef1e9e8a5c1ddafdc0d264a92be1a1a/wrapt-1.16.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:941988b89b4fd6b41c3f0bfb20e92bd23746579736b7343283297c4c8cbae68f", size = 83363 }, + { url = "https://files.pythonhosted.org/packages/34/49/589db6fa2d5d428b71716815bca8b39196fdaeea7c247a719ed2f93b0ab4/wrapt-1.16.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:6a42cd0cfa8ffc1915aef79cb4284f6383d8a3e9dcca70c445dcfdd639d51267", size = 87739 }, + { url = "https://files.pythonhosted.org/packages/c5/40/3eabe06c8dc54fada7364f34e8caa562efe3bf3f769bf3258de9c785a27f/wrapt-1.16.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:1ca9b6085e4f866bd584fb135a041bfc32cab916e69f714a7d1d397f8c4891ca", size = 80700 }, + { url = "https://files.pythonhosted.org/packages/15/4e/081f59237b620a124b035f1229f55db40841a9339fdb8ef60b4decc44df9/wrapt-1.16.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d5e49454f19ef621089e204f862388d29e6e8d8b162efce05208913dde5b9ad6", size = 87783 }, + { url = "https://files.pythonhosted.org/packages/3a/ad/9d26a33bc80444ff97b937f94611f3b986fd40f735823558dfdf05ef9db8/wrapt-1.16.0-cp38-cp38-win32.whl", hash = "sha256:c31f72b1b6624c9d863fc095da460802f43a7c6868c5dda140f51da24fd47d7b", size = 35332 }, + { url = "https://files.pythonhosted.org/packages/01/db/4b29ba5f97d2a0aa97ec41eba1036b7c3eaf6e61e1f4639420cec2463a01/wrapt-1.16.0-cp38-cp38-win_amd64.whl", hash = "sha256:490b0ee15c1a55be9c1bd8609b8cecd60e325f0575fc98f50058eae366e01f41", size = 37524 }, + { url = "https://files.pythonhosted.org/packages/70/cc/b92e1da2cad6a9f8ee481000ece07a35e3b24e041e60ff8b850c079f0ebf/wrapt-1.16.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9b201ae332c3637a42f02d1045e1d0cccfdc41f1f2f801dafbaa7e9b4797bfc2", size = 37314 }, + { url = 
"https://files.pythonhosted.org/packages/4a/cc/3402bcc897978be00fef608cd9e3e39ec8869c973feeb5e1e277670e5ad2/wrapt-1.16.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2076fad65c6736184e77d7d4729b63a6d1ae0b70da4868adeec40989858eb3fb", size = 38162 }, + { url = "https://files.pythonhosted.org/packages/28/d3/4f079f649c515727c127c987b2ec2e0816b80d95784f2d28d1a57d2a1029/wrapt-1.16.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5cd603b575ebceca7da5a3a251e69561bec509e0b46e4993e1cac402b7247b8", size = 80235 }, + { url = "https://files.pythonhosted.org/packages/a3/1c/226c2a4932e578a2241dcb383f425995f80224b446f439c2e112eb51c3a6/wrapt-1.16.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b47cfad9e9bbbed2339081f4e346c93ecd7ab504299403320bf85f7f85c7d46c", size = 72553 }, + { url = "https://files.pythonhosted.org/packages/b1/e7/459a8a4f40f2fa65eb73cb3f339e6d152957932516d18d0e996c7ae2d7ae/wrapt-1.16.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8212564d49c50eb4565e502814f694e240c55551a5f1bc841d4fcaabb0a9b8a", size = 80129 }, + { url = "https://files.pythonhosted.org/packages/da/6f/6d0b3c4983f1fc764a422989dabc268ee87d937763246cd48aa92f1eed1e/wrapt-1.16.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5f15814a33e42b04e3de432e573aa557f9f0f56458745c2074952f564c50e664", size = 84550 }, + { url = "https://files.pythonhosted.org/packages/96/e8/27ef35cf61e5147c1c3abcb89cfbb8d691b2bb8364803fcc950140bc14d8/wrapt-1.16.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:db2e408d983b0e61e238cf579c09ef7020560441906ca990fe8412153e3b291f", size = 77352 }, + { url = "https://files.pythonhosted.org/packages/b6/ad/7a0766341081bfd9f18a7049e4d6d45586ae5c5bb0a640f05e2f558e849c/wrapt-1.16.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:edfad1d29c73f9b863ebe7082ae9321374ccb10879eeabc84ba3b69f2579d537", size = 84626 }, + { url = 
"https://files.pythonhosted.org/packages/09/43/b26852e9c45a1aac0d14b1080b25b612fa840ba99739c5fc55db07b7ce08/wrapt-1.16.0-cp39-cp39-win32.whl", hash = "sha256:ed867c42c268f876097248e05b6117a65bcd1e63b779e916fe2e33cd6fd0d3c3", size = 35327 }, + { url = "https://files.pythonhosted.org/packages/74/f2/96ed140b08743f7f68d5bda35a2a589600781366c3da96f056043d258b1a/wrapt-1.16.0-cp39-cp39-win_amd64.whl", hash = "sha256:eb1b046be06b0fce7249f1d025cd359b4b80fc1c3e24ad9eca33e0dcdb2e4a35", size = 37526 }, + { url = "https://files.pythonhosted.org/packages/ff/21/abdedb4cdf6ff41ebf01a74087740a709e2edb146490e4d9beea054b0b7a/wrapt-1.16.0-py3-none-any.whl", hash = "sha256:6906c4100a8fcbf2fa735f6059214bb13b97f75b1a61777fcf6432121ef12ef1", size = 23362 }, +] + +[[package]] +name = "zipp" +version = "3.20.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/54/bf/5c0000c44ebc80123ecbdddba1f5dcd94a5ada602a9c225d84b5aaa55e86/zipp-3.20.2.tar.gz", hash = "sha256:bc9eb26f4506fda01b81bcde0ca78103b6e62f991b381fec825435c836edbc29", size = 24199 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/62/8b/5ba542fa83c90e09eac972fc9baca7a88e7e7ca4b221a89251954019308b/zipp-3.20.2-py3-none-any.whl", hash = "sha256:a817ac80d6cf4b23bf7f2828b7cabf326f15a001bea8b1f9b49631780ba28350", size = 9200 }, +] + +[[package]] +name = "zstandard" +version = "0.23.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi", marker = "platform_python_implementation == 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ed/f6/2ac0287b442160a89d726b17a9184a4c615bb5237db763791a7fd16d9df1/zstandard-0.23.0.tar.gz", hash = "sha256:b2d8c62d08e7255f68f7a740bae85b3c9b8e5466baa9cbf7f57f1cde0ac6bc09", size = 681701 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2a/55/bd0487e86679db1823fc9ee0d8c9c78ae2413d34c0b461193b5f4c31d22f/zstandard-0.23.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = 
"sha256:bf0a05b6059c0528477fba9054d09179beb63744355cab9f38059548fedd46a9", size = 788701 }, + { url = "https://files.pythonhosted.org/packages/e1/8a/ccb516b684f3ad987dfee27570d635822e3038645b1a950c5e8022df1145/zstandard-0.23.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fc9ca1c9718cb3b06634c7c8dec57d24e9438b2aa9a0f02b8bb36bf478538880", size = 633678 }, + { url = "https://files.pythonhosted.org/packages/12/89/75e633d0611c028e0d9af6df199423bf43f54bea5007e6718ab7132e234c/zstandard-0.23.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77da4c6bfa20dd5ea25cbf12c76f181a8e8cd7ea231c673828d0386b1740b8dc", size = 4941098 }, + { url = "https://files.pythonhosted.org/packages/4a/7a/bd7f6a21802de358b63f1ee636ab823711c25ce043a3e9f043b4fcb5ba32/zstandard-0.23.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b2170c7e0367dde86a2647ed5b6f57394ea7f53545746104c6b09fc1f4223573", size = 5308798 }, + { url = "https://files.pythonhosted.org/packages/79/3b/775f851a4a65013e88ca559c8ae42ac1352db6fcd96b028d0df4d7d1d7b4/zstandard-0.23.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c16842b846a8d2a145223f520b7e18b57c8f476924bda92aeee3a88d11cfc391", size = 5341840 }, + { url = "https://files.pythonhosted.org/packages/09/4f/0cc49570141dd72d4d95dd6fcf09328d1b702c47a6ec12fbed3b8aed18a5/zstandard-0.23.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:157e89ceb4054029a289fb504c98c6a9fe8010f1680de0201b3eb5dc20aa6d9e", size = 5440337 }, + { url = "https://files.pythonhosted.org/packages/e7/7c/aaa7cd27148bae2dc095191529c0570d16058c54c4597a7d118de4b21676/zstandard-0.23.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:203d236f4c94cd8379d1ea61db2fce20730b4c38d7f1c34506a31b34edc87bdd", size = 4861182 }, + { url = 
"https://files.pythonhosted.org/packages/ac/eb/4b58b5c071d177f7dc027129d20bd2a44161faca6592a67f8fcb0b88b3ae/zstandard-0.23.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:dc5d1a49d3f8262be192589a4b72f0d03b72dcf46c51ad5852a4fdc67be7b9e4", size = 4932936 }, + { url = "https://files.pythonhosted.org/packages/44/f9/21a5fb9bb7c9a274b05ad700a82ad22ce82f7ef0f485980a1e98ed6e8c5f/zstandard-0.23.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:752bf8a74412b9892f4e5b58f2f890a039f57037f52c89a740757ebd807f33ea", size = 5464705 }, + { url = "https://files.pythonhosted.org/packages/49/74/b7b3e61db3f88632776b78b1db597af3f44c91ce17d533e14a25ce6a2816/zstandard-0.23.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:80080816b4f52a9d886e67f1f96912891074903238fe54f2de8b786f86baded2", size = 4857882 }, + { url = "https://files.pythonhosted.org/packages/4a/7f/d8eb1cb123d8e4c541d4465167080bec88481ab54cd0b31eb4013ba04b95/zstandard-0.23.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:84433dddea68571a6d6bd4fbf8ff398236031149116a7fff6f777ff95cad3df9", size = 4697672 }, + { url = "https://files.pythonhosted.org/packages/5e/05/f7dccdf3d121309b60342da454d3e706453a31073e2c4dac8e1581861e44/zstandard-0.23.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:ab19a2d91963ed9e42b4e8d77cd847ae8381576585bad79dbd0a8837a9f6620a", size = 5206043 }, + { url = "https://files.pythonhosted.org/packages/86/9d/3677a02e172dccd8dd3a941307621c0cbd7691d77cb435ac3c75ab6a3105/zstandard-0.23.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:59556bf80a7094d0cfb9f5e50bb2db27fefb75d5138bb16fb052b61b0e0eeeb0", size = 5667390 }, + { url = "https://files.pythonhosted.org/packages/41/7e/0012a02458e74a7ba122cd9cafe491facc602c9a17f590367da369929498/zstandard-0.23.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:27d3ef2252d2e62476389ca8f9b0cf2bbafb082a3b6bfe9d90cbcbb5529ecf7c", size = 5198901 }, + { url = 
"https://files.pythonhosted.org/packages/65/3a/8f715b97bd7bcfc7342d8adcd99a026cb2fb550e44866a3b6c348e1b0f02/zstandard-0.23.0-cp310-cp310-win32.whl", hash = "sha256:5d41d5e025f1e0bccae4928981e71b2334c60f580bdc8345f824e7c0a4c2a813", size = 430596 }, + { url = "https://files.pythonhosted.org/packages/19/b7/b2b9eca5e5a01111e4fe8a8ffb56bdcdf56b12448a24effe6cfe4a252034/zstandard-0.23.0-cp310-cp310-win_amd64.whl", hash = "sha256:519fbf169dfac1222a76ba8861ef4ac7f0530c35dd79ba5727014613f91613d4", size = 495498 }, + { url = "https://files.pythonhosted.org/packages/9e/40/f67e7d2c25a0e2dc1744dd781110b0b60306657f8696cafb7ad7579469bd/zstandard-0.23.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:34895a41273ad33347b2fc70e1bff4240556de3c46c6ea430a7ed91f9042aa4e", size = 788699 }, + { url = "https://files.pythonhosted.org/packages/e8/46/66d5b55f4d737dd6ab75851b224abf0afe5774976fe511a54d2eb9063a41/zstandard-0.23.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:77ea385f7dd5b5676d7fd943292ffa18fbf5c72ba98f7d09fc1fb9e819b34c23", size = 633681 }, + { url = "https://files.pythonhosted.org/packages/63/b6/677e65c095d8e12b66b8f862b069bcf1f1d781b9c9c6f12eb55000d57583/zstandard-0.23.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:983b6efd649723474f29ed42e1467f90a35a74793437d0bc64a5bf482bedfa0a", size = 4944328 }, + { url = "https://files.pythonhosted.org/packages/59/cc/e76acb4c42afa05a9d20827116d1f9287e9c32b7ad58cc3af0721ce2b481/zstandard-0.23.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:80a539906390591dd39ebb8d773771dc4db82ace6372c4d41e2d293f8e32b8db", size = 5311955 }, + { url = "https://files.pythonhosted.org/packages/78/e4/644b8075f18fc7f632130c32e8f36f6dc1b93065bf2dd87f03223b187f26/zstandard-0.23.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:445e4cb5048b04e90ce96a79b4b63140e3f4ab5f662321975679b5f6360b90e2", size = 5344944 }, + { url = 
"https://files.pythonhosted.org/packages/76/3f/dbafccf19cfeca25bbabf6f2dd81796b7218f768ec400f043edc767015a6/zstandard-0.23.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd30d9c67d13d891f2360b2a120186729c111238ac63b43dbd37a5a40670b8ca", size = 5442927 }, + { url = "https://files.pythonhosted.org/packages/0c/c3/d24a01a19b6733b9f218e94d1a87c477d523237e07f94899e1c10f6fd06c/zstandard-0.23.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d20fd853fbb5807c8e84c136c278827b6167ded66c72ec6f9a14b863d809211c", size = 4864910 }, + { url = "https://files.pythonhosted.org/packages/1c/a9/cf8f78ead4597264f7618d0875be01f9bc23c9d1d11afb6d225b867cb423/zstandard-0.23.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ed1708dbf4d2e3a1c5c69110ba2b4eb6678262028afd6c6fbcc5a8dac9cda68e", size = 4935544 }, + { url = "https://files.pythonhosted.org/packages/2c/96/8af1e3731b67965fb995a940c04a2c20997a7b3b14826b9d1301cf160879/zstandard-0.23.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:be9b5b8659dff1f913039c2feee1aca499cfbc19e98fa12bc85e037c17ec6ca5", size = 5467094 }, + { url = "https://files.pythonhosted.org/packages/ff/57/43ea9df642c636cb79f88a13ab07d92d88d3bfe3e550b55a25a07a26d878/zstandard-0.23.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:65308f4b4890aa12d9b6ad9f2844b7ee42c7f7a4fd3390425b242ffc57498f48", size = 4860440 }, + { url = "https://files.pythonhosted.org/packages/46/37/edb78f33c7f44f806525f27baa300341918fd4c4af9472fbc2c3094be2e8/zstandard-0.23.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:98da17ce9cbf3bfe4617e836d561e433f871129e3a7ac16d6ef4c680f13a839c", size = 4700091 }, + { url = "https://files.pythonhosted.org/packages/c1/f1/454ac3962671a754f3cb49242472df5c2cced4eb959ae203a377b45b1a3c/zstandard-0.23.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:8ed7d27cb56b3e058d3cf684d7200703bcae623e1dcc06ed1e18ecda39fee003", size = 5208682 }, + { url = 
"https://files.pythonhosted.org/packages/85/b2/1734b0fff1634390b1b887202d557d2dd542de84a4c155c258cf75da4773/zstandard-0.23.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:b69bb4f51daf461b15e7b3db033160937d3ff88303a7bc808c67bbc1eaf98c78", size = 5669707 }, + { url = "https://files.pythonhosted.org/packages/52/5a/87d6971f0997c4b9b09c495bf92189fb63de86a83cadc4977dc19735f652/zstandard-0.23.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:034b88913ecc1b097f528e42b539453fa82c3557e414b3de9d5632c80439a473", size = 5201792 }, + { url = "https://files.pythonhosted.org/packages/79/02/6f6a42cc84459d399bd1a4e1adfc78d4dfe45e56d05b072008d10040e13b/zstandard-0.23.0-cp311-cp311-win32.whl", hash = "sha256:f2d4380bf5f62daabd7b751ea2339c1a21d1c9463f1feb7fc2bdcea2c29c3160", size = 430586 }, + { url = "https://files.pythonhosted.org/packages/be/a2/4272175d47c623ff78196f3c10e9dc7045c1b9caf3735bf041e65271eca4/zstandard-0.23.0-cp311-cp311-win_amd64.whl", hash = "sha256:62136da96a973bd2557f06ddd4e8e807f9e13cbb0bfb9cc06cfe6d98ea90dfe0", size = 495420 }, + { url = "https://files.pythonhosted.org/packages/7b/83/f23338c963bd9de687d47bf32efe9fd30164e722ba27fb59df33e6b1719b/zstandard-0.23.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:b4567955a6bc1b20e9c31612e615af6b53733491aeaa19a6b3b37f3b65477094", size = 788713 }, + { url = "https://files.pythonhosted.org/packages/5b/b3/1a028f6750fd9227ee0b937a278a434ab7f7fdc3066c3173f64366fe2466/zstandard-0.23.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e172f57cd78c20f13a3415cc8dfe24bf388614324d25539146594c16d78fcc8", size = 633459 }, + { url = "https://files.pythonhosted.org/packages/26/af/36d89aae0c1f95a0a98e50711bc5d92c144939efc1f81a2fcd3e78d7f4c1/zstandard-0.23.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b0e166f698c5a3e914947388c162be2583e0c638a4703fc6a543e23a88dea3c1", size = 4945707 }, + { url = 
"https://files.pythonhosted.org/packages/cd/2e/2051f5c772f4dfc0aae3741d5fc72c3dcfe3aaeb461cc231668a4db1ce14/zstandard-0.23.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:12a289832e520c6bd4dcaad68e944b86da3bad0d339ef7989fb7e88f92e96072", size = 5306545 }, + { url = "https://files.pythonhosted.org/packages/0a/9e/a11c97b087f89cab030fa71206963090d2fecd8eb83e67bb8f3ffb84c024/zstandard-0.23.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d50d31bfedd53a928fed6707b15a8dbeef011bb6366297cc435accc888b27c20", size = 5337533 }, + { url = "https://files.pythonhosted.org/packages/fc/79/edeb217c57fe1bf16d890aa91a1c2c96b28c07b46afed54a5dcf310c3f6f/zstandard-0.23.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:72c68dda124a1a138340fb62fa21b9bf4848437d9ca60bd35db36f2d3345f373", size = 5436510 }, + { url = "https://files.pythonhosted.org/packages/81/4f/c21383d97cb7a422ddf1ae824b53ce4b51063d0eeb2afa757eb40804a8ef/zstandard-0.23.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:53dd9d5e3d29f95acd5de6802e909ada8d8d8cfa37a3ac64836f3bc4bc5512db", size = 4859973 }, + { url = "https://files.pythonhosted.org/packages/ab/15/08d22e87753304405ccac8be2493a495f529edd81d39a0870621462276ef/zstandard-0.23.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:6a41c120c3dbc0d81a8e8adc73312d668cd34acd7725f036992b1b72d22c1772", size = 4936968 }, + { url = "https://files.pythonhosted.org/packages/eb/fa/f3670a597949fe7dcf38119a39f7da49a8a84a6f0b1a2e46b2f71a0ab83f/zstandard-0.23.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:40b33d93c6eddf02d2c19f5773196068d875c41ca25730e8288e9b672897c105", size = 5467179 }, + { url = "https://files.pythonhosted.org/packages/4e/a9/dad2ab22020211e380adc477a1dbf9f109b1f8d94c614944843e20dc2a99/zstandard-0.23.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:9206649ec587e6b02bd124fb7799b86cddec350f6f6c14bc82a2b70183e708ba", size = 4848577 }, + { url = "https://files.pythonhosted.org/packages/08/03/dd28b4484b0770f1e23478413e01bee476ae8227bbc81561f9c329e12564/zstandard-0.23.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:76e79bc28a65f467e0409098fa2c4376931fd3207fbeb6b956c7c476d53746dd", size = 4693899 }, + { url = "https://files.pythonhosted.org/packages/2b/64/3da7497eb635d025841e958bcd66a86117ae320c3b14b0ae86e9e8627518/zstandard-0.23.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:66b689c107857eceabf2cf3d3fc699c3c0fe8ccd18df2219d978c0283e4c508a", size = 5199964 }, + { url = "https://files.pythonhosted.org/packages/43/a4/d82decbab158a0e8a6ebb7fc98bc4d903266bce85b6e9aaedea1d288338c/zstandard-0.23.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:9c236e635582742fee16603042553d276cca506e824fa2e6489db04039521e90", size = 5655398 }, + { url = "https://files.pythonhosted.org/packages/f2/61/ac78a1263bc83a5cf29e7458b77a568eda5a8f81980691bbc6eb6a0d45cc/zstandard-0.23.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a8fffdbd9d1408006baaf02f1068d7dd1f016c6bcb7538682622c556e7b68e35", size = 5191313 }, + { url = "https://files.pythonhosted.org/packages/e7/54/967c478314e16af5baf849b6ee9d6ea724ae5b100eb506011f045d3d4e16/zstandard-0.23.0-cp312-cp312-win32.whl", hash = "sha256:dc1d33abb8a0d754ea4763bad944fd965d3d95b5baef6b121c0c9013eaf1907d", size = 430877 }, + { url = "https://files.pythonhosted.org/packages/75/37/872d74bd7739639c4553bf94c84af7d54d8211b626b352bc57f0fd8d1e3f/zstandard-0.23.0-cp312-cp312-win_amd64.whl", hash = "sha256:64585e1dba664dc67c7cdabd56c1e5685233fbb1fc1966cfba2a340ec0dfff7b", size = 495595 }, + { url = "https://files.pythonhosted.org/packages/80/f1/8386f3f7c10261fe85fbc2c012fdb3d4db793b921c9abcc995d8da1b7a80/zstandard-0.23.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:576856e8594e6649aee06ddbfc738fec6a834f7c85bf7cadd1c53d4a58186ef9", size = 788975 }, + { url = 
"https://files.pythonhosted.org/packages/16/e8/cbf01077550b3e5dc86089035ff8f6fbbb312bc0983757c2d1117ebba242/zstandard-0.23.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:38302b78a850ff82656beaddeb0bb989a0322a8bbb1bf1ab10c17506681d772a", size = 633448 }, + { url = "https://files.pythonhosted.org/packages/06/27/4a1b4c267c29a464a161aeb2589aff212b4db653a1d96bffe3598f3f0d22/zstandard-0.23.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d2240ddc86b74966c34554c49d00eaafa8200a18d3a5b6ffbf7da63b11d74ee2", size = 4945269 }, + { url = "https://files.pythonhosted.org/packages/7c/64/d99261cc57afd9ae65b707e38045ed8269fbdae73544fd2e4a4d50d0ed83/zstandard-0.23.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2ef230a8fd217a2015bc91b74f6b3b7d6522ba48be29ad4ea0ca3a3775bf7dd5", size = 5306228 }, + { url = "https://files.pythonhosted.org/packages/7a/cf/27b74c6f22541f0263016a0fd6369b1b7818941de639215c84e4e94b2a1c/zstandard-0.23.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:774d45b1fac1461f48698a9d4b5fa19a69d47ece02fa469825b442263f04021f", size = 5336891 }, + { url = "https://files.pythonhosted.org/packages/fa/18/89ac62eac46b69948bf35fcd90d37103f38722968e2981f752d69081ec4d/zstandard-0.23.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6f77fa49079891a4aab203d0b1744acc85577ed16d767b52fc089d83faf8d8ed", size = 5436310 }, + { url = "https://files.pythonhosted.org/packages/a8/a8/5ca5328ee568a873f5118d5b5f70d1f36c6387716efe2e369010289a5738/zstandard-0.23.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ac184f87ff521f4840e6ea0b10c0ec90c6b1dcd0bad2f1e4a9a1b4fa177982ea", size = 4859912 }, + { url = "https://files.pythonhosted.org/packages/ea/ca/3781059c95fd0868658b1cf0440edd832b942f84ae60685d0cfdb808bca1/zstandard-0.23.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = 
"sha256:c363b53e257246a954ebc7c488304b5592b9c53fbe74d03bc1c64dda153fb847", size = 4936946 }, + { url = "https://files.pythonhosted.org/packages/ce/11/41a58986f809532742c2b832c53b74ba0e0a5dae7e8ab4642bf5876f35de/zstandard-0.23.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:e7792606d606c8df5277c32ccb58f29b9b8603bf83b48639b7aedf6df4fe8171", size = 5466994 }, + { url = "https://files.pythonhosted.org/packages/83/e3/97d84fe95edd38d7053af05159465d298c8b20cebe9ccb3d26783faa9094/zstandard-0.23.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a0817825b900fcd43ac5d05b8b3079937073d2b1ff9cf89427590718b70dd840", size = 4848681 }, + { url = "https://files.pythonhosted.org/packages/6e/99/cb1e63e931de15c88af26085e3f2d9af9ce53ccafac73b6e48418fd5a6e6/zstandard-0.23.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:9da6bc32faac9a293ddfdcb9108d4b20416219461e4ec64dfea8383cac186690", size = 4694239 }, + { url = "https://files.pythonhosted.org/packages/ab/50/b1e703016eebbc6501fc92f34db7b1c68e54e567ef39e6e59cf5fb6f2ec0/zstandard-0.23.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:fd7699e8fd9969f455ef2926221e0233f81a2542921471382e77a9e2f2b57f4b", size = 5200149 }, + { url = "https://files.pythonhosted.org/packages/aa/e0/932388630aaba70197c78bdb10cce2c91fae01a7e553b76ce85471aec690/zstandard-0.23.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:d477ed829077cd945b01fc3115edd132c47e6540ddcd96ca169facff28173057", size = 5655392 }, + { url = "https://files.pythonhosted.org/packages/02/90/2633473864f67a15526324b007a9f96c96f56d5f32ef2a56cc12f9548723/zstandard-0.23.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa6ce8b52c5987b3e34d5674b0ab529a4602b632ebab0a93b07bfb4dfc8f8a33", size = 5191299 }, + { url = "https://files.pythonhosted.org/packages/b0/4c/315ca5c32da7e2dc3455f3b2caee5c8c2246074a61aac6ec3378a97b7136/zstandard-0.23.0-cp313-cp313-win32.whl", hash = "sha256:a9b07268d0c3ca5c170a385a0ab9fb7fdd9f5fd866be004c4ea39e44edce47dd", size = 430862 }, + { 
url = "https://files.pythonhosted.org/packages/a2/bf/c6aaba098e2d04781e8f4f7c0ba3c7aa73d00e4c436bcc0cf059a66691d1/zstandard-0.23.0-cp313-cp313-win_amd64.whl", hash = "sha256:f3513916e8c645d0610815c257cbfd3242adfd5c4cfa78be514e5a3ebb42a41b", size = 495578 }, + { url = "https://files.pythonhosted.org/packages/fb/96/867dd4f5e9ee6215f83985c43f4134b28c058617a7af8ad9592669f960dd/zstandard-0.23.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2ef3775758346d9ac6214123887d25c7061c92afe1f2b354f9388e9e4d48acfc", size = 788685 }, + { url = "https://files.pythonhosted.org/packages/19/57/e81579db7740757036e97dc461f4f26a318fe8dfc6b3477dd557b7f85aae/zstandard-0.23.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4051e406288b8cdbb993798b9a45c59a4896b6ecee2f875424ec10276a895740", size = 633665 }, + { url = "https://files.pythonhosted.org/packages/ac/a5/b8c9d79511796684a2a653843e0464dfcc11a052abb5855af7035d919ecc/zstandard-0.23.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e2d1a054f8f0a191004675755448d12be47fa9bebbcffa3cdf01db19f2d30a54", size = 4944817 }, + { url = "https://files.pythonhosted.org/packages/fa/59/ee5a3c4f060c431d3aaa7ff2b435d9723c579bffda274d071c981bf08b17/zstandard-0.23.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f83fa6cae3fff8e98691248c9320356971b59678a17f20656a9e59cd32cee6d8", size = 5311485 }, + { url = "https://files.pythonhosted.org/packages/8a/70/ea438a09d757d49c5bb73a895c13492277b83981c08ed294441b1965eaf2/zstandard-0.23.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:32ba3b5ccde2d581b1e6aa952c836a6291e8435d788f656fe5976445865ae045", size = 5340843 }, + { url = "https://files.pythonhosted.org/packages/1c/4b/be9f3f9ed33ff4d5e578cf167c16ac1d8542232d5e4831c49b615b5918a6/zstandard-0.23.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2f146f50723defec2975fb7e388ae3a024eb7151542d1599527ec2aa9cacb152", size = 5442446 }, + { url = 
"https://files.pythonhosted.org/packages/ef/17/55eff9df9004e1896f2ade19981e7cd24d06b463fe72f9a61f112b8185d0/zstandard-0.23.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1bfe8de1da6d104f15a60d4a8a768288f66aa953bbe00d027398b93fb9680b26", size = 4863800 }, + { url = "https://files.pythonhosted.org/packages/59/8c/fe542982e63e1948066bf2adc18e902196eb08f3407188474b5a4e855e2e/zstandard-0.23.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:29a2bc7c1b09b0af938b7a8343174b987ae021705acabcbae560166567f5a8db", size = 4935488 }, + { url = "https://files.pythonhosted.org/packages/38/6c/a54e30864aff0cc065c053fbdb581114328f70f45f30fcb0f80b12bb4460/zstandard-0.23.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:61f89436cbfede4bc4e91b4397eaa3e2108ebe96d05e93d6ccc95ab5714be512", size = 5467670 }, + { url = "https://files.pythonhosted.org/packages/ba/11/32788cc80aa8c1069a9fdc48a60355bd25ac8211b2414dd0ff6ee6bb5ff5/zstandard-0.23.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:53ea7cdc96c6eb56e76bb06894bcfb5dfa93b7adcf59d61c6b92674e24e2dd5e", size = 4859904 }, + { url = "https://files.pythonhosted.org/packages/60/93/baf7ad86b2258c08c06bdccdaddeb3d6d0918601e16fa9c73c8079c8c816/zstandard-0.23.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:a4ae99c57668ca1e78597d8b06d5af837f377f340f4cce993b551b2d7731778d", size = 4700723 }, + { url = "https://files.pythonhosted.org/packages/95/bd/e65f1c1e0185ed0c7f5bda51b0d73fc379a75f5dc2583aac83dd131378dc/zstandard-0.23.0-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:379b378ae694ba78cef921581ebd420c938936a153ded602c4fea612b7eaa90d", size = 5208667 }, + { url = "https://files.pythonhosted.org/packages/dc/cf/2dfa4610829c6c1dbc3ce858caed6de13928bec78c1e4d0bedfd4b20589b/zstandard-0.23.0-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:50a80baba0285386f97ea36239855f6020ce452456605f262b2d33ac35c7770b", size = 5667083 }, + { url = 
"https://files.pythonhosted.org/packages/16/f6/d84d95984fb9c8f57747ffeff66677f0a58acf430f9ddff84bc3b9aad35d/zstandard-0.23.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:61062387ad820c654b6a6b5f0b94484fa19515e0c5116faf29f41a6bc91ded6e", size = 5195874 }, + { url = "https://files.pythonhosted.org/packages/fc/a6/239f43f2e3ea0360c5641c075bd587c7f2a32b29d9ba53a538435621bcbb/zstandard-0.23.0-cp38-cp38-win32.whl", hash = "sha256:b8c0bd73aeac689beacd4e7667d48c299f61b959475cdbb91e7d3d88d27c56b9", size = 430654 }, + { url = "https://files.pythonhosted.org/packages/d5/b6/16e737301831c9c62379ed466c3d916c56b8a9a95fbce9bf1d7fea318945/zstandard-0.23.0-cp38-cp38-win_amd64.whl", hash = "sha256:a05e6d6218461eb1b4771d973728f0133b2a4613a6779995df557f70794fd60f", size = 495519 }, + { url = "https://files.pythonhosted.org/packages/fb/96/4fcafeb7e013a2386d22f974b5b97a0b9a65004ed58c87ae001599bfbd48/zstandard-0.23.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3aa014d55c3af933c1315eb4bb06dd0459661cc0b15cd61077afa6489bec63bb", size = 788697 }, + { url = "https://files.pythonhosted.org/packages/83/ff/a52ce725be69b86a2967ecba0497a8184540cc284c0991125515449e54e2/zstandard-0.23.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7f0804bb3799414af278e9ad51be25edf67f78f916e08afdb983e74161b916", size = 633679 }, + { url = "https://files.pythonhosted.org/packages/34/0f/3dc62db122f6a9c481c335fff6fc9f4e88d8f6e2d47321ee3937328addb4/zstandard-0.23.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb2b1ecfef1e67897d336de3a0e3f52478182d6a47eda86cbd42504c5cbd009a", size = 4940416 }, + { url = "https://files.pythonhosted.org/packages/1d/e5/9fe0dd8c85fdc2f635e6660d07872a5dc4b366db566630161e39f9f804e1/zstandard-0.23.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:837bb6764be6919963ef41235fd56a6486b132ea64afe5fafb4cb279ac44f259", size = 5307693 }, + { url = 
"https://files.pythonhosted.org/packages/73/bf/fe62c0cd865c171ee8ed5bc83174b5382a2cb729c8d6162edfb99a83158b/zstandard-0.23.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1516c8c37d3a053b01c1c15b182f3b5f5eef19ced9b930b684a73bad121addf4", size = 5341236 }, + { url = "https://files.pythonhosted.org/packages/39/86/4fe79b30c794286110802a6cd44a73b6a314ac8196b9338c0fbd78c2407d/zstandard-0.23.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:48ef6a43b1846f6025dde6ed9fee0c24e1149c1c25f7fb0a0585572b2f3adc58", size = 5439101 }, + { url = "https://files.pythonhosted.org/packages/72/ed/cacec235c581ebf8c608c7fb3d4b6b70d1b490d0e5128ea6996f809ecaef/zstandard-0.23.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11e3bf3c924853a2d5835b24f03eeba7fc9b07d8ca499e247e06ff5676461a15", size = 4860320 }, + { url = "https://files.pythonhosted.org/packages/f6/1e/2c589a2930f93946b132fc852c574a19d5edc23fad2b9e566f431050c7ec/zstandard-0.23.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2fb4535137de7e244c230e24f9d1ec194f61721c86ebea04e1581d9d06ea1269", size = 4931933 }, + { url = "https://files.pythonhosted.org/packages/8e/f5/30eadde3686d902b5d4692bb5f286977cbc4adc082145eb3f49d834b2eae/zstandard-0.23.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8c24f21fa2af4bb9f2c492a86fe0c34e6d2c63812a839590edaf177b7398f700", size = 5463878 }, + { url = "https://files.pythonhosted.org/packages/e0/c8/8aed1f0ab9854ef48e5ad4431367fcb23ce73f0304f7b72335a8edc66556/zstandard-0.23.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:a8c86881813a78a6f4508ef9daf9d4995b8ac2d147dcb1a450448941398091c9", size = 4857192 }, + { url = "https://files.pythonhosted.org/packages/a8/c6/55e666cfbcd032b9e271865e8578fec56e5594d4faeac379d371526514f5/zstandard-0.23.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:fe3b385d996ee0822fd46528d9f0443b880d4d05528fd26a9119a54ec3f91c69", size = 4696513 }, + { url = 
"https://files.pythonhosted.org/packages/dc/bd/720b65bea63ec9de0ac7414c33b9baf271c8de8996e5ff324dc93fc90ff1/zstandard-0.23.0-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:82d17e94d735c99621bf8ebf9995f870a6b3e6d14543b99e201ae046dfe7de70", size = 5204823 }, + { url = "https://files.pythonhosted.org/packages/d8/40/d678db1556e3941d330cd4e95623a63ef235b18547da98fa184cbc028ecf/zstandard-0.23.0-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:c7c517d74bea1a6afd39aa612fa025e6b8011982a0897768a2f7c8ab4ebb78a2", size = 5666490 }, + { url = "https://files.pythonhosted.org/packages/ed/cc/c89329723d7515898a1fc7ef5d251264078548c505719d13e9511800a103/zstandard-0.23.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:1fd7e0f1cfb70eb2f95a19b472ee7ad6d9a0a992ec0ae53286870c104ca939e5", size = 5196622 }, + { url = "https://files.pythonhosted.org/packages/78/4c/634289d41e094327a94500dfc919e58841b10ea3a9efdfafbac614797ec2/zstandard-0.23.0-cp39-cp39-win32.whl", hash = "sha256:43da0f0092281bf501f9c5f6f3b4c975a8a0ea82de49ba3f7100e64d422a1274", size = 430620 }, + { url = "https://files.pythonhosted.org/packages/a2/e2/0b0c5a0f4f7699fecd92c1ba6278ef9b01f2b0b0dd46f62bfc6729c05659/zstandard-0.23.0-cp39-cp39-win_amd64.whl", hash = "sha256:f8346bfa098532bc1fb6c7ef06783e969d87a99dd1d2a5a18a892c1d7a643c58", size = 495528 }, +] diff --git a/hatch_build.py b/hatch_build.py index 3af7375e11b2d..971b71f49bee4 100644 --- a/hatch_build.py +++ b/hatch_build.py @@ -103,7 +103,10 @@ "python-ldap", ], "leveldb": [ - "plyvel", + # The plyvel package is a huge pain when installing on MacOS - especially when Apple releases new + # OS version. It's usually next to impossible to install it at least for a few months after the new + # MacOS version is released. We can skip it on MacOS as this is an optional feature anyway. 
+ "plyvel>=1.5.1; sys_platform != 'darwin'", ], "otel": [ "opentelemetry-exporter-prometheus", diff --git a/pyproject.toml b/pyproject.toml index f738f1e559cc7..c975d95c79f80 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -39,7 +39,7 @@ name = "apache-airflow" description = "Programmatically author, schedule and monitor data pipelines" readme = { file = "generated/PYPI_README.md", content-type = "text/markdown" } license-files.globs = ["LICENSE", "3rd-party-licenses/*.txt"] -requires-python = "~=3.8,<3.13" +requires-python = ">=3.8.1,<3.13" authors = [ { name = "Apache Software Foundation", email = "dev@airflow.apache.org" }, ] diff --git a/scripts/ci/pre_commit/common_precommit_utils.py b/scripts/ci/pre_commit/common_precommit_utils.py index 41bc3a5eeaf93..a70e5cfc848bd 100644 --- a/scripts/ci/pre_commit/common_precommit_utils.py +++ b/scripts/ci/pre_commit/common_precommit_utils.py @@ -118,8 +118,11 @@ def initialize_breeze_precommit(name: str, file: str): if shutil.which("breeze") is None: console.print( "[red]The `breeze` command is not on path.[/]\n\n" - "[yellow]Please install breeze with `pipx install -e ./dev/breeze` from Airflow sources " - "and make sure you run `pipx ensurepath`[/]\n\n" + "[yellow]Please install breeze.\n" + "You can use uv with `uv tool install -e ./dev/breeze or " + "`pipx install -e ./dev/breeze`.\n" + "It will install breeze from Airflow sources " + "(make sure you run `pipx ensurepath` if you use pipx)[/]\n\n" "[bright_blue]You can also set SKIP_BREEZE_PRE_COMMITS env variable to non-empty " "value to skip all breeze tests." 
) From 645eee967e10b8d7157d2552379e9725af28655f Mon Sep 17 00:00:00 2001 From: Jens Scheffler <95105677+jscheffl@users.noreply.github.com> Date: Mon, 4 Nov 2024 23:03:59 +0100 Subject: [PATCH 07/44] Fix venv numpy example which needs to be 1.26 at least to be working in Python 3.12 (#43659) --- airflow/example_dags/example_branch_operator_decorator.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/airflow/example_dags/example_branch_operator_decorator.py b/airflow/example_dags/example_branch_operator_decorator.py index 59cb3b2919475..66cea26f391e2 100644 --- a/airflow/example_dags/example_branch_operator_decorator.py +++ b/airflow/example_dags/example_branch_operator_decorator.py @@ -117,7 +117,7 @@ def some_ext_py_task(): # Run the example a second time and see that it re-uses it and is faster. VENV_CACHE_PATH = tempfile.gettempdir() - @task.branch_virtualenv(requirements=["numpy~=1.24.4"], venv_cache_path=VENV_CACHE_PATH) + @task.branch_virtualenv(requirements=["numpy~=1.26.0"], venv_cache_path=VENV_CACHE_PATH) def branching_virtualenv(choices) -> str: import random @@ -137,7 +137,7 @@ def branching_virtualenv(choices) -> str: for option in options: @task.virtualenv( - task_id=f"venv_{option}", requirements=["numpy~=1.24.4"], venv_cache_path=VENV_CACHE_PATH + task_id=f"venv_{option}", requirements=["numpy~=1.26.0"], venv_cache_path=VENV_CACHE_PATH ) def some_venv_task(): import numpy as np From 840ed6a9a886c6993ef9b27ab39063948df8d7df Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Tue, 5 Nov 2024 15:26:48 +0100 Subject: [PATCH 08/44] Fix reproducibility of prepared provider packages (fix flit frontend) (#43683) (#43687) After some checks it turned out that the reproducibility of produced packages depends not only on the build backend configured for the project but also on the build frontend used - because the frontend is the one that modifies metadata in prepared packages - including the build tool used, its version, and the metadata version supported
by the frontend. That's why, in order to maintain reproducibility for anyone who builds the packages, we have to pin not only the build backend in pyproject.toml (flit-core) but also the build frontend used (flit). Since package preparation is done with breeze, we can do it by pinning flit (and, just in case, also flit-core) so that anyone who builds a specific version of the package will use exactly the same flit as the person who built the original packages. This way we will avoid the reproducibility problems experienced with the 1.5.0 release of FAB. (cherry picked from commit 18ea01cef2b92fe820ceaa33be7b44f9f576aad4) --- dev/README_RELEASE_PROVIDER_PACKAGES.md | 1 - dev/breeze/README.md | 2 +- dev/breeze/doc/images/output_build-docs.svg | 28 +++++++++---------- dev/breeze/doc/images/output_build-docs.txt | 2 +- dev/breeze/doc/images/output_prod-image.svg | 2 +- dev/breeze/doc/images/output_prod-image.txt | 2 +- .../doc/images/output_prod-image_build.txt | 2 +- dev/breeze/doc/images/output_setup.svg | 2 +- dev/breeze/doc/images/output_setup.txt | 2 +- .../doc/images/output_setup_autocomplete.svg | 10 +++---- .../doc/images/output_setup_autocomplete.txt | 2 +- dev/breeze/doc/images/output_setup_config.txt | 2 +- .../doc/images/output_start-airflow.txt | 2 +- dev/breeze/pyproject.toml | 15 +++++++++- .../commands/release_candidate_command.py | 1 - .../commands/release_management_commands.py | 4 --- .../airflow_breeze/commands/setup_commands.py | 26 ++++++++++++++++- .../templates/pyproject_TEMPLATE.toml.jinja2 | 3 +- .../airflow_breeze/utils/python_versions.py | 8 +----- .../src/airflow_breeze/utils/reproducible.py | 3 -- dev/breeze/uv.lock | 4 ++- 21 files changed, 73 insertions(+), 50 deletions(-) diff --git a/dev/README_RELEASE_PROVIDER_PACKAGES.md b/dev/README_RELEASE_PROVIDER_PACKAGES.md index 749f89e106320..25aa8062c7722 100644 --- a/dev/README_RELEASE_PROVIDER_PACKAGES.md +++ b/dev/README_RELEASE_PROVIDER_PACKAGES.md @@ -335,7 +335,6 @@ export AIRFLOW_REPO_ROOT=$(pwd -P) rm
-rf ${AIRFLOW_REPO_ROOT}/dist/* ``` - * Release candidate packages: ```shell script diff --git a/dev/breeze/README.md b/dev/breeze/README.md index 2c38aa7c1a95e..713bf7ce83fd3 100644 --- a/dev/breeze/README.md +++ b/dev/breeze/README.md @@ -66,6 +66,6 @@ PLEASE DO NOT MODIFY THE HASH BELOW! IT IS AUTOMATICALLY UPDATED BY PRE-COMMIT. --------------------------------------------------------------------------------------------------------- -Package config hash: f8e8729f4236f050d4412cbbc9d53fdd4e6ddad65ce5fafd3c5b6fcdacbea5431eea760b961534a63fd5733b072b38e8167b5b0c12ee48b31c3257306ef11940 +Package config hash: d1d07397099e14c5fc5f0b0e13a87ac8e112bf66755f77cee62b29151cd18c2f2d35932906db6b3885af652defddce696ef9b2df58e21bd3a7749bca82baf910 --------------------------------------------------------------------------------------------------------- diff --git a/dev/breeze/doc/images/output_build-docs.svg b/dev/breeze/doc/images/output_build-docs.svg index e270c6b92e997..6fa9017144472 100644 --- a/dev/breeze/doc/images/output_build-docs.svg +++ b/dev/breeze/doc/images/output_build-docs.svg @@ -203,32 +203,32 @@ Build documents. ╭─ Doc flags ──────────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---docs-only-dOnly build documentation. ---spellcheck-only-sOnly run spell checking. ---clean-buildClean inventories of Inter-Sphinx documentation and generated APIs and sphinx     +--docs-only-dOnly build documentation. +--spellcheck-only-sOnly run spell checking. +--clean-buildClean inventories of Inter-Sphinx documentation and generated APIs and sphinx     artifacts before the build - useful for a clean build.                            ---one-pass-onlyBuilds documentation in one pass only. This is useful for debugging sphinx        +--one-pass-onlyBuilds documentation in one pass only. This is useful for debugging sphinx        errors.                                                                           
---package-filterFilter(s) to use more than one can be specified. You can use glob pattern         +--package-filterFilter(s) to use more than one can be specified. You can use glob pattern         matching the full package name, for example `apache-airflow-providers-*`. Useful  when you want to selectseveral similarly named packages together.                 (TEXT)                                                                            ---include-not-ready-providersWhether to include providers that are not yet ready to be released. ---include-removed-providersWhether to include providers that are removed. ---github-repository-gGitHub repository used to pull, push run images.(TEXT)[default: apache/airflow] ---builderBuildx builder used to perform `docker buildx build` commands.(TEXT) +--include-not-ready-providersWhether to include providers that are not yet ready to be released. +--include-removed-providersWhether to include providers that are removed. +--github-repository-gGitHub repository used to pull, push run images.(TEXT)[default: apache/airflow] +--builderBuildx builder used to perform `docker buildx build` commands.(TEXT) [default: autodetect]                                          ---package-listOptional, contains comma-separated list of package ids that are processed for     +--package-listOptional, contains comma-separated list of package ids that are processed for     documentation building, and document publishing. It is an easier alternative to   adding individual packages as arguments to every command. This overrides the      packages passed as arguments.                                                     
(TEXT)                                                                            ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---dry-run-DIf dry-run is set, commands are only printed, not executed. ---verbose-vPrint verbose information about performed steps. ---answer-aForce answer to questions.(y | n | q | yes | no | quit) ---help-hShow this message and exit. +--dry-run-DIf dry-run is set, commands are only printed, not executed. +--verbose-vPrint verbose information about performed steps. +--answer-aForce answer to questions.(y | n | q | yes | no | quit) +--help-hShow this message and exit. ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ diff --git a/dev/breeze/doc/images/output_build-docs.txt b/dev/breeze/doc/images/output_build-docs.txt index 760b6b3d09826..85554fb426c0a 100644 --- a/dev/breeze/doc/images/output_build-docs.txt +++ b/dev/breeze/doc/images/output_build-docs.txt @@ -1 +1 @@ -ac6594538890f8fba65c916aa8672aa1 +91166ce4114ea9c162c139d2aff15886 diff --git a/dev/breeze/doc/images/output_prod-image.svg b/dev/breeze/doc/images/output_prod-image.svg index 6b907c07a6b27..ef8e95626d14a 100644 --- a/dev/breeze/doc/images/output_prod-image.svg +++ b/dev/breeze/doc/images/output_prod-image.svg @@ -98,7 +98,7 @@ Tools that developers can use to manually manage PROD images ╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---help-hShow this message and exit. +--help-hShow this message and exit. 
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Production Image tools ─────────────────────────────────────────────────────────────────────────────────────────────╮ build  Build Production image. Include building multiple images for all or selected Python versions sequentially.  diff --git a/dev/breeze/doc/images/output_prod-image.txt b/dev/breeze/doc/images/output_prod-image.txt index 4e4ac97bd602d..c767ee09d4fd3 100644 --- a/dev/breeze/doc/images/output_prod-image.txt +++ b/dev/breeze/doc/images/output_prod-image.txt @@ -1 +1 @@ -55030fe0d7718eb668fa1a37128647b0 +d91bcc76b14f186e749efe2c6aaa8682 diff --git a/dev/breeze/doc/images/output_prod-image_build.txt b/dev/breeze/doc/images/output_prod-image_build.txt index e1e2a2c9c6c7f..1645f4d547baa 100644 --- a/dev/breeze/doc/images/output_prod-image_build.txt +++ b/dev/breeze/doc/images/output_prod-image_build.txt @@ -1 +1 @@ -88290b22adcd4e5cc9da29aaa8467992 +c243f4de16bc858f6202d88922f00109 diff --git a/dev/breeze/doc/images/output_setup.svg b/dev/breeze/doc/images/output_setup.svg index c747a1eea7f38..5dda408adefbc 100644 --- a/dev/breeze/doc/images/output_setup.svg +++ b/dev/breeze/doc/images/output_setup.svg @@ -110,7 +110,7 @@ Tools that developers can use to configure Breeze ╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---help-hShow this message and exit. +--help-hShow this message and exit. ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Setup ──────────────────────────────────────────────────────────────────────────────────────────────────────────────╮ autocomplete                   Enables autocompletion of breeze commands.                                          
diff --git a/dev/breeze/doc/images/output_setup.txt b/dev/breeze/doc/images/output_setup.txt index b8f9048b91f0b..274751197daaf 100644 --- a/dev/breeze/doc/images/output_setup.txt +++ b/dev/breeze/doc/images/output_setup.txt @@ -1 +1 @@ -d4a4f1b405f912fa234ff4116068290a +08c78d9dddd037a2ade6b751c5a22ff9 diff --git a/dev/breeze/doc/images/output_setup_autocomplete.svg b/dev/breeze/doc/images/output_setup_autocomplete.svg index e118e1fced9a8..31f7814001faa 100644 --- a/dev/breeze/doc/images/output_setup_autocomplete.svg +++ b/dev/breeze/doc/images/output_setup_autocomplete.svg @@ -102,13 +102,13 @@ Enables autocompletion of breeze commands. ╭─ Setup autocomplete flags ───────────────────────────────────────────────────────────────────────────────────────────╮ ---force-fForce autocomplete setup even if already setup before (overrides the setup). +--force-fForce autocomplete setup even if already setup before (overrides the setup). ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---verbose-vPrint verbose information about performed steps. ---dry-run-DIf dry-run is set, commands are only printed, not executed. ---answer-aForce answer to questions.(y | n | q | yes | no | quit) ---help-hShow this message and exit. +--verbose-vPrint verbose information about performed steps. +--dry-run-DIf dry-run is set, commands are only printed, not executed. +--answer-aForce answer to questions.(y | n | q | yes | no | quit) +--help-hShow this message and exit. 
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ diff --git a/dev/breeze/doc/images/output_setup_autocomplete.txt b/dev/breeze/doc/images/output_setup_autocomplete.txt index 185feef026464..144c2613cd695 100644 --- a/dev/breeze/doc/images/output_setup_autocomplete.txt +++ b/dev/breeze/doc/images/output_setup_autocomplete.txt @@ -1 +1 @@ -fffcd49e102e09ccd69b3841a9e3ea8e +ec3b4541a478afe5cb86a6f1c48f50f5 diff --git a/dev/breeze/doc/images/output_setup_config.txt b/dev/breeze/doc/images/output_setup_config.txt index 3b2da9a9c043c..695e4b5c871eb 100644 --- a/dev/breeze/doc/images/output_setup_config.txt +++ b/dev/breeze/doc/images/output_setup_config.txt @@ -1 +1 @@ -96e10564034b282769a2c48ebf7176e2 +e77da96b508cc4911857d6f1266802b5 diff --git a/dev/breeze/doc/images/output_start-airflow.txt b/dev/breeze/doc/images/output_start-airflow.txt index 31367c64bfa37..428a70cf0c0a2 100644 --- a/dev/breeze/doc/images/output_start-airflow.txt +++ b/dev/breeze/doc/images/output_start-airflow.txt @@ -1 +1 @@ -2fdb4b01e6d949fb40993e3cc416ca5c +834ca1bef0a55889bfccfeb41738a2f6 diff --git a/dev/breeze/pyproject.toml b/dev/breeze/pyproject.toml index 32b3e1fbe6e32..e7bdbb4db08c6 100644 --- a/dev/breeze/pyproject.toml +++ b/dev/breeze/pyproject.toml @@ -48,7 +48,20 @@ dependencies = [ "black>=23.11.0", "click>=8.1.7", "filelock>=3.13.0", - "flit>=3.5.0", + # + # We pin flit in order to make sure reproducibility of provider packages is maintained + # It turns out that when packages are prepared metadata version in the produced packages + # is taken from the front-end not from the backend, so in order to make sure that the + # packages are reproducible, we should pin both backend in "build-system" and frontend in + # "dependencies" of the environment that is used to build the packages. 
+ # + # TODO(potiuk): automate bumping the version of flit in breeze and sync it with + # the version in the template for provider packages with pre-commit also add instructions in + # the source packages explaining that reproducibility can only be achieved by using the same + # version of flit front-end to build the package + # + "flit==3.10.1", + "flit-core==3.10.1", "gitpython>=3.1.40", "hatch==1.9.4", # Importib_resources 6.2.0-6.3.1 break pytest_rewrite diff --git a/dev/breeze/src/airflow_breeze/commands/release_candidate_command.py b/dev/breeze/src/airflow_breeze/commands/release_candidate_command.py index 8c5c449ed7e86..697526c1af3f8 100644 --- a/dev/breeze/src/airflow_breeze/commands/release_candidate_command.py +++ b/dev/breeze/src/airflow_breeze/commands/release_candidate_command.py @@ -341,7 +341,6 @@ def remove_old_releases(version, repo_root): "--version", required=True, help="The release candidate version e.g. 2.4.3rc1", envvar="VERSION" ) def prepare_airflow_tarball(version: str): - check_python_version() from packaging.version import Version airflow_version = Version(version) diff --git a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py index ce9ec44e57960..9defbe7ef4d41 100644 --- a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py @@ -520,7 +520,6 @@ def prepare_airflow_packages( version_suffix_for_pypi: str, use_local_hatch: bool, ): - check_python_version() perform_environment_checks() fix_ownership_using_docker() cleanup_python_generated_files() @@ -3067,7 +3066,6 @@ def prepare_helm_chart_tarball( ) -> None: import yaml - check_python_version() chart_yaml_file_content = CHART_YAML_FILE.read_text() chart_yaml_dict = yaml.safe_load(chart_yaml_file_content) version_in_chart = chart_yaml_dict["version"] @@ -3209,8 +3207,6 @@ def prepare_helm_chart_tarball( 
@option_dry_run @option_verbose def prepare_helm_chart_package(sign_email: str): - check_python_version() - import yaml from airflow_breeze.utils.kubernetes_utils import ( diff --git a/dev/breeze/src/airflow_breeze/commands/setup_commands.py b/dev/breeze/src/airflow_breeze/commands/setup_commands.py index bc1ac4f1fa56b..f0d1e4eac7c94 100644 --- a/dev/breeze/src/airflow_breeze/commands/setup_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/setup_commands.py @@ -22,6 +22,7 @@ import shutil import subprocess import sys +import textwrap from copy import copy from pathlib import Path from typing import Any @@ -274,8 +275,31 @@ def get_status(file: str): get_console().print() -def dict_hash(dictionary: dict[str, Any]) -> str: +def dedent_help(dictionary: dict[str, Any]) -> None: + """ + Dedent help stored in the dictionary. + + Python 3.13 automatically dedents docstrings retrieved from functions. + See https://github.com/python/cpython/issues/81283 + + However, click uses docstrings in the absence of help strings, and we are using click + command definition dictionary hash to detect changes in the command definitions, so if the + help strings are not dedented, the hash will change. + + That's why we must de-dent all the help strings in the command definition dictionary + before we hash it. + """ + for key, value in dictionary.items(): + if isinstance(value, dict): + dedent_help(value) + elif key == "help" and isinstance(value, str): + dictionary[key] = textwrap.dedent(value) + + +def dict_hash(dictionary: dict[str, Any], dedent_help_strings: bool = True) -> str: """MD5 hash of a dictionary. 
Sorted and dumped via json to account for random sequence)""" + if dedent_help_strings: + dedent_help(dictionary) # noinspection InsecureHash dhash = hashlib.md5() try: diff --git a/dev/breeze/src/airflow_breeze/templates/pyproject_TEMPLATE.toml.jinja2 b/dev/breeze/src/airflow_breeze/templates/pyproject_TEMPLATE.toml.jinja2 index 389d2ce62e578..a375ffedc63ef 100644 --- a/dev/breeze/src/airflow_breeze/templates/pyproject_TEMPLATE.toml.jinja2 +++ b/dev/breeze/src/airflow_breeze/templates/pyproject_TEMPLATE.toml.jinja2 @@ -39,9 +39,8 @@ # IF YOU WANT TO MODIFY THIS FILE, YOU SHOULD MODIFY THE TEMPLATE # `pyproject_TEMPLATE.toml.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY -# [build-system] -requires = ["flit_core >=3.2,<4"] +requires = ["flit_core==3.10.1"] build-backend = "flit_core.buildapi" [project] diff --git a/dev/breeze/src/airflow_breeze/utils/python_versions.py b/dev/breeze/src/airflow_breeze/utils/python_versions.py index b8807e66bf87c..d84c4f932ba8d 100644 --- a/dev/breeze/src/airflow_breeze/utils/python_versions.py +++ b/dev/breeze/src/airflow_breeze/utils/python_versions.py @@ -46,16 +46,10 @@ def get_python_version_list(python_versions: str) -> list[str]: def check_python_version(): - error = False if not sys.version_info >= (3, 9): get_console().print("[error]At least Python 3.9 is required to prepare reproducible archives.\n") - error = True - elif not sys.version_info < (3, 12): - get_console().print("[error]Python 3.12 is not supported.\n") - error = True - if error: get_console().print( - "[warning]Please reinstall Breeze using Python 3.9 - 3.11 environment.[/]\n\n" + "[warning]Please reinstall Breeze using Python 3.9 - 3.12 environment.[/]\n\n" "If you are using uv:\n\n" " uv tool install --force --reinstall --python 3.9 -e ./dev/breeze\n\n" "If you are using pipx:\n\n" diff --git a/dev/breeze/src/airflow_breeze/utils/reproducible.py b/dev/breeze/src/airflow_breeze/utils/reproducible.py index 1429333d64152..cf4005d9ddd10 
100644 --- a/dev/breeze/src/airflow_breeze/utils/reproducible.py +++ b/dev/breeze/src/airflow_breeze/utils/reproducible.py @@ -43,7 +43,6 @@ from subprocess import CalledProcessError, CompletedProcess from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT, OUT_DIR, REPRODUCIBLE_DIR -from airflow_breeze.utils.python_versions import check_python_version from airflow_breeze.utils.run_utils import run_command @@ -91,7 +90,6 @@ def reset(tarinfo): tarinfo.mtime = timestamp return tarinfo - check_python_version() OUT_DIR.mkdir(exist_ok=True) shutil.rmtree(REPRODUCIBLE_DIR, ignore_errors=True) REPRODUCIBLE_DIR.mkdir(exist_ok=True) @@ -149,7 +147,6 @@ def reset(tarinfo): def main(): - check_python_version() parser = ArgumentParser() parser.add_argument("-a", "--archive", help="archive to repack") parser.add_argument("-o", "--out", help="archive destination") diff --git a/dev/breeze/uv.lock b/dev/breeze/uv.lock index 666cb37805254..a5a252063646c 100644 --- a/dev/breeze/uv.lock +++ b/dev/breeze/uv.lock @@ -25,6 +25,7 @@ dependencies = [ { name = "click" }, { name = "filelock" }, { name = "flit" }, + { name = "flit-core" }, { name = "gitpython" }, { name = "hatch" }, { name = "importlib-resources", marker = "python_full_version < '3.9'" }, @@ -53,7 +54,8 @@ requires-dist = [ { name = "black", specifier = ">=23.11.0" }, { name = "click", specifier = ">=8.1.7" }, { name = "filelock", specifier = ">=3.13.0" }, - { name = "flit", specifier = ">=3.5.0" }, + { name = "flit", specifier = "==3.10.1" }, + { name = "flit-core", specifier = "==3.10.1" }, { name = "gitpython", specifier = ">=3.1.40" }, { name = "hatch", specifier = "==1.9.4" }, { name = "importlib-resources", marker = "python_full_version < '3.9'", specifier = ">=5.2,!=6.2.0,!=6.3.0,!=6.3.1" }, From ae03ab4081c6b58df2e2ff3dc2a4d922cda3babe Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Tue, 5 Nov 2024 15:45:50 +0100 Subject: [PATCH 09/44] Detect situation where Breeze is installed with both pipx and uv 
(#43694) (#43695) When breeze is installed with both pipx and uv, we do not know which version is available first on the path, and self-upgrading breeze might not upgrade the one that is first. Therefore we detect that situation and fail the self-upgrade with instructions on what to do (recommending keeping uv, as it is faster) (cherry picked from commit ccd65867387117cd4503715195a877a1ac2892a2) --- .../src/airflow_breeze/utils/reinstall.py | 38 +++++++++++++++---- 1 file changed, 31 insertions(+), 7 deletions(-) diff --git a/dev/breeze/src/airflow_breeze/utils/reinstall.py b/dev/breeze/src/airflow_breeze/utils/reinstall.py index 6165c8a307201..6fdf994c6e91c 100644 --- a/dev/breeze/src/airflow_breeze/utils/reinstall.py +++ b/dev/breeze/src/airflow_breeze/utils/reinstall.py @@ -37,14 +37,38 @@ def reinstall_breeze(breeze_sources: Path, re_run: bool = True): # Breeze from different sources than originally installed (i.e. when we reinstall airflow # From the current directory. get_console().print(f"\n[info]Reinstalling Breeze from {breeze_sources}\n") - result = subprocess.run(["uv", "tool", "list"], text=True, capture_output=True, check=False) - if result.returncode == 0: - if "apache-airflow-breeze" in result.stdout: - subprocess.check_call( - ["uv", "tool", "install", "--force", "--reinstall", "-e", breeze_sources.as_posix()] - ) - else: + breeze_installed_with_uv = False + breeze_installed_with_pipx = False + result_uv = subprocess.run(["uv", "tool", "list"], text=True, capture_output=True, check=False) + if result_uv.returncode == 0: + if "apache-airflow-breeze" in result_uv.stdout: + breeze_installed_with_uv = True + result_pipx = subprocess.run(["pipx", "list"], text=True, capture_output=True, check=False) + if result_pipx.returncode == 0: + if "apache-airflow-breeze" in result_pipx.stdout: + breeze_installed_with_pipx = True + if breeze_installed_with_uv and breeze_installed_with_pipx: + get_console().print( + "[error]Breeze is installed both with `uv` and
`pipx`. This is not supported.[/]\n" + ) + get_console().print( + "[info]Please uninstall Breeze and install it only with one of the methods[/]\n" + "[info]The `uv` installation method is recommended as it is much faster[/]\n" + ) + get_console().print( + "To uninstall Breeze installed with pipx run:\n pipx uninstall apache-airflow-breeze\n" + ) + get_console().print( + "To uninstall Breeze installed with uv run:\n uv tool uninstall apache-airflow-breeze\n" + ) + sys.exit(1) + elif breeze_installed_with_uv: + subprocess.check_call( + ["uv", "tool", "install", "--force", "--reinstall", "-e", breeze_sources.as_posix()] + ) + elif breeze_installed_with_pipx: subprocess.check_call(["pipx", "install", "-e", breeze_sources.as_posix(), "--force"]) + if re_run: # Make sure we don't loop forever if the metadata hash hasn't been updated yet (else it is tricky to # run pre-commit checks via breeze!) From be22e95ed359905642353ccf51ba2b07869346ec Mon Sep 17 00:00:00 2001 From: Pierre Jeambrun Date: Tue, 5 Nov 2024 23:28:16 +0800 Subject: [PATCH 10/44] Disable XCom list ordering by execution_date (#43680) (#43696) * Disable XCom list ordering by execution_date * Update airflow/www/views.py Co-authored-by: Kaxil Naik --------- Co-authored-by: Kaxil Naik (cherry picked from commit c96b618b60ed049658470a9696479c0df36957af) --- airflow/www/views.py | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/airflow/www/views.py b/airflow/www/views.py index 5e8ef6bb7f08c..bb88da2cdfa9c 100644 --- a/airflow/www/views.py +++ b/airflow/www/views.py @@ -4024,6 +4024,17 @@ class XComModelView(AirflowModelView): list_columns = ["key", "value", "timestamp", "dag_id", "task_id", "run_id", "map_index", "execution_date"] base_order = ("dag_run_id", "desc") + order_columns = [ + "key", + "value", + "timestamp", + "dag_id", + "task_id", + "run_id", + "map_index", + # "execution_date", # execution_date sorting is not working and crashing the UI, disabled for now. 
+ ] + base_filters = [["dag_id", DagFilter, list]] formatters_columns = { From 037fa9c30555e7f1522fb4e14c10b5743c4c77fe Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Tue, 5 Nov 2024 19:17:15 +0100 Subject: [PATCH 11/44] Handle FileNotFound Error returned by missing uv or pipx (#43714) (#43715) Subprocess.run raises FileNotFound when uv or pipx are not installed at all. This PR will handle it. (cherry picked from commit ed3accb30086b9ed5eddcd12b17a5f7c8d52d53b) --- .../src/airflow_breeze/utils/reinstall.py | 22 ++++++++++++------- 1 file changed, 14 insertions(+), 8 deletions(-) diff --git a/dev/breeze/src/airflow_breeze/utils/reinstall.py b/dev/breeze/src/airflow_breeze/utils/reinstall.py index 6fdf994c6e91c..4030235e68e81 100644 --- a/dev/breeze/src/airflow_breeze/utils/reinstall.py +++ b/dev/breeze/src/airflow_breeze/utils/reinstall.py @@ -39,14 +39,20 @@ def reinstall_breeze(breeze_sources: Path, re_run: bool = True): get_console().print(f"\n[info]Reinstalling Breeze from {breeze_sources}\n") breeze_installed_with_uv = False breeze_installed_with_pipx = False - result_uv = subprocess.run(["uv", "tool", "list"], text=True, capture_output=True, check=False) - if result_uv.returncode == 0: - if "apache-airflow-breeze" in result_uv.stdout: - breeze_installed_with_uv = True - result_pipx = subprocess.run(["pipx", "list"], text=True, capture_output=True, check=False) - if result_pipx.returncode == 0: - if "apache-airflow-breeze" in result_pipx.stdout: - breeze_installed_with_pipx = True + try: + result_uv = subprocess.run(["uv", "tool", "list"], text=True, capture_output=True, check=False) + if result_uv.returncode == 0: + if "apache-airflow-breeze" in result_uv.stdout: + breeze_installed_with_uv = True + except FileNotFoundError: + pass + try: + result_pipx = subprocess.run(["pipx", "list"], text=True, capture_output=True, check=False) + if result_pipx.returncode == 0: + if "apache-airflow-breeze" in result_pipx.stdout: + breeze_installed_with_pipx = True + 
except FileNotFoundError: + pass if breeze_installed_with_uv and breeze_installed_with_pipx: get_console().print( "[error]Breeze is installed both with `uv` and `pipx`. This is not supported.[/]\n" From b7ae6565d30ef595861aadbf3a930f7136aa5797 Mon Sep 17 00:00:00 2001 From: Elad Kalif <45845474+eladkal@users.noreply.github.com> Date: Wed, 6 Nov 2024 14:55:16 +0200 Subject: [PATCH 12/44] Remove note about MySQL 5 (#43729) --- README.md | 4 +--- generated/PYPI_README.md | 4 +--- 2 files changed, 2 insertions(+), 6 deletions(-) diff --git a/README.md b/README.md index 754ed9341739d..2d3145ff6c706 100644 --- a/README.md +++ b/README.md @@ -108,9 +108,7 @@ Apache Airflow is tested with: \* Experimental -**Note**: MySQL 5.x versions are unable to or have limitations with -running multiple schedulers -- please see the [Scheduler docs](https://airflow.apache.org/docs/apache-airflow/stable/administration-and-deployment/scheduler.html). -MariaDB is not tested/recommended. +**Note**: MariaDB is not tested/recommended. **Note**: SQLite is used in Airflow tests. Do not use it in production. We recommend using the latest stable version of SQLite for local development. diff --git a/generated/PYPI_README.md b/generated/PYPI_README.md index 5e43edded8fd5..a9f9ff42f0abb 100644 --- a/generated/PYPI_README.md +++ b/generated/PYPI_README.md @@ -65,9 +65,7 @@ Apache Airflow is tested with: \* Experimental -**Note**: MySQL 5.x versions are unable to or have limitations with -running multiple schedulers -- please see the [Scheduler docs](https://airflow.apache.org/docs/apache-airflow/stable/administration-and-deployment/scheduler.html). -MariaDB is not tested/recommended. +**Note**: MariaDB is not tested/recommended. **Note**: SQLite is used in Airflow tests. Do not use it in production. We recommend using the latest stable version of SQLite for local development. 
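The uv/pipx detection pattern used by the breeze reinstall patches above (probe each installer's listing command with subprocess.run, then treat both a non-zero exit code and a missing executable, which raises FileNotFoundError, as "not installed this way") can be sketched standalone. The `installed_via` helper below is illustrative only and is not part of the Airflow sources:

```python
import subprocess


def installed_via(tool: str, list_args: list[str], package: str) -> bool:
    """Return True if *package* appears in the installer tool's own listing.

    Illustrative helper, not part of the Airflow sources. A non-zero exit
    code or a missing executable (subprocess.run raises FileNotFoundError
    when the binary is not on PATH) both mean "not installed this way"
    rather than a hard failure, mirroring the patches above.
    """
    try:
        result = subprocess.run(
            [tool, *list_args], text=True, capture_output=True, check=False
        )
    except FileNotFoundError:
        # The installer itself is not on PATH at all.
        return False
    return result.returncode == 0 and package in result.stdout


with_uv = installed_via("uv", ["tool", "list"], "apache-airflow-breeze")
with_pipx = installed_via("pipx", ["list"], "apache-airflow-breeze")
if with_uv and with_pipx:
    # Mirrors the patch: the ambiguous double-install case is refused.
    print("Breeze is installed with both uv and pipx; uninstall one of them.")
```

Because detection failures degrade to False instead of raising, the caller can cleanly distinguish the three cases (uv only, pipx only, or both) and refuse only the ambiguous one before choosing which reinstall command to run, which is exactly the structure of the patched `reinstall_breeze`.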
From 7e1989ab221985540dd48543158e1cb92ffad873 Mon Sep 17 00:00:00 2001 From: Jens Scheffler <95105677+jscheffl@users.noreply.github.com> Date: Sat, 9 Nov 2024 19:54:03 +0100 Subject: [PATCH 13/44] #43252 Disable extra links button if link is null or empty (#43844) (#43851) * Disable button if link is null or empty * Fix space --------- Co-authored-by: Enis Nazif (cherry picked from commit de8818270095d9f05edfec8e03daa5df7d6a773b) Co-authored-by: enisnazif --- airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx b/airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx index 06528eab6e7a1..1eb59cb9b1a65 100644 --- a/airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx +++ b/airflow/www/static/js/dag/details/taskInstance/ExtraLinks.tsx @@ -55,7 +55,7 @@ const ExtraLinks = ({ const isSanitised = (url: string | null) => { if (!url) { - return true; + return false; // Empty or null urls should cause the link to be disabled } const urlRegex = /^(https?:)/i; return urlRegex.test(url); From 8521bf36a7ec156346272982874f245eae667b5a Mon Sep 17 00:00:00 2001 From: Jens Scheffler <95105677+jscheffl@users.noreply.github.com> Date: Tue, 12 Nov 2024 01:46:04 +0100 Subject: [PATCH 14/44] Correct mime-type (#43879) (#43901) (cherry picked from commit 45cbad79bd5f8797b26c0fde83299eaf60e5fd0d) Co-authored-by: xitep --- airflow/api_connexion/openapi/v1.yaml | 2 +- airflow/www/static/js/types/api-generated.ts | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml index 74ef2121c3780..6edfce9475946 100644 --- a/airflow/api_connexion/openapi/v1.yaml +++ b/airflow/api_connexion/openapi/v1.yaml @@ -2274,7 +2274,7 @@ paths: properties: content: type: string - plain/text: + text/plain: schema: type: string diff --git 
a/airflow/www/static/js/types/api-generated.ts b/airflow/www/static/js/types/api-generated.ts index fac52cf954732..3f4935fa6cede 100644 --- a/airflow/www/static/js/types/api-generated.ts +++ b/airflow/www/static/js/types/api-generated.ts @@ -4886,7 +4886,7 @@ export interface operations { "application/json": { content?: string; }; - "plain/text": string; + "text/plain": string; }; }; 401: components["responses"]["Unauthenticated"]; From 02069884b5da8e5019e538ce6cf99d0c0d75f3ad Mon Sep 17 00:00:00 2001 From: Tzu-ping Chung Date: Tue, 12 Nov 2024 17:57:26 +0800 Subject: [PATCH 15/44] Tweak strict_dataset_uri_validation documentation wording (#43918) --- airflow/config_templates/config.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml index 00d7cacc7f47a..613c5e3394a40 100644 --- a/airflow/config_templates/config.yml +++ b/airflow/config_templates/config.yml @@ -494,7 +494,7 @@ core: description: | Dataset URI validation should raise an exception if it is not compliant with AIP-60. By default this configuration is false, meaning that Airflow 2.x only warns the user. - In Airflow 3, this configuration will be enabled by default. + In Airflow 3, this configuration will be removed, unconditionally enabling strict validation. default: "False" example: ~ version_added: 2.9.2 From 26ac832bfa3578365dd65f319801bebb174f3bb4 Mon Sep 17 00:00:00 2001 From: Ephraim Anierobi Date: Wed, 13 Nov 2024 09:48:24 +0100 Subject: [PATCH 16/44] Fix duplication of Task tries in the UI (#43891) (#43950) It was observed that the TI tries endpoint sometimes returns duplicate TaskInstances. This happens when the TI is in the up_for_retry state: at that point the previous try has already been recorded in TI history and the TI try_number has not yet incremented, so we must exclude this recorded TI from the taskinstance tries endpoint. 
We know the TI because its state is in up_for_retry, so we filter TIs with up_for_retry state when querying for the task instance tries. Closes: #41765 (cherry picked from commit 4bc1257df4bf1f7391ad8bca3b10d294b2d92e7a) --- .../endpoints/task_instance_endpoint.py | 7 ++++++- .../endpoints/test_task_instance_endpoint.py | 17 +++++++++++++++++ 2 files changed, 23 insertions(+), 1 deletion(-) diff --git a/airflow/api_connexion/endpoints/task_instance_endpoint.py b/airflow/api_connexion/endpoints/task_instance_endpoint.py index a79af61f69bed..2eb63260e348e 100644 --- a/airflow/api_connexion/endpoints/task_instance_endpoint.py +++ b/airflow/api_connexion/endpoints/task_instance_endpoint.py @@ -840,7 +840,12 @@ def _query(orm_object): ) return query - task_instances = session.scalars(_query(TIH)).all() + session.scalars(_query(TI)).all() + # Exclude TaskInstance with state UP_FOR_RETRY since they have been recorded in TaskInstanceHistory + tis = session.scalars( + _query(TI).where(or_(TI.state != TaskInstanceState.UP_FOR_RETRY, TI.state.is_(None))) + ).all() + + task_instances = session.scalars(_query(TIH)).all() + tis return task_instance_history_collection_schema.dump( TaskInstanceHistoryCollection(task_instances=task_instances, total_entries=len(task_instances)) ) diff --git a/tests/api_connexion/endpoints/test_task_instance_endpoint.py b/tests/api_connexion/endpoints/test_task_instance_endpoint.py index 330b69c386739..bf81584caf7c8 100644 --- a/tests/api_connexion/endpoints/test_task_instance_endpoint.py +++ b/tests/api_connexion/endpoints/test_task_instance_endpoint.py @@ -3013,6 +3013,23 @@ def test_should_respond_200(self, session): assert response.json["total_entries"] == 2 # The task instance and its history assert len(response.json["task_instances"]) == 2 + def test_ti_in_retry_state_not_returned(self, session): + self.create_task_instances( + session=session, task_instances=[{"state": State.SUCCESS}], with_ti_history=True + ) + ti = 
session.query(TaskInstance).one() + ti.state = State.UP_FOR_RETRY + session.merge(ti) + session.commit() + + response = self.client.get( + "/api/v1/dags/example_python_operator/dagRuns/TEST_DAG_RUN_ID/taskInstances/print_the_context/tries", + environ_overrides={"REMOTE_USER": "test"}, + ) + assert response.status_code == 200 + assert response.json["total_entries"] == 1 + assert len(response.json["task_instances"]) == 1 + def test_mapped_task_should_respond_200(self, session): tis = self.create_task_instances(session, task_instances=[{"state": State.FAILED}]) old_ti = tis[0] From 16f6796d7b52c67dfd6117496c3ad6bace5fc61a Mon Sep 17 00:00:00 2001 From: Wei Lee Date: Thu, 14 Nov 2024 08:36:49 +0800 Subject: [PATCH 17/44] feat(dataset): raise deprecation warning when accessing inlet or outlet events through str (#43922) this behavior will be removed in airflow 3 as assets have attributes name and uri, it would be confusing to identify which attribute should be used to filter the right asset --- airflow/example_dags/example_dataset_alias.py | 2 +- .../example_dataset_alias_with_no_taskflow.py | 4 ++- .../example_dags/example_inlet_event_extra.py | 2 +- airflow/utils/context.py | 27 +++++++++++++++++++ .../authoring-and-scheduling/datasets.rst | 14 +++++----- tests/models/test_taskinstance.py | 18 ++++++------- 6 files changed, 49 insertions(+), 18 deletions(-) diff --git a/airflow/example_dags/example_dataset_alias.py b/airflow/example_dags/example_dataset_alias.py index c50a89e34fb8c..4bfc6f51a7351 100644 --- a/airflow/example_dags/example_dataset_alias.py +++ b/airflow/example_dags/example_dataset_alias.py @@ -67,7 +67,7 @@ def produce_dataset_events(): def produce_dataset_events_through_dataset_alias(*, outlet_events=None): bucket_name = "bucket" object_path = "my-task" - outlet_events["example-alias"].add(Dataset(f"s3://{bucket_name}/{object_path}")) + outlet_events[DatasetAlias("example-alias")].add(Dataset(f"s3://{bucket_name}/{object_path}")) 
produce_dataset_events_through_dataset_alias() diff --git a/airflow/example_dags/example_dataset_alias_with_no_taskflow.py b/airflow/example_dags/example_dataset_alias_with_no_taskflow.py index 7d7227af39f50..72863618e3949 100644 --- a/airflow/example_dags/example_dataset_alias_with_no_taskflow.py +++ b/airflow/example_dags/example_dataset_alias_with_no_taskflow.py @@ -68,7 +68,9 @@ def produce_dataset_events(): def produce_dataset_events_through_dataset_alias_with_no_taskflow(*, outlet_events=None): bucket_name = "bucket" object_path = "my-task" - outlet_events["example-alias-no-taskflow"].add(Dataset(f"s3://{bucket_name}/{object_path}")) + outlet_events[DatasetAlias("example-alias-no-taskflow")].add( + Dataset(f"s3://{bucket_name}/{object_path}") + ) PythonOperator( task_id="produce_dataset_events_through_dataset_alias_with_no_taskflow", diff --git a/airflow/example_dags/example_inlet_event_extra.py b/airflow/example_dags/example_inlet_event_extra.py index 4b7567fc2f87e..b07faf2bdfe0b 100644 --- a/airflow/example_dags/example_inlet_event_extra.py +++ b/airflow/example_dags/example_inlet_event_extra.py @@ -57,5 +57,5 @@ def read_dataset_event(*, inlet_events=None): BashOperator( task_id="read_dataset_event_from_classic", inlets=[ds], - bash_command="echo '{{ inlet_events['s3://output/1.txt'][-1].extra | tojson }}'", + bash_command="echo '{{ inlet_events[Dataset('s3://output/1.txt')][-1].extra | tojson }}'", ) diff --git a/airflow/utils/context.py b/airflow/utils/context.py index a72885401f7b2..9dddcc3f16cd8 100644 --- a/airflow/utils/context.py +++ b/airflow/utils/context.py @@ -177,6 +177,14 @@ class OutletEventAccessor: def add(self, dataset: Dataset | str, extra: dict[str, Any] | None = None) -> None: """Add a DatasetEvent to an existing Dataset.""" if isinstance(dataset, str): + warnings.warn( + ( + "Emitting dataset events using string is deprecated and will be removed in Airflow 3. 
" + "Please use the Dataset object (renamed as Asset in Airflow 3) directly" + ), + DeprecationWarning, + stacklevel=2, + ) dataset_uri = dataset elif isinstance(dataset, Dataset): dataset_uri = dataset.uri @@ -216,6 +224,16 @@ def __len__(self) -> int: return len(self._dict) def __getitem__(self, key: str | Dataset | DatasetAlias) -> OutletEventAccessor: + if isinstance(key, str): + warnings.warn( + ( + "Accessing outlet_events using string is deprecated and will be removed in Airflow 3. " + "Please use the Dataset or DatasetAlias object (renamed as Asset and AssetAlias in Airflow 3) directly" + ), + DeprecationWarning, + stacklevel=2, + ) + event_key = extract_event_key(key) if event_key not in self._dict: self._dict[event_key] = OutletEventAccessor(extra={}, raw_key=key) @@ -282,6 +300,15 @@ def __getitem__(self, key: int | str | Dataset | DatasetAlias) -> LazyDatasetEve join_clause = DatasetEvent.source_aliases where_clause = DatasetAliasModel.name == dataset_alias.name elif isinstance(obj, (Dataset, str)): + if isinstance(obj, str): + warnings.warn( + ( + "Accessing inlet_events using string is deprecated and will be removed in Airflow 3. 
" + "Please use the Dataset object (renamed as Asset in Airflow 3) directly" + ), + DeprecationWarning, + stacklevel=2, + ) dataset = self._datasets[extract_event_key(obj)] join_clause = DatasetEvent.dataset where_clause = DatasetModel.uri == dataset.uri diff --git a/docs/apache-airflow/authoring-and-scheduling/datasets.rst b/docs/apache-airflow/authoring-and-scheduling/datasets.rst index a69c09bc13b0f..c5d117ab5a5a1 100644 --- a/docs/apache-airflow/authoring-and-scheduling/datasets.rst +++ b/docs/apache-airflow/authoring-and-scheduling/datasets.rst @@ -432,7 +432,7 @@ The following example creates a dataset event against the S3 URI ``f"s3://bucket @task(outlets=[DatasetAlias("my-task-outputs")]) def my_task_with_outlet_events(*, outlet_events): - outlet_events["my-task-outputs"].add(Dataset("s3://bucket/my-task"), extra={"k": "v"}) + outlet_events[DatasetAlias("my-task-outputs")].add(Dataset("s3://bucket/my-task"), extra={"k": "v"}) **Emit a dataset event during task execution through yielding Metadata** @@ -462,11 +462,11 @@ Only one dataset event is emitted for an added dataset, even if it is added to t ] ) def my_task_with_outlet_events(*, outlet_events): - outlet_events["my-task-outputs-1"].add(Dataset("s3://bucket/my-task"), extra={"k": "v"}) + outlet_events[DatasetAlias("my-task-outputs-1")].add(Dataset("s3://bucket/my-task"), extra={"k": "v"}) # This line won't emit an additional dataset event as the dataset and extra are the same as the previous line. - outlet_events["my-task-outputs-2"].add(Dataset("s3://bucket/my-task"), extra={"k": "v"}) + outlet_events[DatasetAlias("my-task-outputs-2")].add(Dataset("s3://bucket/my-task"), extra={"k": "v"}) # This line will emit an additional dataset event as the extra is different. 
- outlet_events["my-task-outputs-3"].add(Dataset("s3://bucket/my-task"), extra={"k2": "v2"}) + outlet_events[DatasetAlias("my-task-outputs-3")].add(Dataset("s3://bucket/my-task"), extra={"k2": "v2"}) Scheduling based on dataset aliases ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -487,7 +487,7 @@ The dataset alias is resolved to the datasets during DAG parsing. Thus, if the " @task(outlets=[DatasetAlias("example-alias")]) def produce_dataset_events(*, outlet_events): - outlet_events["example-alias"].add(Dataset("s3://bucket/my-task")) + outlet_events[DatasetAlias("example-alias")].add(Dataset("s3://bucket/my-task")) with DAG(dag_id="dataset-consumer", schedule=Dataset("s3://bucket/my-task")): @@ -511,7 +511,9 @@ As mentioned in :ref:`Fetching information from previously emitted dataset event @task(outlets=[DatasetAlias("example-alias")]) def produce_dataset_events(*, outlet_events): - outlet_events["example-alias"].add(Dataset("s3://bucket/my-task"), extra={"row_count": 1}) + outlet_events[DatasetAlias("example-alias")].add( + Dataset("s3://bucket/my-task"), extra={"row_count": 1} + ) with DAG(dag_id="dataset-alias-consumer", schedule=None): diff --git a/tests/models/test_taskinstance.py b/tests/models/test_taskinstance.py index 468dc2c9300ca..b5dcb43de262d 100644 --- a/tests/models/test_taskinstance.py +++ b/tests/models/test_taskinstance.py @@ -2411,7 +2411,7 @@ def test_outlet_dataset_extra(self, dag_maker, session): @task(outlets=Dataset("test_outlet_dataset_extra_1")) def write1(*, outlet_events): - outlet_events["test_outlet_dataset_extra_1"].extra = {"foo": "bar"} + outlet_events[Dataset("test_outlet_dataset_extra_1")].extra = {"foo": "bar"} write1() @@ -2453,8 +2453,8 @@ def test_outlet_dataset_extra_ignore_different(self, dag_maker, session): @task(outlets=Dataset("test_outlet_dataset_extra")) def write(*, outlet_events): - outlet_events["test_outlet_dataset_extra"].extra = {"one": 1} - outlet_events["different_uri"].extra = {"foo": "bar"} # Will be silently 
dropped. + outlet_events[Dataset("test_outlet_dataset_extra")].extra = {"one": 1} + outlet_events[Dataset("different_uri")].extra = {"foo": "bar"} # Will be silently dropped. write() @@ -2722,22 +2722,22 @@ def test_inlet_dataset_extra(self, dag_maker, session): @task(outlets=Dataset("test_inlet_dataset_extra")) def write(*, ti, outlet_events): - outlet_events["test_inlet_dataset_extra"].extra = {"from": ti.task_id} + outlet_events[Dataset("test_inlet_dataset_extra")].extra = {"from": ti.task_id} @task(inlets=Dataset("test_inlet_dataset_extra")) def read(*, inlet_events): - second_event = inlet_events["test_inlet_dataset_extra"][1] + second_event = inlet_events[Dataset("test_inlet_dataset_extra")][1] assert second_event.uri == "test_inlet_dataset_extra" assert second_event.extra == {"from": "write2"} - last_event = inlet_events["test_inlet_dataset_extra"][-1] + last_event = inlet_events[Dataset("test_inlet_dataset_extra")][-1] assert last_event.uri == "test_inlet_dataset_extra" assert last_event.extra == {"from": "write3"} with pytest.raises(KeyError): - inlet_events["does_not_exist"] + inlet_events[Dataset("does_not_exist")] with pytest.raises(IndexError): - inlet_events["test_inlet_dataset_extra"][5] + inlet_events[Dataset("test_inlet_dataset_extra")][5] # TODO: Support slices. 
@@ -2798,7 +2798,7 @@ def read(*, inlet_events): assert last_event.extra == {"from": "write3"} with pytest.raises(KeyError): - inlet_events["does_not_exist"] + inlet_events[Dataset("does_not_exist")] with pytest.raises(KeyError): inlet_events[DatasetAlias("does_not_exist")] with pytest.raises(IndexError): From 8e842fac9a351ebf62b4104a651739f916df27e1 Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Sun, 17 Nov 2024 00:57:33 +0000 Subject: [PATCH 18/44] [v2-10-test] Add .dockerignore to target workflow override (#43885) (#44103) There is an extra layer of protection ensuring that code provided by a PR is not executed in the context of pull_request_target: the code only runs inside a docker container. However, the container is built from local sources, so it could contain other code. We prevent that via .dockerignore, but the .dockerignore itself should not be overridable from the incoming PR. (cherry picked from commit 5d6b836c61235765bfdf7ce65f58231e948b0881) --- .github/actions/checkout_target_commit/action.yml | 7 +++++-- 1 file changed, 5 insertions(+), 2 deletions(-) diff --git a/.github/actions/checkout_target_commit/action.yml b/.github/actions/checkout_target_commit/action.yml index e90ae0199804c..e95e8b86254a0 100644 --- a/.github/actions/checkout_target_commit/action.yml +++ b/.github/actions/checkout_target_commit/action.yml @@ -65,13 +65,16 @@ runs: rm -rfv "dev" rm -rfv ".github/actions" rm -rfv ".github/workflows" + rm -v ".dockerignore" || true mv -v "target-airflow/scripts/ci" "scripts" mv -v "target-airflow/dev" "." mv -v "target-airflow/.github/actions" "target-airflow/.github/workflows" ".github" + mv -v "target-airflow/.dockerignore" ".dockerignore" || true if: inputs.pull-request-target == 'true' && inputs.is-committer-build != 'true' #################################################################################################### # AFTER IT'S SAFE. 
THE `dev`, `scripts/ci` AND `.github/actions` ARE NOW COMING FROM THE - # BASE_REF - WHICH IS THE TARGET BRANCH OF THE PR. WE CAN TRUST THAT THOSE SCRIPTS ARE SAFE TO RUN. + # AFTER IT'S SAFE. THE `dev`, `scripts/ci` AND `.github/actions` and `.dockerignore` ARE NOW COMING + # FROM THE BASE_REF - WHICH IS THE TARGET BRANCH OF THE PR. WE CAN TRUST THAT THOSE SCRIPTS ARE + # SAFE TO RUN AND CODE AVAILABLE IN THE DOCKER BUILD PHASE IS CONTROLLED BY THE `.dockerignore`. # ALL THE REST OF THE CODE COMES FROM THE PR, AND FOR EXAMPLE THE CODE IN THE `Dockerfile.ci` CAN # BE RUN SAFELY AS PART OF DOCKER BUILD. BECAUSE IT RUNS INSIDE THE DOCKER CONTAINER AND IT IS # ISOLATED FROM THE RUNNER. From 2db983933bde98d7f5875afa37ad6cd1366a59c8 Mon Sep 17 00:00:00 2001 From: Jens Scheffler <95105677+jscheffl@users.noreply.github.com> Date: Sun, 17 Nov 2024 15:00:05 +0100 Subject: [PATCH 19/44] Log message source details are grouped (#43681) (#44070) * Log message source details are grouped (#43681) * Log message source details are grouped * fix static checks * fix pytests * Another pytest fix --------- Co-authored-by: Majoros Donat (XC-DX/EET2-Bp) (cherry picked from commit 9d1877261228a721111eba9945db3b870c9d87fe) * Fix pytest --------- Co-authored-by: majorosdonat --- airflow/utils/log/file_task_handler.py | 6 ++- .../endpoints/test_log_endpoint.py | 23 ++++----- .../amazon/aws/log/test_s3_task_handler.py | 6 +-- .../google/cloud/log/test_gcs_task_handler.py | 7 +-- .../azure/log/test_wasb_task_handler.py | 17 +++---- tests/utils/log/test_log_reader.py | 51 +++++++------------ tests/utils/test_log_handlers.py | 14 +++-- 7 files changed, 55 insertions(+), 69 deletions(-) diff --git a/airflow/utils/log/file_task_handler.py b/airflow/utils/log/file_task_handler.py index e99ffae0c94d8..9eb55c707f180 100644 --- a/airflow/utils/log/file_task_handler.py +++ b/airflow/utils/log/file_task_handler.py @@ -416,7 +416,11 @@ def _read( ) ) log_pos = len(logs) - messages = "".join([f"*** {x}\n" 
for x in messages_list]) + # Log message source details are grouped: they are not relevant for most users and can + # distract them from finding the root cause of their errors + messages = " INFO - ::group::Log message source details\n" + messages += "".join([f"*** {x}\n" for x in messages_list]) + messages += " INFO - ::endgroup::\n" end_of_log = ti.try_number != try_number or ti.state not in ( TaskInstanceState.RUNNING, TaskInstanceState.DEFERRED, diff --git a/tests/api_connexion/endpoints/test_log_endpoint.py b/tests/api_connexion/endpoints/test_log_endpoint.py index 93ad2cec4b051..b0f265ec858df 100644 --- a/tests/api_connexion/endpoints/test_log_endpoint.py +++ b/tests/api_connexion/endpoints/test_log_endpoint.py @@ -188,10 +188,10 @@ def test_should_respond_200_json(self, try_number): ) expected_filename = f"{self.log_dir}/dag_id={self.DAG_ID}/run_id={self.RUN_ID}/task_id={self.TASK_ID}/attempt={try_number}.log" log_content = "Log for testing." if try_number == 1 else "Log for testing 2." - assert ( - response.json["content"] - == f"[('localhost', '*** Found local files:\\n*** * {expected_filename}\\n{log_content}')]" - ) + assert "[('localhost'," in response.json["content"] + assert f"*** Found local files:\\n*** * {expected_filename}\\n" in response.json["content"] + assert f"{log_content}')]" in response.json["content"] + info = serializer.loads(response.json["continuation_token"]) assert info == {"end_of_log": True, "log_pos": 16 if try_number == 1 else 18} assert 200 == response.status_code @@ -244,11 +244,9 @@ def test_should_respond_200_text_plain( assert 200 == response.status_code log_content = "Log for testing." if try_number == 1 else "Log for testing 2." 
- - assert ( - response.data.decode("utf-8") - == f"localhost\n*** Found local files:\n*** * {expected_filename}\n{log_content}\n" - ) + assert "localhost\n" in response.data.decode("utf-8") + assert f"*** Found local files:\n*** * {expected_filename}\n" in response.data.decode("utf-8") + assert f"{log_content}\n" in response.data.decode("utf-8") @pytest.mark.parametrize( "request_url, expected_filename, extra_query_string, try_number", @@ -302,10 +300,9 @@ def test_get_logs_of_removed_task(self, request_url, expected_filename, extra_qu assert 200 == response.status_code log_content = "Log for testing." if try_number == 1 else "Log for testing 2." - assert ( - response.data.decode("utf-8") - == f"localhost\n*** Found local files:\n*** * {expected_filename}\n{log_content}\n" - ) + assert "localhost\n" in response.data.decode("utf-8") + assert f"*** Found local files:\n*** * {expected_filename}\n" in response.data.decode("utf-8") + assert f"{log_content}\n" in response.data.decode("utf-8") @pytest.mark.parametrize("try_number", [1, 2]) def test_get_logs_response_with_ti_equal_to_none(self, try_number): diff --git a/tests/providers/amazon/aws/log/test_s3_task_handler.py b/tests/providers/amazon/aws/log/test_s3_task_handler.py index 3412011eb4ef4..7b799a5628971 100644 --- a/tests/providers/amazon/aws/log/test_s3_task_handler.py +++ b/tests/providers/amazon/aws/log/test_s3_task_handler.py @@ -127,8 +127,8 @@ def test_read(self): ti.state = TaskInstanceState.SUCCESS log, metadata = self.s3_task_handler.read(ti) actual = log[0][0][-1] - expected = "*** Found logs in s3:\n*** * s3://bucket/remote/log/location/1.log\nLog line" - assert actual == expected + assert "*** Found logs in s3:\n*** * s3://bucket/remote/log/location/1.log\n" in actual + assert actual.endswith("Log line") assert metadata == [{"end_of_log": True, "log_pos": 8}] def test_read_when_s3_log_missing(self): @@ -140,7 +140,7 @@ def test_read_when_s3_log_missing(self): assert len(log) == len(metadata) actual 
= log[0][0][-1] expected = "*** No logs found on s3 for ti=\n" - assert actual == expected + assert expected in actual assert {"end_of_log": True, "log_pos": 0} == metadata[0] def test_s3_read_when_log_missing(self): diff --git a/tests/providers/google/cloud/log/test_gcs_task_handler.py b/tests/providers/google/cloud/log/test_gcs_task_handler.py index a860e52e1524f..2c961fac4c363 100644 --- a/tests/providers/google/cloud/log/test_gcs_task_handler.py +++ b/tests/providers/google/cloud/log/test_gcs_task_handler.py @@ -106,7 +106,8 @@ def test_should_read_logs_from_remote(self, mock_blob, mock_client, mock_creds, mock_blob.from_string.assert_called_once_with( "gs://bucket/remote/log/location/1.log", mock_client.return_value ) - assert logs == "*** Found remote logs:\n*** * gs://bucket/remote/log/location/1.log\nCONTENT" + assert "*** Found remote logs:\n*** * gs://bucket/remote/log/location/1.log\n" in logs + assert logs.endswith("CONTENT") assert {"end_of_log": True, "log_pos": 7} == metadata @mock.patch( @@ -126,13 +127,13 @@ def test_should_read_from_local_on_logs_read_error(self, mock_blob, mock_client, ti.state = TaskInstanceState.SUCCESS log, metadata = self.gcs_task_handler._read(ti, self.ti.try_number) - assert log == ( + assert ( "*** Found remote logs:\n" "*** * gs://bucket/remote/log/location/1.log\n" "*** Unable to read remote log Failed to connect\n" "*** Found local files:\n" f"*** * {self.gcs_task_handler.local_base}/1.log\n" - ) + ) in log assert metadata == {"end_of_log": True, "log_pos": 0} mock_blob.from_string.assert_called_once_with( "gs://bucket/remote/log/location/1.log", mock_client.return_value diff --git a/tests/providers/microsoft/azure/log/test_wasb_task_handler.py b/tests/providers/microsoft/azure/log/test_wasb_task_handler.py index e74efe89e91fa..6ef1b99fd51f2 100644 --- a/tests/providers/microsoft/azure/log/test_wasb_task_handler.py +++ b/tests/providers/microsoft/azure/log/test_wasb_task_handler.py @@ -111,18 +111,13 @@ def 
test_wasb_read(self, mock_hook_cls, ti): assert self.wasb_task_handler.wasb_read(self.remote_log_location) == "Log line" ti = copy.copy(ti) ti.state = TaskInstanceState.SUCCESS - assert self.wasb_task_handler.read(ti) == ( - [ - [ - ( - "localhost", - "*** Found remote logs:\n" - "*** * https://wasb-container.blob.core.windows.net/abc/hello.log\nLog line", - ) - ] - ], - [{"end_of_log": True, "log_pos": 8}], + assert self.wasb_task_handler.read(ti)[0][0][0][0] == "localhost" + assert ( + "*** Found remote logs:\n*** * https://wasb-container.blob.core.windows.net/abc/hello.log\n" + in self.wasb_task_handler.read(ti)[0][0][0][1] ) + assert "Log line" in self.wasb_task_handler.read(ti)[0][0][0][1] + assert self.wasb_task_handler.read(ti)[1][0] == {"end_of_log": True, "log_pos": 8} @mock.patch( "airflow.providers.microsoft.azure.hooks.wasb.WasbHook", diff --git a/tests/utils/log/test_log_reader.py b/tests/utils/log/test_log_reader.py index 3216222909b7a..d4417bfba8220 100644 --- a/tests/utils/log/test_log_reader.py +++ b/tests/utils/log/test_log_reader.py @@ -128,8 +128,10 @@ def test_test_read_log_chunks_should_read_one_try(self): assert logs[0] == [ ( "localhost", + " INFO - ::group::Log message source details\n" "*** Found local files:\n" f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/1.log\n" + " INFO - ::endgroup::\n" "try_number=1.", ) ] @@ -141,32 +143,13 @@ def test_test_read_log_chunks_should_read_all_files(self): ti.state = TaskInstanceState.SUCCESS logs, metadatas = task_log_reader.read_log_chunks(ti=ti, try_number=None, metadata={}) - assert logs == [ - [ - ( - "localhost", - "*** Found local files:\n" - f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/1.log\n" - "try_number=1.", - ) - ], - [ - ( - "localhost", - "*** Found local files:\n" - f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/2.log\n" - f"try_number=2.", - ) - ], - [ - ( - "localhost", - "*** Found 
local files:\n" - f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/3.log\n" - f"try_number=3.", - ) - ], - ] + for i in range(0, 3): + assert logs[i][0][0] == "localhost" + assert ( + "*** Found local files:\n" + f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/{i + 1}.log\n" + ) in logs[i][0][1] + assert f"try_number={i + 1}." in logs[i][0][1] assert metadatas == {"end_of_log": True, "log_pos": 13} def test_test_test_read_log_stream_should_read_one_try(self): @@ -175,9 +158,9 @@ def test_test_test_read_log_stream_should_read_one_try(self): ti.state = TaskInstanceState.SUCCESS stream = task_log_reader.read_log_stream(ti=ti, try_number=1, metadata={}) assert list(stream) == [ - "localhost\n*** Found local files:\n" + "localhost\n INFO - ::group::Log message source details\n*** Found local files:\n" f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/1.log\n" - "try_number=1.\n" + " INFO - ::endgroup::\ntry_number=1.\n" ] def test_test_test_read_log_stream_should_read_all_logs(self): @@ -185,17 +168,17 @@ def test_test_test_read_log_stream_should_read_all_logs(self): self.ti.state = TaskInstanceState.SUCCESS # Ensure mocked instance is completed to return stream stream = task_log_reader.read_log_stream(ti=self.ti, try_number=None, metadata={}) assert list(stream) == [ - "localhost\n*** Found local files:\n" + "localhost\n INFO - ::group::Log message source details\n*** Found local files:\n" f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/1.log\n" - "try_number=1." + " INFO - ::endgroup::\ntry_number=1." "\n", - "localhost\n*** Found local files:\n" + "localhost\n INFO - ::group::Log message source details\n*** Found local files:\n" f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/2.log\n" - "try_number=2." + " INFO - ::endgroup::\ntry_number=2." 
"\n", - "localhost\n*** Found local files:\n" + "localhost\n INFO - ::group::Log message source details\n*** Found local files:\n" f"*** * {self.log_dir}/dag_log_reader/task_log_reader/2017-09-01T00.00.00+00.00/3.log\n" - "try_number=3." + " INFO - ::endgroup::\ntry_number=3." "\n", ] diff --git a/tests/utils/test_log_handlers.py b/tests/utils/test_log_handlers.py index da10034cb9370..d3651370d657e 100644 --- a/tests/utils/test_log_handlers.py +++ b/tests/utils/test_log_handlers.py @@ -272,7 +272,9 @@ def test__read_when_local(self, mock_read_local, create_task_instance): fth = FileTaskHandler("") actual = fth._read(ti=local_log_file_read, try_number=1) mock_read_local.assert_called_with(path) - assert actual == ("*** the messages\nthe log", {"end_of_log": True, "log_pos": 7}) + assert "*** the messages\n" in actual[0] + assert actual[0].endswith("the log") + assert actual[1] == {"end_of_log": True, "log_pos": 7} def test__read_from_local(self, tmp_path): """Tests the behavior of method _read_from_local""" @@ -333,9 +335,11 @@ def test__read_for_celery_executor_fallbacks_to_worker(self, create_task_instanc fth._read_from_logs_server = mock.Mock() fth._read_from_logs_server.return_value = ["this message"], ["this\nlog\ncontent"] - actual = fth._read(ti=ti, try_number=1) + actual_text, actual_meta = fth._read(ti=ti, try_number=1) fth._read_from_logs_server.assert_called_once() - assert actual == ("*** this message\nthis\nlog\ncontent", {"end_of_log": True, "log_pos": 16}) + assert "*** this message" in actual_text + assert "this\nlog\ncontent" in actual_text + assert actual_meta == {"end_of_log": True, "log_pos": 16} @pytest.mark.parametrize( "remote_logs, local_logs, served_logs_checked", @@ -379,7 +383,9 @@ def test__read_served_logs_checked_when_done_and_no_local_or_remote_logs( actual = fth._read(ti=ti, try_number=1) if served_logs_checked: fth._read_from_logs_server.assert_called_once() - assert actual == ("*** this message\nthis\nlog\ncontent", {"end_of_log": 
True, "log_pos": 16}) + assert "*** this message\n" in actual[0] + assert actual[0].endswith("this\nlog\ncontent") + assert actual[1] == {"end_of_log": True, "log_pos": 16} else: fth._read_from_logs_server.assert_not_called() assert actual[0] From 91e098097a3406750b507328bf60984757da77cf Mon Sep 17 00:00:00 2001 From: Jens Scheffler <95105677+jscheffl@users.noreply.github.com> Date: Sun, 17 Nov 2024 17:55:26 +0100 Subject: [PATCH 20/44] Update v2-10-test constraints (#44113) * Update v2-10-test constraints * Bump minimum version of open-telemetry (#43809) The min version of open-telemetry we used was pretty old and that old open-telemetry had different package structure that caused an issue with cache invalidation (see #43770). Bumping it might help resolvers in uv and PyPI to avoid downgrading the versions as well as avoid caching issues in similar scenarios. --------- Co-authored-by: Jarek Potiuk --- .pre-commit-config.yaml | 6 +++--- chart/values.schema.json | 2 +- chart/values.yaml | 2 +- clients/python/pyproject.toml | 2 +- docker_tests/requirements.txt | 2 +- hatch_build.py | 4 ++-- pyproject.toml | 6 +++--- 7 files changed, 12 insertions(+), 12 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index c6a19314521b8..bb6c37ca4d598 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -467,21 +467,21 @@ repos: files: ^docs/apache-airflow/extra-packages-ref\.rst$|^hatch_build.py pass_filenames: false entry: ./scripts/ci/pre_commit/check_extra_packages_ref.py - additional_dependencies: ['rich>=12.4.4', 'hatchling==1.25.0', 'tabulate'] + additional_dependencies: ['rich>=12.4.4', 'hatchling==1.26.3', 'tabulate'] - id: check-hatch-build-order name: Check order of dependencies in hatch_build.py language: python files: ^hatch_build.py$ pass_filenames: false entry: ./scripts/ci/pre_commit/check_order_hatch_build.py - additional_dependencies: ['rich>=12.4.4', 'hatchling==1.25.0'] + additional_dependencies: ['rich>=12.4.4', 
'hatchling==1.26.3'] - id: update-extras name: Update extras in documentation entry: ./scripts/ci/pre_commit/insert_extras.py language: python files: ^contributing-docs/12_airflow_dependencies_and_extras.rst$|^INSTALL$|^airflow/providers/.*/provider\.yaml$|^Dockerfile.* pass_filenames: false - additional_dependencies: ['rich>=12.4.4', 'hatchling==1.25.0'] + additional_dependencies: ['rich>=12.4.4', 'hatchling==1.26.3'] - id: check-extras-order name: Check order of extras in Dockerfile entry: ./scripts/ci/pre_commit/check_order_dockerfile_extras.py diff --git a/chart/values.schema.json b/chart/values.schema.json index 49cadfbb64118..4aea4ab7c8915 100644 --- a/chart/values.schema.json +++ b/chart/values.schema.json @@ -671,7 +671,7 @@ "tag": { "description": "The StatsD image tag.", "type": "string", - "default": "v0.27.2" + "default": "v0.28.0" }, "pullPolicy": { "description": "The StatsD image pull policy.", diff --git a/chart/values.yaml b/chart/values.yaml index 13f7f455ebb74..ef242b4cc8de9 100644 --- a/chart/values.yaml +++ b/chart/values.yaml @@ -105,7 +105,7 @@ images: pullPolicy: IfNotPresent statsd: repository: quay.io/prometheus/statsd-exporter - tag: v0.27.2 + tag: v0.28.0 pullPolicy: IfNotPresent redis: repository: redis diff --git a/clients/python/pyproject.toml b/clients/python/pyproject.toml index 1a5ccdc9e2b63..6ae2b49da6b28 100644 --- a/clients/python/pyproject.toml +++ b/clients/python/pyproject.toml @@ -16,7 +16,7 @@ # under the License. [build-system] -requires = ["hatchling==1.25.0"] +requires = ["hatchling==1.26.3"] build-backend = "hatchling.build" [project] diff --git a/docker_tests/requirements.txt b/docker_tests/requirements.txt index 4f62686ab445e..bcd9cd0655bb4 100644 --- a/docker_tests/requirements.txt +++ b/docker_tests/requirements.txt @@ -3,4 +3,4 @@ pytest-xdist # Requests 3 if it will be released, will be heavily breaking. 
requests>=2.27.0,<3 python-on-whales>=0.70.0 -hatchling==1.25.0 +hatchling==1.26.3 diff --git a/hatch_build.py b/hatch_build.py index 971b71f49bee4..ab7af6f13f08b 100644 --- a/hatch_build.py +++ b/hatch_build.py @@ -465,8 +465,8 @@ "marshmallow-oneofschema>=2.0.1", "mdit-py-plugins>=0.3.0", "methodtools>=0.4.7", - "opentelemetry-api>=1.15.0", - "opentelemetry-exporter-otlp>=1.15.0", + "opentelemetry-api>=1.24.0", + "opentelemetry-exporter-otlp>=1.24.0", "packaging>=23.0", "pathspec>=0.9.0", 'pendulum>=2.1.2,<4.0;python_version<"3.12"', diff --git a/pyproject.toml b/pyproject.toml index c975d95c79f80..2fcf2e1486858 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -24,12 +24,12 @@ requires = [ "GitPython==3.1.43", "gitdb==4.0.11", - "hatchling==1.25.0", - "packaging==24.1", + "hatchling==1.26.3", + "packaging==24.2", "pathspec==0.12.1", "pluggy==1.5.0", "smmap==5.0.1", - "tomli==2.0.2; python_version < '3.11'", + "tomli==2.1.0; python_version < '3.11'", "trove-classifiers==2024.10.21.16", ] build-backend = "hatchling.build" From 8018bf7c97ace3a3092042e927827d4f7c5f0f82 Mon Sep 17 00:00:00 2001 From: Jens Scheffler <95105677+jscheffl@users.noreply.github.com> Date: Sun, 17 Nov 2024 22:19:31 +0100 Subject: [PATCH 21/44] Ensure priority weight is capped at 32-bit integer to prevent roll-over (#43611) (#44045) * Ensure priority weight is capped at 32-bit integer to prevent roll-over * Add newsfragment * Move range check post type check * Review feedback - consolidate to single implementation for now (cherry picked from commit ab529d13042c9a9c036cd4a03d04c9aa819adf34) --- airflow/models/abstractoperator.py | 15 +++++++++------ airflow/models/baseoperator.py | 5 ++++- airflow/utils/weight_rule.py | 12 ++++++++++++ .../priority-weight.rst | 6 ++++++ newsfragments/43611.significant.rst | 6 ++++++ tests/utils/test_weight_rule.py | 9 ++++++++- 6 files changed, 45 insertions(+), 8 deletions(-) create mode 100644 newsfragments/43611.significant.rst diff --git 
a/airflow/models/abstractoperator.py b/airflow/models/abstractoperator.py index 45eb3c5fff189..ec3d1f5309adb 100644 --- a/airflow/models/abstractoperator.py +++ b/airflow/models/abstractoperator.py @@ -40,7 +40,7 @@ from airflow.utils.task_group import MappedTaskGroup from airflow.utils.trigger_rule import TriggerRule from airflow.utils.types import NOTSET, ArgNotSet -from airflow.utils.weight_rule import WeightRule +from airflow.utils.weight_rule import WeightRule, db_safe_priority TaskStateChangeCallback = Callable[[Context], None] @@ -467,7 +467,7 @@ def priority_weight_total(self) -> int: ) if isinstance(self.weight_rule, _AbsolutePriorityWeightStrategy): - return self.priority_weight + return db_safe_priority(self.priority_weight) elif isinstance(self.weight_rule, _DownstreamPriorityWeightStrategy): upstream = False elif isinstance(self.weight_rule, _UpstreamPriorityWeightStrategy): @@ -476,10 +476,13 @@ def priority_weight_total(self) -> int: upstream = False dag = self.get_dag() if dag is None: - return self.priority_weight - return self.priority_weight + sum( - dag.task_dict[task_id].priority_weight - for task_id in self.get_flat_relative_ids(upstream=upstream) + return db_safe_priority(self.priority_weight) + return db_safe_priority( + self.priority_weight + + sum( + dag.task_dict[task_id].priority_weight + for task_id in self.get_flat_relative_ids(upstream=upstream) + ) ) @cached_property diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py index 449678860f80b..11522060fe06a 100644 --- a/airflow/models/baseoperator.py +++ b/airflow/models/baseoperator.py @@ -656,6 +656,8 @@ class derived from this one results in the creation of a task object, This allows the executor to trigger higher priority tasks before others when things get backed up. Set priority_weight as a higher number for more important tasks. + As not all database engines support 64-bit integers, values are capped with 32-bit. 
+ Valid range is from -2,147,483,648 to 2,147,483,647. :param weight_rule: weighting method used for the effective total priority weight of the task. Options are: ``{ downstream | upstream | absolute }`` default is ``downstream`` @@ -677,7 +679,8 @@ class derived from this one results in the creation of a task object, Additionally, when set to ``absolute``, there is bonus effect of significantly speeding up the task creation process as for very large DAGs. Options can be set as string or using the constants defined in - the static class ``airflow.utils.WeightRule`` + the static class ``airflow.utils.WeightRule``. + Irrespective of the weight rule, resulting priority values are capped with 32-bit. |experimental| Since 2.9.0, Airflow allows to define custom priority weight strategy, by creating a subclass of diff --git a/airflow/utils/weight_rule.py b/airflow/utils/weight_rule.py index a63358b0322ce..490bcfbe88843 100644 --- a/airflow/utils/weight_rule.py +++ b/airflow/utils/weight_rule.py @@ -21,6 +21,18 @@ import methodtools +# Databases do not support arbitrary precision integers, so we need to limit the range of priority weights. 
+# postgres: -2147483648 to +2147483647 (see https://www.postgresql.org/docs/current/datatype-numeric.html) +# mysql: -2147483648 to +2147483647 (see https://dev.mysql.com/doc/refman/8.4/en/integer-types.html) +# sqlite: -9223372036854775808 to +9223372036854775807 (see https://sqlite.org/datatype3.html) +DB_SAFE_MINIMUM = -2147483648 +DB_SAFE_MAXIMUM = 2147483647 + + +def db_safe_priority(priority_weight: int) -> int: + """Convert priority weight to a safe value for the database.""" + return max(DB_SAFE_MINIMUM, min(DB_SAFE_MAXIMUM, priority_weight)) + class WeightRule(str, Enum): """Weight rules.""" diff --git a/docs/apache-airflow/administration-and-deployment/priority-weight.rst b/docs/apache-airflow/administration-and-deployment/priority-weight.rst index dd61d25fcd4ee..7bdeff645026c 100644 --- a/docs/apache-airflow/administration-and-deployment/priority-weight.rst +++ b/docs/apache-airflow/administration-and-deployment/priority-weight.rst @@ -63,6 +63,12 @@ Below are the weighting methods. By default, Airflow's weighting method is ``dow The ``priority_weight`` parameter can be used in conjunction with :ref:`concepts:pool`. +.. note:: + + As most database engines are using 32-bit for integers, the maximum value for any calculated or + defined ``priority_weight`` is 2,147,483,647 and the minimum value is -2,147,483,648. + + Custom Weight Rule ------------------ diff --git a/newsfragments/43611.significant.rst b/newsfragments/43611.significant.rst new file mode 100644 index 0000000000000..e25fb2a5bba4b --- /dev/null +++ b/newsfragments/43611.significant.rst @@ -0,0 +1,6 @@ +TaskInstance ``priority_weight`` is capped in 32-bit signed integer ranges. + +Some database engines are limited to 32-bit integer values. As some users reported errors in +weight rolled-over to negative values, we decided to cap the value to the 32-bit integer. 
Even +if internally in python smaller or larger values to 64 bit are supported, ``priority_weight`` is +capped and only storing values from -2147483648 to 2147483647. diff --git a/tests/utils/test_weight_rule.py b/tests/utils/test_weight_rule.py index 73abafe782b86..387bb9b09e469 100644 --- a/tests/utils/test_weight_rule.py +++ b/tests/utils/test_weight_rule.py @@ -19,7 +19,14 @@ import pytest -from airflow.utils.weight_rule import WeightRule +from airflow.utils.weight_rule import DB_SAFE_MAXIMUM, DB_SAFE_MINIMUM, WeightRule, db_safe_priority + + +def test_db_safe_priority(): + assert db_safe_priority(1) == 1 + assert db_safe_priority(-1) == -1 + assert db_safe_priority(9999999999) == DB_SAFE_MAXIMUM + assert db_safe_priority(-9999999999) == DB_SAFE_MINIMUM class TestWeightRule: From bef7a71d25f00c18457318e65b9f400e59d0e06b Mon Sep 17 00:00:00 2001 From: Pierre Jeambrun Date: Mon, 18 Nov 2024 21:21:08 +0800 Subject: [PATCH 22/44] get_task_instance_try_details API returns TaskInstanceHistory schema (#43830) (#44133) * Update v1.yaml these - get_task_instance_try_details - get_mapped_task_instance_try_details - get_task_instance_tries - get_mapped_task_instance_tries are actually returning TaskInstanceHistory * Update v1.yaml * dummy change * revert "dummy change" * Update api-generated.ts * Update api-generated.ts * Update api-generated.ts * Update api-generated.ts * changes to v1.yaml * Update api-generated.ts * removing execution_date --------- Co-authored-by: kandharvishnuu <148410552+kandharvishnuu@users.noreply.github.com> (cherry picked from commit 6f02fdbe2e917d20482fd31c1b090b5d6d880320) Co-authored-by: kandharvishnu <46064835+kandharvishnu@users.noreply.github.com> --- airflow/api_connexion/openapi/v1.yaml | 96 +++++++++++++++++++- airflow/www/static/js/types/api-generated.ts | 65 ++++++++++++- 2 files changed, 153 insertions(+), 8 deletions(-) diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml index 
6edfce9475946..ed350dd95d6fc 100644 --- a/airflow/api_connexion/openapi/v1.yaml +++ b/airflow/api_connexion/openapi/v1.yaml @@ -1743,7 +1743,7 @@ paths: content: application/json: schema: - $ref: "#/components/schemas/TaskInstance" + $ref: "#/components/schemas/TaskInstanceHistory" "401": $ref: "#/components/responses/Unauthenticated" "403": @@ -1774,7 +1774,7 @@ paths: content: application/json: schema: - $ref: "#/components/schemas/TaskInstanceCollection" + $ref: "#/components/schemas/TaskInstanceHistoryCollection" "401": $ref: "#/components/responses/Unauthenticated" "403": @@ -1806,7 +1806,7 @@ paths: content: application/json: schema: - $ref: "#/components/schemas/TaskInstanceCollection" + $ref: "#/components/schemas/TaskInstanceHistoryCollection" "401": $ref: "#/components/responses/Unauthenticated" "403": @@ -1836,7 +1836,7 @@ paths: content: application/json: schema: - $ref: "#/components/schemas/TaskInstance" + $ref: "#/components/schemas/TaskInstanceHistory" "401": $ref: "#/components/responses/Unauthenticated" "403": @@ -4021,7 +4021,95 @@ components: items: $ref: "#/components/schemas/TaskInstance" - $ref: "#/components/schemas/CollectionInfo" + TaskInstanceHistory: + type: object + properties: + task_id: + type: string + task_display_name: + type: string + description: | + Human centric display text for the task. 
+ *New in version 2.9.0* + dag_id: + type: string + dag_run_id: + type: string + description: | + The DagRun ID for this task instance + + *New in version 2.3.0* + start_date: + type: string + format: datetime + nullable: true + end_date: + type: string + format: datetime + nullable: true + duration: + type: number + nullable: true + state: + $ref: "#/components/schemas/TaskState" + try_number: + type: integer + map_index: + type: integer + max_tries: + type: integer + hostname: + type: string + unixname: + type: string + pool: + type: string + pool_slots: + type: integer + queue: + type: string + nullable: true + priority_weight: + type: integer + nullable: true + operator: + type: string + nullable: true + description: | + *Changed in version 2.1.1*: Field becomes nullable. + queued_when: + type: string + nullable: true + description: | + The datetime that the task enter the state QUEUE, also known as queue_at + pid: + type: integer + nullable: true + executor: + type: string + nullable: true + description: | + Executor the task is configured to run on or None (which indicates the default executor) + + *New in version 2.10.0* + executor_config: + type: string + + TaskInstanceHistoryCollection: + type: object + description: | + Collection of task instances . + + *Changed in version 2.1.0*: 'total_entries' field is added. 
+ allOf: + - type: object + properties: + task_instances_history: + type: array + items: + $ref: "#/components/schemas/TaskInstanceHistory" + - $ref: "#/components/schemas/CollectionInfo" TaskInstanceReference: type: object properties: diff --git a/airflow/www/static/js/types/api-generated.ts b/airflow/www/static/js/types/api-generated.ts index 3f4935fa6cede..2da17d2981d03 100644 --- a/airflow/www/static/js/types/api-generated.ts +++ b/airflow/www/static/js/types/api-generated.ts @@ -1593,6 +1593,57 @@ export interface components { TaskInstanceCollection: { task_instances?: components["schemas"]["TaskInstance"][]; } & components["schemas"]["CollectionInfo"]; + TaskInstanceHistory: { + task_id?: string; + /** + * @description Human centric display text for the task. + * + * *New in version 2.9.0* + */ + task_display_name?: string; + dag_id?: string; + /** + * @description The DagRun ID for this task instance + * + * *New in version 2.3.0* + */ + dag_run_id?: string; + /** Format: datetime */ + start_date?: string | null; + /** Format: datetime */ + end_date?: string | null; + duration?: number | null; + state?: components["schemas"]["TaskState"]; + try_number?: number; + map_index?: number; + max_tries?: number; + hostname?: string; + unixname?: string; + pool?: string; + pool_slots?: number; + queue?: string | null; + priority_weight?: number | null; + /** @description *Changed in version 2.1.1*: Field becomes nullable. */ + operator?: string | null; + /** @description The datetime that the task enter the state QUEUE, also known as queue_at */ + queued_when?: string | null; + pid?: number | null; + /** + * @description Executor the task is configured to run on or None (which indicates the default executor) + * + * *New in version 2.10.0* + */ + executor?: string | null; + executor_config?: string; + }; + /** + * @description Collection of task instances . + * + * *Changed in version 2.1.0*: 'total_entries' field is added. 
+ */ + TaskInstanceHistoryCollection: { + task_instances_history?: components["schemas"]["TaskInstanceHistory"][]; + } & components["schemas"]["CollectionInfo"]; TaskInstanceReference: { /** @description The task ID. */ task_id?: string; @@ -4355,7 +4406,7 @@ export interface operations { /** Success. */ 200: { content: { - "application/json": components["schemas"]["TaskInstance"]; + "application/json": components["schemas"]["TaskInstanceHistory"]; }; }; 401: components["responses"]["Unauthenticated"]; @@ -4396,7 +4447,7 @@ export interface operations { /** Success. */ 200: { content: { - "application/json": components["schemas"]["TaskInstanceCollection"]; + "application/json": components["schemas"]["TaskInstanceHistoryCollection"]; }; }; 401: components["responses"]["Unauthenticated"]; @@ -4439,7 +4490,7 @@ export interface operations { /** Success. */ 200: { content: { - "application/json": components["schemas"]["TaskInstanceCollection"]; + "application/json": components["schemas"]["TaskInstanceHistoryCollection"]; }; }; 401: components["responses"]["Unauthenticated"]; @@ -4471,7 +4522,7 @@ export interface operations { /** Success. 
*/ 200: { content: { - "application/json": components["schemas"]["TaskInstance"]; }; }; 401: components["responses"]["Unauthenticated"]; @@ -5554,6 +5605,12 @@ export type TaskInstance = CamelCasedPropertiesDeep< export type TaskInstanceCollection = CamelCasedPropertiesDeep< components["schemas"]["TaskInstanceCollection"] >; +export type TaskInstanceHistory = CamelCasedPropertiesDeep< + components["schemas"]["TaskInstanceHistory"] +>; +export type TaskInstanceHistoryCollection = CamelCasedPropertiesDeep< + components["schemas"]["TaskInstanceHistoryCollection"] +>; export type TaskInstanceReference = CamelCasedPropertiesDeep< components["schemas"]["TaskInstanceReference"] >; From 86e0da87c8e2294be4737c222eeff56758638658 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Tue, 19 Nov 2024 03:35:06 +0000 Subject: [PATCH 23/44] [v2-10-test] suppress the warnings where we check for sensitive values (#44148) (#44167) (cherry picked from commit 9eaeb1c3098e364f940dbbf36e8f7fc72a262eee) https://github.com/apache/airflow/pull/44061#issuecomment-2480320259 Co-authored-by: Zach Liu --- airflow/configuration.py | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/airflow/configuration.py b/airflow/configuration.py index 81eb0fc725344..afb4b5f3808b6 100644 --- a/airflow/configuration.py +++ b/airflow/configuration.py @@ -856,7 +856,8 @@ def mask_secrets(self): for section, key in self.sensitive_config_values: try: - value = self.get(section, key, suppress_warnings=True) + with self.suppress_future_warnings(): + value = self.get(section, key, suppress_warnings=True) except AirflowConfigException: log.debug( "Could not retrieve value from section %s, for key %s.
Skipping redaction of this conf.", From 1738256e2ead0ebfc88b399183f45fae67adefeb Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Tue, 19 Nov 2024 15:45:25 +0000 Subject: [PATCH 24/44] [v2-10-test] Exclude Scarf Usage Data Collection in CI Environments (#44155) (#44184) Most of the the CI systems add "CI=true" env var. Refereces: - https://docs.pytest.org/en/stable/explanation/ci.html - https://docs.github.com/en/actions/writing-workflows/choosing-what-your-workflow-does/store-information-in-variables#default-environment-variables - https://docs.travis-ci.com/user/environment-variables/ - https://docs.gitlab.com/ee/ci/variables/predefined_variables.html - https://circleci.com/docs/variables/#built-in-environment-variables - https://www.jenkins.io/doc/book/pipeline/jenkinsfile/#using-environment-variables - https://adamj.eu/tech/2020/03/09/detect-if-your-tests-are-running-on-ci/ - https://github.com/The-Compiler/pytest-vw/blob/master/pytest_vw.py (cherry picked from commit 347a83afbd5f860b37f91d4726f615ef9b71f1c7) Co-authored-by: Kaxil Naik --- airflow/utils/usage_data_collection.py | 25 +++++++++++++++++++++++ tests/utils/test_usage_data_collection.py | 2 ++ 2 files changed, 27 insertions(+) diff --git a/airflow/utils/usage_data_collection.py b/airflow/utils/usage_data_collection.py index fe86a2da1cb50..3bdfb180fa912 100644 --- a/airflow/utils/usage_data_collection.py +++ b/airflow/utils/usage_data_collection.py @@ -25,6 +25,7 @@ from __future__ import annotations +import os import platform from urllib.parse import urlencode @@ -43,6 +44,10 @@ def usage_data_collection(): if _version_is_prerelease(airflow_version): return + # Exclude CI environments + if _is_ci_environ(): + return + scarf_domain = "https://apacheairflow.gateway.scarf.sh/scheduler" try: @@ -70,6 +75,26 @@ def _version_is_prerelease(version: str) -> bool: return parse(version).is_prerelease +def _is_ci_environ() -> bool: + """Return 
True if running in any known CI environment.""" + if os.getenv("CI") == "true": + # Generic CI variable set by many CI systems (GH Actions, Travis, GitLab, CircleCI, Jenkins, Heroku) + return True + + # Other CI variables set by specific CI systems + ci_env_vars = { + "CIRCLECI", # CircleCI + "CODEBUILD_BUILD_ID", # AWS CodeBuild + "GITHUB_ACTIONS", # GitHub Actions + "GITLAB_CI", # GitLab CI + "JENKINS_URL", # Jenkins + "TF_BUILD", # Azure Pipelines + "TRAVIS", # Travis CI + } + + return any(var in os.environ for var in ci_env_vars) + + def get_platform_info() -> tuple[str, str]: return platform.system(), platform.machine() diff --git a/tests/utils/test_usage_data_collection.py b/tests/utils/test_usage_data_collection.py index bc973672089c9..143bce39eca4d 100644 --- a/tests/utils/test_usage_data_collection.py +++ b/tests/utils/test_usage_data_collection.py @@ -43,12 +43,14 @@ def test_scarf_analytics_disabled(mock_get, is_enabled, is_prerelease): @mock.patch("airflow.settings.is_usage_data_collection_enabled", return_value=True) @mock.patch("airflow.utils.usage_data_collection._version_is_prerelease", return_value=False) +@mock.patch("airflow.utils.usage_data_collection._is_ci_environ", return_value=False) @mock.patch("airflow.utils.usage_data_collection.get_database_version", return_value="12.3") @mock.patch("airflow.utils.usage_data_collection.get_database_name", return_value="postgres") @mock.patch("httpx.get") def test_scarf_analytics( mock_get, mock_is_usage_data_collection_enabled, + mock_version_is_ci, mock_version_is_prerelease, get_database_version, get_database_name, From a3e5e34ec5f408a435737a8f8ed30c3d814d166c Mon Sep 17 00:00:00 2001 From: Jens Scheffler <95105677+jscheffl@users.noreply.github.com> Date: Tue, 19 Nov 2024 21:06:58 +0100 Subject: [PATCH 25/44] [v2-10-test] Re-queue tasks when they are stuck in queued (#43520) (#44158) * [v2-10-test] Re-queue tasks when they are stuck in queued (#43520) The old "stuck in queued" logic just failed the
tasks. Now we requeue them. We accomplish this by revoking the task from the executor and setting its state to scheduled. We'll re-queue it up to 2 times. The number of times is configurable by a hidden config. We added a method to the base executor, revoke_task, because it's a discrete operation that is required for this feature, and it might be useful in other cases, e.g. when detecting zombies. We set state to failed or scheduled directly from the scheduler (rather than sending through the event buffer) because the event buffer makes more sense for handling external events -- why round trip through the executor and back to the scheduler when the scheduler is initiating the action? Anyway this avoids having to deal with "state mismatch" issues when processing events. --------- (cherry picked from commit a41feeb5aedad842be2b0f954e0be30c767dbc5e) Co-authored-by: Daniel Imberman Co-authored-by: Daniel Standish <15932138+dstandish@users.noreply.github.com> Co-authored-by: Jed Cunningham <66968678+jedcunningham@users.noreply.github.com> * fix test_handle_stuck_queued_tasks_multiple_attempts (#44093) --------- Co-authored-by: Daniel Imberman Co-authored-by: Daniel Standish <15932138+dstandish@users.noreply.github.com> Co-authored-by: Jed Cunningham <66968678+jedcunningham@users.noreply.github.com> Co-authored-by: GPK --- airflow/executors/base_executor.py | 26 ++++- airflow/jobs/scheduler_job_runner.py | 161 +++++++++++++++++++++------ docs/spelling_wordlist.txt | 1 + tests/jobs/test_scheduler_job.py | 128 +++++++++++++++++++-- 4 files changed, 270 insertions(+), 46 deletions(-) diff --git a/airflow/executors/base_executor.py b/airflow/executors/base_executor.py index 57568af199710..5a5cf2d73f15d 100644 --- a/airflow/executors/base_executor.py +++ b/airflow/executors/base_executor.py @@ -26,6 +26,7 @@ from typing import TYPE_CHECKING, Any, List, Optional, Sequence, Tuple import pendulum +from deprecated import deprecated from airflow.cli.cli_config import DefaultHelpParser from
airflow.configuration import conf @@ -545,7 +546,12 @@ def terminate(self): """Get called when the daemon receives a SIGTERM.""" raise NotImplementedError() - def cleanup_stuck_queued_tasks(self, tis: list[TaskInstance]) -> list[str]: # pragma: no cover + @deprecated( + reason="Replaced by function `revoke_task`.", + category=RemovedInAirflow3Warning, + action="ignore", + ) + def cleanup_stuck_queued_tasks(self, tis: list[TaskInstance]) -> list[str]: """ Handle remnants of tasks that were failed because they were stuck in queued. @@ -556,7 +562,23 @@ def cleanup_stuck_queued_tasks(self, tis: list[TaskInstance]) -> list[str]: # p :param tis: List of Task Instances to clean up :return: List of readable task instances for a warning message """ - raise NotImplementedError() + raise NotImplementedError + + def revoke_task(self, *, ti: TaskInstance): + """ + Attempt to remove task from executor. + + It should attempt to ensure that the task is no longer running on the worker, + and ensure that it is cleared out from internal data structures. + + It should *not* change the state of the task in airflow, or add any events + to the event buffer. + + It should not raise any error. 
+ + :param ti: Task instance to remove + """ + raise NotImplementedError def try_adopt_task_instances(self, tis: Sequence[TaskInstance]) -> Sequence[TaskInstance]: """ diff --git a/airflow/jobs/scheduler_job_runner.py b/airflow/jobs/scheduler_job_runner.py index aa4e8d4f26aea..c9afd40f719ed 100644 --- a/airflow/jobs/scheduler_job_runner.py +++ b/airflow/jobs/scheduler_job_runner.py @@ -25,12 +25,14 @@ import time import warnings from collections import Counter, defaultdict, deque +from contextlib import suppress from dataclasses import dataclass from datetime import timedelta from functools import lru_cache, partial from pathlib import Path from typing import TYPE_CHECKING, Any, Callable, Collection, Iterable, Iterator +from deprecated import deprecated from sqlalchemy import and_, delete, func, not_, or_, select, text, update from sqlalchemy.exc import OperationalError from sqlalchemy.orm import lazyload, load_only, make_transient, selectinload @@ -97,6 +99,9 @@ DR = DagRun DM = DagModel +TASK_STUCK_IN_QUEUED_RESCHEDULE_EVENT = "stuck in queued reschedule" +""":meta private:""" + @dataclass class ConcurrencyMap: @@ -228,6 +233,13 @@ def __init__( stalled_task_timeout, task_adoption_timeout, worker_pods_pending_timeout, task_queued_timeout ) + # this param is intentionally undocumented + self._num_stuck_queued_retries = conf.getint( + section="scheduler", + key="num_stuck_in_queued_retries", + fallback=2, + ) + self.do_pickle = do_pickle if log: @@ -1093,7 +1105,7 @@ def _run_scheduler_loop(self) -> None: timers.call_regular_interval( conf.getfloat("scheduler", "task_queued_timeout_check_interval"), - self._fail_tasks_stuck_in_queued, + self._handle_tasks_stuck_in_queued, ) timers.call_regular_interval( @@ -1141,6 +1153,7 @@ def _run_scheduler_loop(self) -> None: for executor in self.job.executors: try: # this is backcompat check if executor does not inherit from BaseExecutor + # todo: remove in airflow 3.0 if not hasattr(executor, "_task_event_logs"): continue 
with create_session() as session: @@ -1772,48 +1785,132 @@ def _send_sla_callbacks_to_processor(self, dag: DAG) -> None: self.job.executor.send_callback(request) @provide_session - def _fail_tasks_stuck_in_queued(self, session: Session = NEW_SESSION) -> None: + def _handle_tasks_stuck_in_queued(self, session: Session = NEW_SESSION) -> None: """ - Mark tasks stuck in queued for longer than `task_queued_timeout` as failed. + Handle the scenario where a task is queued for longer than `task_queued_timeout`. Tasks can get stuck in queued for a wide variety of reasons (e.g. celery loses track of a task, a cluster can't further scale up its workers, etc.), but tasks - should not be stuck in queued for a long time. This will mark tasks stuck in - queued for longer than `self._task_queued_timeout` as failed. If the task has - available retries, it will be retried. + should not be stuck in queued for a long time. + + We will attempt to requeue the task (by revoking it from executor and setting to + scheduled) up to 2 times before failing the task. """ - self.log.debug("Calling SchedulerJob._fail_tasks_stuck_in_queued method") + tasks_stuck_in_queued = self._get_tis_stuck_in_queued(session) + for executor, stuck_tis in self._executor_to_tis(tasks_stuck_in_queued).items(): + try: + for ti in stuck_tis: + executor.revoke_task(ti=ti) + self._maybe_requeue_stuck_ti( + ti=ti, + session=session, + ) + except NotImplementedError: + # this block only gets entered if the executor has not implemented `revoke_task`. + # in which case, we try the fallback logic + # todo: remove the call to _stuck_in_queued_backcompat_logic in airflow 3.0. + # after 3.0, `cleanup_stuck_queued_tasks` will be removed, so we should + # just continue immediately. 
+ self._stuck_in_queued_backcompat_logic(executor, stuck_tis) + continue - tasks_stuck_in_queued = session.scalars( + def _get_tis_stuck_in_queued(self, session) -> Iterable[TaskInstance]: + """Query db for TIs that are stuck in queued.""" + return session.scalars( select(TI).where( TI.state == TaskInstanceState.QUEUED, TI.queued_dttm < (timezone.utcnow() - timedelta(seconds=self._task_queued_timeout)), TI.queued_by_job_id == self.job.id, ) - ).all() + ) - for executor, stuck_tis in self._executor_to_tis(tasks_stuck_in_queued).items(): - try: - cleaned_up_task_instances = set(executor.cleanup_stuck_queued_tasks(tis=stuck_tis)) - for ti in stuck_tis: - if repr(ti) in cleaned_up_task_instances: - self.log.warning( - "Marking task instance %s stuck in queued as failed. " - "If the task instance has available retries, it will be retried.", - ti, - ) - session.add( - Log( - event="stuck in queued", - task_instance=ti.key, - extra=( - "Task will be marked as failed. If the task instance has " - "available retries, it will be retried." - ), - ) - ) - except NotImplementedError: - self.log.debug("Executor doesn't support cleanup of stuck queued tasks. Skipping.") + def _maybe_requeue_stuck_ti(self, *, ti, session): + """ + Requeue task if it has not been attempted too many times. + + Otherwise, fail it. + """ + num_times_stuck = self._get_num_times_stuck_in_queued(ti, session) + if num_times_stuck < self._num_stuck_queued_retries: + self.log.info("Task stuck in queued; will try to requeue. task_id=%s", ti.task_id) + session.add( + Log( + event=TASK_STUCK_IN_QUEUED_RESCHEDULE_EVENT, + task_instance=ti.key, + extra=( + f"Task was in queued state for longer than {self._task_queued_timeout} " + "seconds; task state will be set back to scheduled." + ), + ) + ) + self._reschedule_stuck_task(ti) + else: + self.log.info( + "Task requeue attempts exceeded max; marking failed. 
task_instance=%s", + ti, + ) + session.add( + Log( + event="stuck in queued tries exceeded", + task_instance=ti.key, + extra=f"Task was requeued more than {self._num_stuck_queued_retries} times and will be failed.", + ) + ) + ti.set_state(TaskInstanceState.FAILED, session=session) + + @deprecated( + reason="This is backcompat layer for older executor interface. Should be removed in 3.0", + category=RemovedInAirflow3Warning, + action="ignore", + ) + def _stuck_in_queued_backcompat_logic(self, executor, stuck_tis): + """ + Try to invoke stuck in queued cleanup for older executor interface. + + TODO: remove in airflow 3.0 + + Here we handle case where the executor pre-dates the interface change that + introduced `cleanup_tasks_stuck_in_queued` and deprecated `cleanup_stuck_queued_tasks`. + + """ + with suppress(NotImplementedError): + for ti_repr in executor.cleanup_stuck_queued_tasks(tis=stuck_tis): + self.log.warning( + "Task instance %s stuck in queued. Will be set to failed.", + ti_repr, + ) + + @provide_session + def _reschedule_stuck_task(self, ti, session=NEW_SESSION): + session.execute( + update(TI) + .where(TI.filter_for_tis([ti])) + .values( + state=TaskInstanceState.SCHEDULED, + queued_dttm=None, + ) + .execution_options(synchronize_session=False) + ) + + @provide_session + def _get_num_times_stuck_in_queued(self, ti: TaskInstance, session: Session = NEW_SESSION) -> int: + """ + Check the Log table to see how many times a taskinstance has been stuck in queued. + + We can then use this information to determine whether to reschedule a task or fail it. 
+ """ + return ( + session.query(Log) + .where( + Log.task_id == ti.task_id, + Log.dag_id == ti.dag_id, + Log.run_id == ti.run_id, + Log.map_index == ti.map_index, + Log.try_number == ti.try_number, + Log.event == TASK_STUCK_IN_QUEUED_RESCHEDULE_EVENT, + ) + .count() + ) @provide_session def _emit_pool_metrics(self, session: Session = NEW_SESSION) -> None: @@ -2102,7 +2199,7 @@ def _orphan_unreferenced_datasets(self, session: Session = NEW_SESSION) -> None: updated_count = sum(self._set_orphaned(dataset) for dataset in orphaned_dataset_query) Stats.gauge("dataset.orphaned", updated_count) - def _executor_to_tis(self, tis: list[TaskInstance]) -> dict[BaseExecutor, list[TaskInstance]]: + def _executor_to_tis(self, tis: Iterable[TaskInstance]) -> dict[BaseExecutor, list[TaskInstance]]: """Organize TIs into lists per their respective executor.""" _executor_to_tis: defaultdict[BaseExecutor, list[TaskInstance]] = defaultdict(list) for ti in tis: diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt index b027913e920a3..7d5bf9b8b6167 100644 --- a/docs/spelling_wordlist.txt +++ b/docs/spelling_wordlist.txt @@ -1360,6 +1360,7 @@ repos repr req reqs +requeued Reserialize reserialize reserialized diff --git a/tests/jobs/test_scheduler_job.py b/tests/jobs/test_scheduler_job.py index d65346857932d..97d3e9fe87d0b 100644 --- a/tests/jobs/test_scheduler_job.py +++ b/tests/jobs/test_scheduler_job.py @@ -28,6 +28,7 @@ from typing import Generator from unittest import mock from unittest.mock import MagicMock, PropertyMock, patch +from uuid import uuid4 import psutil import pytest @@ -55,6 +56,7 @@ from airflow.models.dagrun import DagRun from airflow.models.dataset import DatasetDagRunQueue, DatasetEvent, DatasetModel from airflow.models.db_callback_request import DbCallbackRequest +from airflow.models.log import Log from airflow.models.pool import Pool from airflow.models.serialized_dag import SerializedDagModel from airflow.models.taskinstance import 
SimpleTaskInstance, TaskInstance, TaskInstanceKey @@ -123,6 +125,19 @@ def load_examples(): # Patch the MockExecutor into the dict of known executors in the Loader +@contextlib.contextmanager +def _loader_mock(mock_executors): + with mock.patch("airflow.executors.executor_loader.ExecutorLoader.load_executor") as loader_mock: + # The executors are mocked, so cannot be loaded/imported. Mock load_executor and return the + # correct object for the given input executor name. + loader_mock.side_effect = lambda *x: { + ("default_exec",): mock_executors[0], + (None,): mock_executors[0], + ("secondary_exec",): mock_executors[1], + }[x] + yield + + @patch.dict( ExecutorLoader.executors, {MOCK_EXECUTOR: f"{MockExecutor.__module__}.{MockExecutor.__qualname__}"} ) @@ -2177,7 +2192,18 @@ def test_adopt_or_reset_orphaned_tasks_multiple_executors(self, dag_maker, mock_ # Second executor called for ti3 mock_executors[1].try_adopt_task_instances.assert_called_once_with([ti3]) - def test_fail_stuck_queued_tasks(self, dag_maker, session, mock_executors): + def test_handle_stuck_queued_tasks_backcompat(self, dag_maker, session, mock_executors): + """ + Verify backward compatibility of the executor interface w.r.t. stuck queued. + + Prior to #43520, scheduler called method `cleanup_stuck_queued_tasks`, which failed tis. + + After #43520, scheduler calls `cleanup_tasks_stuck_in_queued`, which requeues tis. + + At Airflow 3.0, we should remove backcompat support for this old function. But for now + we verify that we call it as a fallback. 
+ """ + # todo: remove in airflow 3.0 with dag_maker("test_fail_stuck_queued_tasks_multiple_executors"): op1 = EmptyOperator(task_id="op1") op2 = EmptyOperator(task_id="op2", executor="default_exec") @@ -2194,26 +2220,102 @@ def test_fail_stuck_queued_tasks(self, dag_maker, session, mock_executors): scheduler_job = Job() job_runner = SchedulerJobRunner(job=scheduler_job, num_runs=0) job_runner._task_queued_timeout = 300 + mock_exec_1 = mock_executors[0] + mock_exec_2 = mock_executors[1] + mock_exec_1.revoke_task.side_effect = NotImplementedError + mock_exec_2.revoke_task.side_effect = NotImplementedError with mock.patch("airflow.executors.executor_loader.ExecutorLoader.load_executor") as loader_mock: # The executors are mocked, so cannot be loaded/imported. Mock load_executor and return the # correct object for the given input executor name. loader_mock.side_effect = lambda *x: { - ("default_exec",): mock_executors[0], - (None,): mock_executors[0], - ("secondary_exec",): mock_executors[1], + ("default_exec",): mock_exec_1, + (None,): mock_exec_1, + ("secondary_exec",): mock_exec_2, }[x] - job_runner._fail_tasks_stuck_in_queued() + job_runner._handle_tasks_stuck_in_queued() # Default executor is called for ti1 (no explicit executor override uses default) and ti2 (where we # explicitly marked that for execution by the default executor) try: - mock_executors[0].cleanup_stuck_queued_tasks.assert_called_once_with(tis=[ti1, ti2]) + mock_exec_1.cleanup_stuck_queued_tasks.assert_called_once_with(tis=[ti1, ti2]) except AssertionError: - mock_executors[0].cleanup_stuck_queued_tasks.assert_called_once_with(tis=[ti2, ti1]) - mock_executors[1].cleanup_stuck_queued_tasks.assert_called_once_with(tis=[ti3]) + mock_exec_1.cleanup_stuck_queued_tasks.assert_called_once_with(tis=[ti2, ti1]) + mock_exec_2.cleanup_stuck_queued_tasks.assert_called_once_with(tis=[ti3]) + + @conf_vars({("scheduler", "num_stuck_in_queued_retries"): "2"}) + def 
test_handle_stuck_queued_tasks_multiple_attempts(self, dag_maker, session, mock_executors): + """Verify that tasks stuck in queued will be rescheduled up to N times.""" + with dag_maker("test_fail_stuck_queued_tasks_multiple_executors"): + EmptyOperator(task_id="op1") + EmptyOperator(task_id="op2", executor="default_exec") + + def _queue_tasks(tis): + for ti in tis: + ti.state = "queued" + ti.queued_dttm = timezone.utcnow() + session.commit() + + run_id = str(uuid4()) + dr = dag_maker.create_dagrun(run_id=run_id) + + tis = dr.get_task_instances(session=session) + _queue_tasks(tis=tis) + scheduler_job = Job() + scheduler = SchedulerJobRunner(job=scheduler_job, num_runs=0) + # job_runner._reschedule_stuck_task = MagicMock() + scheduler._task_queued_timeout = -300 # always in violation of timeout + + with _loader_mock(mock_executors): + scheduler._handle_tasks_stuck_in_queued(session=session) + + # If the task gets stuck in queued once, we reset it to scheduled + tis = dr.get_task_instances(session=session) + assert [x.state for x in tis] == ["scheduled", "scheduled"] + assert [x.queued_dttm for x in tis] == [None, None] + + _queue_tasks(tis=tis) + log_events = [x.event for x in session.scalars(select(Log).where(Log.run_id == run_id)).all()] + assert log_events == [ + "stuck in queued reschedule", + "stuck in queued reschedule", + ] + + with _loader_mock(mock_executors): + scheduler._handle_tasks_stuck_in_queued(session=session) + session.commit() + + log_events = [x.event for x in session.scalars(select(Log).where(Log.run_id == run_id)).all()] + assert log_events == [ + "stuck in queued reschedule", + "stuck in queued reschedule", + "stuck in queued reschedule", + "stuck in queued reschedule", + ] + mock_executors[0].fail.assert_not_called() + tis = dr.get_task_instances(session=session) + assert [x.state for x in tis] == ["scheduled", "scheduled"] + _queue_tasks(tis=tis) + + with _loader_mock(mock_executors): + 
scheduler._handle_tasks_stuck_in_queued(session=session) + session.commit() + log_events = [x.event for x in session.scalars(select(Log).where(Log.run_id == run_id)).all()] + assert log_events == [ + "stuck in queued reschedule", + "stuck in queued reschedule", + "stuck in queued reschedule", + "stuck in queued reschedule", + "stuck in queued tries exceeded", + "stuck in queued tries exceeded", + ] + + mock_executors[0].fail.assert_not_called() # just demoing that we don't fail with executor method + states = [x.state for x in dr.get_task_instances(session=session)] + assert states == ["failed", "failed"] - def test_fail_stuck_queued_tasks_raises_not_implemented(self, dag_maker, session, caplog): + def test_revoke_task_not_imp_tolerated(self, dag_maker, session, caplog): + """Test that if executor no implement revoke_task then we don't blow up.""" with dag_maker("test_fail_stuck_queued_tasks"): op1 = EmptyOperator(task_id="op1") @@ -2224,12 +2326,14 @@ def test_fail_stuck_queued_tasks_raises_not_implemented(self, dag_maker, session session.commit() from airflow.executors.local_executor import LocalExecutor + assert "revoke_task" in BaseExecutor.__dict__ + # this is just verifying that LocalExecutor is good enough for this test + # in that it does not implement revoke_task + assert "revoke_task" not in LocalExecutor.__dict__ scheduler_job = Job(executor=LocalExecutor()) job_runner = SchedulerJobRunner(job=scheduler_job, num_runs=0) job_runner._task_queued_timeout = 300 - with caplog.at_level(logging.DEBUG): - job_runner._fail_tasks_stuck_in_queued() - assert "Executor doesn't support cleanup of stuck queued tasks. Skipping." 
in caplog.text + job_runner._handle_tasks_stuck_in_queued() @mock.patch("airflow.dag_processing.manager.DagFileProcessorAgent") def test_executor_end_called(self, mock_processor_agent, mock_executors): From dbde82f2231b2bb89edf5544a33ad76596bb9369 Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Sun, 24 Nov 2024 09:37:31 +0000 Subject: [PATCH 26/44] [v2-10-test] Only install eval-type-backport for Python < 3.10 (#44294) (#44315) The `eval-type-backport` is a tool to replace some of the new type hints enabled by `from __future__ import annotations` with "classic" type hints (`|` and `list` into `Union` and `List`). This helps to work around some of the issues Pydantic has when the new-style hints are used in classes that Pydantic processes. The library was initially added in #42196, but it was added for all Python versions - this change limits it to Python < 3.10 only (cherry picked from commit 29483384be5245a12ae1eb80fab364ddcbe41481) --- hatch_build.py | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/hatch_build.py b/hatch_build.py index ab7af6f13f08b..93627da0193b9 100644 --- a/hatch_build.py +++ b/hatch_build.py @@ -436,6 +436,12 @@ "cryptography>=41.0.0", "deprecated>=1.2.13", "dill>=0.2.2", + # Required for python 3.8 and 3.9 to work with new annotations styles. Check package + # description on PyPI for more details: https://pypi.org/project/eval-type-backport/ + # NOTE! THIS MIGHT BE REMOVED BEFORE WE RELEASE 2.10.4 if + # Pydantic 2.10.2 will add eval-type-backport as dependency for Python 3.8/3.9 + # see https://github.com/pydantic/pydantic/issues/10958 + 'eval-type-backport>=0.2.0;python_version<"3.10"', "flask-caching>=2.0.0", # Flask-Session 0.6 add new arguments into the SqlAlchemySessionInterface constructor as well as # all parameters now are mandatory which make AirflowDatabaseSessionInterface incompatible with this version. 
From 47a5092e2ede11b12361df12f0e663981b293db0 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Sun, 24 Nov 2024 11:37:10 +0000 Subject: [PATCH 27/44] [v2-10-test] Avoid grouping task instance stats by try_number for dynamic mapped tasks (#44300) (#44319) * [v2-10-test] Avoid grouping task instance stats by try_number for dynamic mapped tasks (#44300) (cherry picked from commit 5e52bd29abd690098ecf0701b8aab4792566eea3) Co-authored-by: Shahar Epstein <60007259+shahar1@users.noreply.github.com> * Update test_views_grid.py --------- Co-authored-by: Shahar Epstein <60007259+shahar1@users.noreply.github.com> --- airflow/www/views.py | 15 ++++++++-- newsfragments/44300.bugfix.rst | 1 + tests/www/views/test_views_grid.py | 44 ++++++++++++++++++++++++++++++ 3 files changed, 58 insertions(+), 2 deletions(-) create mode 100644 newsfragments/44300.bugfix.rst diff --git a/airflow/www/views.py b/airflow/www/views.py index bb88da2cdfa9c..222759e6e67ba 100644 --- a/airflow/www/views.py +++ b/airflow/www/views.py @@ -316,7 +316,10 @@ def dag_to_grid(dag: DagModel, dag_runs: Sequence[DagRun], session: Session) -> TaskInstance.task_id, TaskInstance.run_id, TaskInstance.state, - TaskInstance.try_number, + case( + (TaskInstance.map_index == -1, TaskInstance.try_number), + else_=None, + ).label("try_number"), func.min(TaskInstanceNote.content).label("note"), func.count(func.coalesce(TaskInstance.state, sqla.literal("no_status"))).label("state_count"), func.min(TaskInstance.queued_dttm).label("queued_dttm"), @@ -328,7 +331,15 @@ def dag_to_grid(dag: DagModel, dag_runs: Sequence[DagRun], session: Session) -> TaskInstance.dag_id == dag.dag_id, TaskInstance.run_id.in_([dag_run.run_id for dag_run in dag_runs]), ) - .group_by(TaskInstance.task_id, TaskInstance.run_id, TaskInstance.state, TaskInstance.try_number) + .group_by( + TaskInstance.task_id, + TaskInstance.run_id, + TaskInstance.state, + case( + (TaskInstance.map_index 
== -1, TaskInstance.try_number), + else_=None, + ), + ) .order_by(TaskInstance.task_id, TaskInstance.run_id) ) diff --git a/newsfragments/44300.bugfix.rst b/newsfragments/44300.bugfix.rst new file mode 100644 index 0000000000000..ffd4b07b2ab0d --- /dev/null +++ b/newsfragments/44300.bugfix.rst @@ -0,0 +1 @@ +Fix stats of dynamic mapped tasks after automatic retries of failed tasks diff --git a/tests/www/views/test_views_grid.py b/tests/www/views/test_views_grid.py index 0b82279880189..7cafb6a4c8e6e 100644 --- a/tests/www/views/test_views_grid.py +++ b/tests/www/views/test_views_grid.py @@ -517,3 +517,47 @@ def test_next_run_datasets_404(admin_client): resp = admin_client.get("/object/next_run_datasets/missingdag", follow_redirects=True) assert resp.status_code == 404, resp.json assert resp.json == {"error": "can't find dag missingdag"} + + +@pytest.mark.usefixtures("freeze_time_for_dagruns") +def test_dynamic_mapped_task_with_retries(admin_client, dag_with_runs: list[DagRun], session): + """ + Test a DAG with a dynamic mapped task with retries + """ + run1, run2 = dag_with_runs + + for ti in run1.task_instances: + ti.state = TaskInstanceState.SUCCESS + for ti in sorted(run2.task_instances, key=lambda ti: (ti.task_id, ti.map_index)): + if ti.task_id == "task1": + ti.state = TaskInstanceState.SUCCESS + elif ti.task_id == "group.mapped": + if ti.map_index == 0: + ti.state = TaskInstanceState.FAILED + ti.start_date = pendulum.DateTime(2021, 7, 1, 1, 0, 0, tzinfo=pendulum.UTC) + ti.end_date = pendulum.DateTime(2021, 7, 1, 1, 2, 3, tzinfo=pendulum.UTC) + elif ti.map_index == 1: + ti.try_number = 1 + ti.state = TaskInstanceState.SUCCESS + ti.start_date = pendulum.DateTime(2021, 7, 1, 2, 3, 4, tzinfo=pendulum.UTC) + ti.end_date = None + elif ti.map_index == 2: + ti.try_number = 2 + ti.state = TaskInstanceState.FAILED + ti.start_date = pendulum.DateTime(2021, 7, 1, 2, 3, 4, tzinfo=pendulum.UTC) + ti.end_date = None + elif ti.map_index == 3: + ti.try_number = 3 + ti.state = 
TaskInstanceState.SUCCESS + ti.start_date = pendulum.DateTime(2021, 7, 1, 2, 3, 4, tzinfo=pendulum.UTC) + ti.end_date = None + session.flush() + + resp = admin_client.get(f"/object/grid_data?dag_id={DAG_ID}", follow_redirects=True) + + assert resp.status_code == 200, resp.json + + assert resp.json["groups"]["children"][-1]["children"][-1]["instances"][-1]["mapped_states"] == { + "failed": 2, + "success": 2, + } From 88cbc3acbb3cade177607fd9fc381809f75c2c82 Mon Sep 17 00:00:00 2001 From: Karen Braganza Date: Wed, 27 Nov 2024 05:55:37 -0500 Subject: [PATCH 28/44] Check pool_slots on partial task import instead of execution (#39724) (#42693) Co-authored-by: Ryan Hatter <25823361+RNHTTR@users.noreply.github.com> Co-authored-by: Utkarsh Sharma --- airflow/decorators/base.py | 6 ++++++ airflow/models/baseoperator.py | 5 +++++ tests/models/test_mappedoperator.py | 9 +++++++++ 3 files changed, 20 insertions(+) diff --git a/airflow/decorators/base.py b/airflow/decorators/base.py index d743acbe50b2b..bcb64aaa6eb3c 100644 --- a/airflow/decorators/base.py +++ b/airflow/decorators/base.py @@ -457,6 +457,12 @@ def _expand(self, expand_input: ExpandInput, *, strict: bool) -> XComArg: end_date = timezone.convert_to_utc(partial_kwargs.pop("end_date", None)) if partial_kwargs.get("pool") is None: partial_kwargs["pool"] = Pool.DEFAULT_POOL_NAME + if "pool_slots" in partial_kwargs: + if partial_kwargs["pool_slots"] < 1: + dag_str = "" + if dag: + dag_str = f" in dag {dag.dag_id}" + raise ValueError(f"pool slots for {task_id}{dag_str} cannot be less than 1") partial_kwargs["retries"] = parse_retries(partial_kwargs.get("retries", DEFAULT_RETRIES)) partial_kwargs["retry_delay"] = coerce_timedelta( partial_kwargs.get("retry_delay", DEFAULT_RETRY_DELAY), diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py index 11522060fe06a..773552184f103 100644 --- a/airflow/models/baseoperator.py +++ b/airflow/models/baseoperator.py @@ -365,6 +365,11 @@ def partial( 
partial_kwargs["end_date"] = timezone.convert_to_utc(partial_kwargs["end_date"]) if partial_kwargs["pool"] is None: partial_kwargs["pool"] = Pool.DEFAULT_POOL_NAME + if partial_kwargs["pool_slots"] < 1: + dag_str = "" + if dag: + dag_str = f" in dag {dag.dag_id}" + raise ValueError(f"pool slots for {task_id}{dag_str} cannot be less than 1") partial_kwargs["retries"] = parse_retries(partial_kwargs["retries"]) partial_kwargs["retry_delay"] = coerce_timedelta(partial_kwargs["retry_delay"], key="retry_delay") if partial_kwargs["max_retry_delay"] is not None: diff --git a/tests/models/test_mappedoperator.py b/tests/models/test_mappedoperator.py index 01991c0bb457d..cf547912fb924 100644 --- a/tests/models/test_mappedoperator.py +++ b/tests/models/test_mappedoperator.py @@ -221,6 +221,15 @@ def test_partial_on_class_invalid_ctor_args() -> None: MockOperator.partial(task_id="a", foo="bar", bar=2) +def test_partial_on_invalid_pool_slots_raises() -> None: + """Test that when we pass an invalid value to pool_slots in partial(), + + i.e. 
if the value is not an integer, an error is raised at import time.""" + + with pytest.raises(TypeError, match="'<' not supported between instances of 'str' and 'int'"): + MockOperator.partial(task_id="pool_slots_test", pool="test", pool_slots="a").expand(arg1=[1, 2, 3]) + + @pytest.mark.skip_if_database_isolation_mode # Does not work in db isolation mode @pytest.mark.parametrize( ["num_existing_tis", "expected"], From 581c2c249b9447db90225602cd491cb7627b1798 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Thu, 28 Nov 2024 04:51:34 +0100 Subject: [PATCH 29/44] [v2-10-test] Fix problem with inability to remove fields from Connection form (#40421) (#44442) (cherry picked from commit 14bfe39298a2361ae34eac840aebae84063306ee) Co-authored-by: Maksim --- airflow/www/views.py | 2 ++ tests/www/views/test_views_connection.py | 26 ++++++++++++++++++++++++ 2 files changed, 28 insertions(+) diff --git a/airflow/www/views.py b/airflow/www/views.py index 222759e6e67ba..92fa534f57191 100644 --- a/airflow/www/views.py +++ b/airflow/www/views.py @@ -4357,6 +4357,8 @@ def process_form(self, form, is_created): # value isn't an empty string. if value != "": extra[field_name] = value + elif field_name in extra: + del extra[field_name] if extra.keys(): sensitive_unchanged_keys = set() for key, value in extra.items(): diff --git a/tests/www/views/test_views_connection.py b/tests/www/views/test_views_connection.py index a209cdfc2be8a..c70e6d19d48d8 100644 --- a/tests/www/views/test_views_connection.py +++ b/tests/www/views/test_views_connection.py @@ -316,6 +316,32 @@ def test_process_form_extras_updates_sensitive_placeholder_unchanged( } +@mock.patch("airflow.utils.module_loading.import_string") +@mock.patch("airflow.providers_manager.ProvidersManager.hooks", new_callable=PropertyMock) +def test_process_form_extras_remove(mock_pm_hooks, mock_import_str): + """ + Test the remove value from field. 
+ """ + # Testing parameters set in both extra and custom fields (connection updates). + mock_form = mock.Mock() + mock_form.data = { + "conn_type": "test4", + "conn_id": "extras_test4", + "extra": '{"extra__test4__remove_field": "remove_field_val3"}', + "extra__test4__remove_field": "", + } + + cmv = ConnectionModelView() + cmv._iter_extra_field_names_and_sensitivity = mock.Mock( + return_value=[("extra__test4__remove_field", "remove_field", False)] + ) + cmv.process_form(form=mock_form, is_created=True) + + assert json.loads(mock_form.extra.data) == { + "extra__test4__remove_field": "remove_field_val3", + } + + def test_duplicate_connection(admin_client): """Test Duplicate multiple connection with suffix""" conn1 = Connection( From 84d8a54e67c41000398b7abe3571ef0f0494924f Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Sat, 30 Nov 2024 22:07:25 +0100 Subject: [PATCH 30/44] [v2-10-test] fix gantt flickering #42215 (#44488) (#44517) (cherry picked from commit 0c354e7f6a34ab05b4ce239ece77fd05bbffe9a5) Co-authored-by: darkag --- .../www/static/js/dag/details/gantt/index.tsx | 18 ++++-------------- 1 file changed, 4 insertions(+), 14 deletions(-) diff --git a/airflow/www/static/js/dag/details/gantt/index.tsx b/airflow/www/static/js/dag/details/gantt/index.tsx index 45c10d2b525e5..1ed5c353debba 100644 --- a/airflow/www/static/js/dag/details/gantt/index.tsx +++ b/airflow/www/static/js/dag/details/gantt/index.tsx @@ -144,20 +144,10 @@ const Gantt = ({ // Reset state when the dagrun changes useEffect(() => { - if (startDate !== dagRun?.queuedAt && startDate !== dagRun?.startDate) { - setStartDate(dagRun?.queuedAt || dagRun?.startDate); - } - if (!endDate || endDate !== dagRun?.endDate) { - // @ts-ignore - setEndDate(dagRun?.endDate ?? 
moment().add(1, "s").toString()); - } - }, [ - dagRun?.queuedAt, - dagRun?.startDate, - dagRun?.endDate, - startDate, - endDate, - ]); + setStartDate(dagRun?.queuedAt || dagRun?.startDate); + // @ts-ignore + setEndDate(dagRun?.endDate ?? moment().add(1, "s").toString()); + }, [dagRun?.queuedAt, dagRun?.startDate, dagRun?.endDate]); const numBars = Math.round(width / 100); const runDuration = getDuration(startDate, endDate); From 24389df4ddaa184f5396ad556f1199ad76de83a9 Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Sun, 1 Dec 2024 00:28:02 +0100 Subject: [PATCH 31/44] [v2-10-test] Allow "/" in metrics validator (#42934) (#44515) * Allow "/" to avoid ERROR - Invalid stat name: dag_processing.processes,file_path=/mnt/c * Add UT * Reformat (cherry picked from commit 14b32eae6761075a9546647f47ba35705c9bac03) Co-authored-by: awdavidson <54780428+awdavidson@users.noreply.github.com> --- airflow/metrics/validators.py | 2 +- tests/core/test_stats.py | 4 ++++ 2 files changed, 5 insertions(+), 1 deletion(-) diff --git a/airflow/metrics/validators.py b/airflow/metrics/validators.py index 111ad9b87df62..d69e57762c23a 100644 --- a/airflow/metrics/validators.py +++ b/airflow/metrics/validators.py @@ -44,7 +44,7 @@ class MetricNameLengthExemptionWarning(Warning): # Only characters in the character set are considered valid # for the stat_name if stat_name_default_handler is used. -ALLOWED_CHARACTERS = frozenset(string.ascii_letters + string.digits + "_.-") +ALLOWED_CHARACTERS = frozenset(string.ascii_letters + string.digits + "_.-/") # The following set contains existing metrics whose names are too long for # OpenTelemetry and should be deprecated over time. 
This is implemented to diff --git a/tests/core/test_stats.py b/tests/core/test_stats.py index 902a0ed0037f5..e0386cab1b911 100644 --- a/tests/core/test_stats.py +++ b/tests/core/test_stats.py @@ -506,6 +506,10 @@ def test_increment_counter_with_tags(self): ) self.statsd_client.incr.assert_called_once_with("test_stats_run.delay,key0=0,key1=val1", 1, 1) + def test_increment_counter_with_tags_and_forward_slash(self): + self.stats.incr("test_stats_run.dag", tags={"path": "/some/path/dag.py"}) + self.statsd_client.incr.assert_called_once_with("test_stats_run.dag,path=/some/path/dag.py", 1, 1) + def test_does_not_increment_counter_drops_invalid_tags(self): self.stats.incr( "test_stats_run.delay", From a7156922b3f32fcd8f52479e989ed9fab6c9007b Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Sun, 1 Dec 2024 20:37:42 +0100 Subject: [PATCH 32/44] [v2-10-test] Upgrading tomli to 2.2.1 as suggested by CI (#44444) (#44524) (cherry picked from commit 5474e56f5cbaca7ec7b7045c71c078d448ebe7c8) Co-authored-by: Amogh Desai --- pyproject.toml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/pyproject.toml b/pyproject.toml index 2fcf2e1486858..39191cbb86283 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -29,7 +29,7 @@ requires = [ "pathspec==0.12.1", "pluggy==1.5.0", "smmap==5.0.1", - "tomli==2.1.0; python_version < '3.11'", + "tomli==2.2.1; python_version < '3.11'", "trove-classifiers==2024.10.21.16", ] build-backend = "hatchling.build" From 04735b19b37c68ed82b614a19c9b897a27ca1f2d Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Mon, 2 Dec 2024 09:42:40 -0700 Subject: [PATCH 33/44] [v2-10-test] Update XCom docs around containers/helm (#44570) (#44573) This removes the whole section about helm, as it really isn't directly related to the XCom concept at all. I also simplified the section about containers - this one is a bit more practical, so I've left it. 
(cherry picked from commit 3747c91afdcd0470ca29e911c589b334b357b778) Co-authored-by: Jed Cunningham <66968678+jedcunningham@users.noreply.github.com> --- docs/apache-airflow/core-concepts/xcoms.rst | 27 +++------------------ 1 file changed, 3 insertions(+), 24 deletions(-) diff --git a/docs/apache-airflow/core-concepts/xcoms.rst b/docs/apache-airflow/core-concepts/xcoms.rst index b6bd160d89d20..fad9420cea642 100644 --- a/docs/apache-airflow/core-concepts/xcoms.rst +++ b/docs/apache-airflow/core-concepts/xcoms.rst @@ -98,36 +98,15 @@ There is also an ``orm_deserialize_value`` method that is called whenever the XC You can also override the ``clear`` method and use it when clearing results for given DAGs and tasks. This allows the custom XCom backend to process the data lifecycle easier. -Working with Custom XCom Backends in Containers ------------------------------------------------ +Verifying Custom XCom Backend usage in Containers +------------------------------------------------- Depending on where Airflow is deployed i.e., local, Docker, K8s, etc. it can be useful to be assured that a custom XCom backend is actually being initialized. For example, the complexity of the container environment can make it more difficult to determine if your backend is being loaded correctly during container deployment. Luckily the following guidance can be used to assist you in building confidence in your custom XCom implementation. -Firstly, if you can exec into a terminal in the container then you should be able to do: +If you can exec into a terminal in an Airflow container, you can then print out the actual XCom class that is being used: .. code-block:: python from airflow.models.xcom import XCom print(XCom.__name__) - -which will print the actual class that is being used. - -You can also examine Airflow's configuration: - -.. 
code-block:: python - - from airflow.settings import conf - - conf.get("core", "xcom_backend") - -Working with Custom Backends in K8s via Helm --------------------------------------------- - -Running custom XCom backends in K8s will introduce even more complexity to your Airflow deployment. Put simply, sometimes things go wrong which can be difficult to debug. - -For example, if you define a custom XCom backend in the Chart ``values.yaml`` (via the ``xcom_backend`` configuration) and Airflow fails to load the class, the entire Chart deployment will fail with each pod container attempting to restart time and time again. - -When deploying in K8s your custom XCom backend needs to be reside in a ``config`` directory otherwise it cannot be located during Chart deployment. - -An observed problem is that it is very difficult to acquire logs from the container because there is a very small window of availability where the trace can be obtained. The only way you can determine the root cause is if you are fortunate enough to query and acquire the container logs at the right time. This in turn prevents the entire Helm chart from deploying successfully. From 82597181ecb3d00ee07c6449e74612b338d5c43b Mon Sep 17 00:00:00 2001 From: Jarek Potiuk Date: Tue, 3 Dec 2024 07:42:37 +0100 Subject: [PATCH 34/44] Remove comment about removing eval type backport (#44588) --- hatch_build.py | 2 -- 1 file changed, 2 deletions(-) diff --git a/hatch_build.py b/hatch_build.py index 93627da0193b9..f757d2a598ccc 100644 --- a/hatch_build.py +++ b/hatch_build.py @@ -438,8 +438,6 @@ "dill>=0.2.2", # Required for python 3.8 and 3.9 to work with new annotations styles. Check package # description on PyPI for more details: https://pypi.org/project/eval-type-backport/ - # NOTE! 
THIS MIGHT BE REMOVED BEFORE WE RELEASE 2.10.4 if - # Pydantic 2.10.2 will add eval-type-backport as dependency for Python 3.8/3.9 # see https://github.com/pydantic/pydantic/issues/10958 'eval-type-backport>=0.2.0;python_version<"3.10"', "flask-caching>=2.0.0", From ade67637cfb5f513a5767e85fa9d5a6ec3c8f716 Mon Sep 17 00:00:00 2001 From: LIU ZHE YOU <68415893+jason810496@users.noreply.github.com> Date: Tue, 3 Dec 2024 21:35:19 +0800 Subject: [PATCH 35/44] Fix wrong display of multiline messages in the log after filtering (#44457) * Fix Logs/utils * Fix test for Logs/utils * Fix by Review Comment - Added red color styling to lines based on the `currentLevel` - Added comments for new regExp * Refactor Logs/utils test cases --- .../details/taskInstance/Logs/utils.test.tsx | 80 ++++++++++-- .../js/dag/details/taskInstance/Logs/utils.ts | 115 ++++++++++-------- airflow/www/static/js/utils/index.test.ts | 6 + airflow/www/static/js/utils/index.ts | 6 +- 4 files changed, 140 insertions(+), 67 deletions(-) diff --git a/airflow/www/static/js/dag/details/taskInstance/Logs/utils.test.tsx b/airflow/www/static/js/dag/details/taskInstance/Logs/utils.test.tsx index 8e577068d27e0..57cad4314f6ff 100644 --- a/airflow/www/static/js/dag/details/taskInstance/Logs/utils.test.tsx +++ b/airflow/www/static/js/dag/details/taskInstance/Logs/utils.test.tsx @@ -19,27 +19,60 @@ /* global describe, test, expect */ +import { AnsiUp } from "ansi_up"; import { LogLevel, parseLogs } from "./utils"; -const mockTaskLog = ` -5d28cfda3219 +const mockTaskLogInfoBegin = `5d28cfda3219 *** Reading local file: /root/airflow/logs/dag_id=test_ui_grid/run_id=scheduled__2022-06-03T00:00:00+00:00/task_id=section_1.get_entry_group/attempt=1.log [2022-06-04 00:00:01,901] {taskinstance.py:1132} INFO - Dependencies all met for [2022-06-04 00:00:01,906] {taskinstance.py:1132} INFO - Dependencies all met for [2022-06-04 00:00:01,906] {taskinstance.py:1329} INFO - 
--------------------------------------------------------------------------------- [2022-06-04 00:00:01,906] {taskinstance.py:1330} INFO - Starting attempt 1 of 1 [2022-06-04 00:00:01,906] {taskinstance.py:1331} INFO - --------------------------------------------------------------------------------- -[2022-06-04 00:00:01,916] {taskinstance.py:1350} INFO - Executing on 2022-06-03 00:00:00+00:00 +`; +const mockTaskLogErrorWithTraceback = `[2022-06-04 00:00:01,910] {taskinstance.py:3311} ERROR - Task failed with exception +Traceback (most recent call last): + File "/opt/airflow/airflow/models/taskinstance.py", line 767, in _execute_task + result = _execute_callable(context=context, **execute_callable_kwargs) + File "/opt/airflow/airflow/models/taskinstance.py", line 733, in _execute_callable + return ExecutionCallableRunner( + File "/opt/airflow/airflow/utils/operator_helpers.py", line 252, in run + return self.func(*args, **kwargs) + File "/opt/airflow/airflow/models/baseoperator.py", line 422, in wrapper + return func(self, *args, **kwargs) + File "/opt/airflow/airflow/operators/python.py", line 505, in execute + return super().execute(context=serializable_context) + File "/opt/airflow/airflow/models/baseoperator.py", line 422, in wrapper + return func(self, *args, **kwargs) + File "/opt/airflow/airflow/operators/python.py", line 238, in execute + return_value = self.execute_callable() + File "/opt/airflow/airflow/operators/python.py", line 870, in execute_callable + result = self._execute_python_callable_in_subprocess(python_path) + File "/opt/airflow/airflow/operators/python.py", line 588, in _execute_python_callable_in_subprocess + raise AirflowException(error_msg) from None +airflow.exceptions.AirflowException: Process returned non-zero exit status 1. 
+This is log line 1 +This is log line 2 +This is log line 3 +This is log line 4 +This is log line 5 +`; +const mockTaskLogWarning = `[2022-06-04 00:00:02,010] {taskinstance.py:1548} WARNING - Exporting env vars: AIRFLOW_CTX_DAG_OWNER=*** AIRFLOW_CTX_DAG_ID=test_ui_grid`; +const mockTaskLogInfoEndWithWarningAndUrl = `[2022-06-04 00:00:01,914] {taskinstance.py:1225} INFO - Marking task as FAILED. dag_id=reproduce_log_error_dag, task_id=reproduce_log_error_python_task2, run_id=manual__2024-11-30T02:18:22.203608+00:00, execution_date=20241130T021822, start_date=20241130T021842, end_date=20241130T021844 [2022-06-04 00:00:01,919] {standard_task_runner.py:52} INFO - Started process 41646 to run task [2022-06-04 00:00:01,920] {standard_task_runner.py:80} INFO - Running: ['***', 'tasks', 'run', 'test_ui_grid', 'section_1.get_entry_group', 'scheduled__2022-06-03T00:00:00+00:00', '--job-id', '1626', '--raw', '--subdir', 'DAGS_FOLDER/test_ui_grid.py', '--cfg-path', '/tmp/tmpte7k80ur'] [2022-06-04 00:00:01,921] {standard_task_runner.py:81} INFO - Job 1626: Subtask section_1.get_entry_group [2022-06-04 00:00:01,921] {dagbag.py:507} INFO - Filling up the DagBag from /files/dags/test_ui_grid.py [2022-06-04 00:00:01,964] {task_command.py:377} INFO - Running on host 5d28cfda3219 -[2022-06-04 00:00:02,010] {taskinstance.py:1548} WARNING - Exporting env vars: AIRFLOW_CTX_DAG_OWNER=*** AIRFLOW_CTX_DAG_ID=test_ui_grid -[2024-07-01 00:00:02,010] {taskinstance.py:1548} INFO - Url parsing test => "https://apple.com", "https://google.com", https://something.logs/_dashboard/?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-1d,to:now))&_a=(columns:!(_source),filters:!(('$state':(store:appState)))) -`; +${mockTaskLogWarning} +[2024-07-01 00:00:02,010] {taskinstance.py:1548} INFO - Url parsing test => "https://apple.com", "https://google.com", 
https://something.logs/_dashboard/?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-1d,to:now))&_a=(columns:!(_source),filters:!(('$state':(store:appState))))`; + +const mockTaskLog = `${mockTaskLogInfoBegin}${mockTaskLogErrorWithTraceback}${mockTaskLogInfoEndWithWarningAndUrl}`; +const ansiUp = new AnsiUp(); +const parseExpectedLogs = (logs: string) => { + ansiUp.url_allowlist = {}; + return logs.split("\n").map((line) => ansiUp.ansi_to_html(line)); +}; describe("Test Logs Utils.", () => { test("parseLogs function replaces datetimes", () => { @@ -65,13 +98,18 @@ describe("Test Logs Utils.", () => { test.each([ { logLevelFilters: [LogLevel.INFO], - expectedNumberOfLines: 12, + expectedNumberOfLines: 14, expectedNumberOfFileSources: 4, + expectedLogs: `${mockTaskLogInfoBegin}${mockTaskLogInfoEndWithWarningAndUrl.replace( + mockTaskLogWarning, + "" + )}`, }, { logLevelFilters: [LogLevel.WARNING], expectedNumberOfLines: 1, expectedNumberOfFileSources: 1, + expectedLogs: mockTaskLogWarning, }, ])( "Filtering logs on $logLevelFilters level should return $expectedNumberOfLines lines and $expectedNumberOfFileSources file sources", @@ -79,6 +117,7 @@ describe("Test Logs Utils.", () => { logLevelFilters, expectedNumberOfLines, expectedNumberOfFileSources, + expectedLogs, }) => { const { parsedLogs, fileSources } = parseLogs( mockTaskLog, @@ -91,8 +130,11 @@ describe("Test Logs Utils.", () => { expect(fileSources).toHaveLength(expectedNumberOfFileSources); expect(parsedLogs).toBeDefined(); const lines = parsedLogs!.split("\n"); + const expectedLines = parseExpectedLogs(expectedLogs); expect(lines).toHaveLength(expectedNumberOfLines); - lines.forEach((line) => expect(line).toContain(logLevelFilters[0])); + lines.forEach((line, index) => { + expect(line).toContain(expectedLines[index]); + }); } ); @@ -104,6 +146,14 @@ describe("Test Logs Utils.", () => { ["taskinstance.py"], [] ); + const expectedLogs = `[2022-06-04 00:00:01,901] {taskinstance.py:1132} INFO - 
Dependencies all met for +[2022-06-04 00:00:01,906] {taskinstance.py:1132} INFO - Dependencies all met for +[2022-06-04 00:00:01,906] {taskinstance.py:1329} INFO - +[2022-06-04 00:00:01,906] {taskinstance.py:1330} INFO - Starting attempt 1 of 1 +[2022-06-04 00:00:01,906] {taskinstance.py:1331} INFO - +${mockTaskLogErrorWithTraceback} +${mockTaskLogWarning} +[2024-07-01 00:00:02,010] {taskinstance.py:1548} INFO -`; // Ignore matching for transformed hyperlinks; only verify that all the correct lines are returned. expect(fileSources).toEqual([ "dagbag.py", @@ -112,8 +162,11 @@ describe("Test Logs Utils.", () => { "taskinstance.py", ]); const lines = parsedLogs!.split("\n"); - expect(lines).toHaveLength(8); - lines.forEach((line) => expect(line).toContain("taskinstance.py")); + const expectedLines = parseExpectedLogs(expectedLogs); + expect(lines).toHaveLength(34); + lines.forEach((line, index) => { + expect(line).toContain(expectedLines[index]); + }); }); test("parseLogs function with filter on log level and file source", () => { @@ -145,7 +198,8 @@ describe("Test Logs Utils.", () => { [] ); - const lines = parsedLogs!.split("\n"); + // remove the last line which is empty + const lines = parsedLogs!.split("\n").filter((line) => line.length > 0); expect(lines[lines.length - 1]).toContain( 'https://apple.com' ); diff --git a/airflow/www/static/js/dag/details/taskInstance/Logs/utils.ts b/airflow/www/static/js/dag/details/taskInstance/Logs/utils.ts index f5340f0afb82a..b35a713484e14 100644 --- a/airflow/www/static/js/dag/details/taskInstance/Logs/utils.ts +++ b/airflow/www/static/js/dag/details/taskInstance/Logs/utils.ts @@ -59,7 +59,7 @@ export const logGroupEnd = / INFO - (::|##\[])endgroup(::|\])/g; export const parseLogs = ( data: string | undefined, timezone: string | null, - logLevelFilters: Array, + logLevelFilters: Array, fileSourceFilters: Array, unfoldedLogGroups: Array ) => { @@ -79,6 +79,8 @@ export const parseLogs = ( const parsedLines: Array = []; const 
fileSources: Set = new Set(); + const targetLogLevels: Set = new Set(logLevelFilters); + const targetFileSources: Set = new Set(fileSourceFilters); const ansiUp = new AnsiUp(); ansiUp.url_allowlist = {}; @@ -87,24 +89,20 @@ export const parseLogs = ( // Coloring (blue-60 as chakra style, is #0060df) and style such that log group appears like a link const logGroupStyle = "color:#0060df;cursor:pointer;font-weight:bold;"; + // Example Log Format: [2021-08-26 00:00:00,000] {filename.py:42} INFO - Log message + const regExp = /\[(.*?)\] \{(.*?)\} (.*?) -/; + let currentLevel: LogLevel = LogLevel.INFO; + let currentFileSource = ""; lines.forEach((line) => { let parsedLine = line; - - // Apply log level filter. - if ( - logLevelFilters.length > 0 && - logLevelFilters.every((level) => !line.includes(level)) - ) { - return; - } - - const regExp = /\[(.*?)\] \{(.*?)\}/; const matches = line.match(regExp); - let logGroup = ""; + let fileSource = ""; if (matches) { // Replace UTC with the local timezone. const dateTime = matches[1]; - [logGroup] = matches[2].split(":"); + [fileSource] = matches[2].split(":"); + const logLevel = matches[3]; + if (dateTime && timezone) { // @ts-ignore const localDateTime = moment @@ -115,50 +113,63 @@ export const parseLogs = ( parsedLine = line.replace(dateTime, localDateTime); } - fileSources.add(logGroup); + // The `currentLogLevel` and `currentFileSource` should remain same + // until a new `logLevel` or `fileSource` is encountered. + currentLevel = logLevel as LogLevel; + currentFileSource = fileSource; } + // Apply log level filter. + if (logLevelFilters.length > 0 && !targetLogLevels.has(currentLevel)) { + return; + } + if (fileSource) { + // Only add file source if it is not empty. + fileSources.add(fileSource); + } + // Apply file source filter. 
if ( - fileSourceFilters.length === 0 || - fileSourceFilters.some((fileSourceFilter) => - line.includes(fileSourceFilter) - ) + fileSourceFilters.length > 0 && + !targetFileSources.has(currentFileSource) ) { - parsedLine = highlightByKeywords( - parsedLine, - errorKeywords, - warningKeywords, - logGroupStart, - logGroupEnd - ); - // for lines with color convert to nice HTML - const coloredLine = ansiUp.ansi_to_html(parsedLine); - - // for lines with links, transform to hyperlinks - const lineWithHyperlinks = coloredLine - .replace( - urlRegex, - (url) => - `${url}` - ) - .replace(logGroupStart, (textLine) => { - const unfoldIdSuffix = "_unfold"; - const foldIdSuffix = "_fold"; - const gName = textLine.substring(17); - const gId = gName.replace(/\W+/g, "_").toLowerCase(); - const isFolded = unfoldedLogGroups.indexOf(gId) === -1; - const ufDisplay = isFolded ? "" : "display:none;"; - const unfold = ` ▶ ${gName}`; - const fDisplay = isFolded ? "display:none;" : ""; - const fold = ` ▼ ${gName}`; - return unfold + fold; - }) - .replace( - logGroupEnd, - " ▲▲▲ Log group end" - ); - parsedLines.push(lineWithHyperlinks); + return; } + + parsedLine = highlightByKeywords( + parsedLine, + currentLevel, + errorKeywords, + warningKeywords, + logGroupStart, + logGroupEnd + ); + // for lines with color convert to nice HTML + const coloredLine = ansiUp.ansi_to_html(parsedLine); + + // for lines with links, transform to hyperlinks + const lineWithHyperlinks = coloredLine + .replace( + urlRegex, + (url) => + `${url}` + ) + .replace(logGroupStart, (textLine) => { + const unfoldIdSuffix = "_unfold"; + const foldIdSuffix = "_fold"; + const gName = textLine.substring(17); + const gId = gName.replace(/\W+/g, "_").toLowerCase(); + const isFolded = unfoldedLogGroups.indexOf(gId) === -1; + const ufDisplay = isFolded ? "" : "display:none;"; + const unfold = ` ▶ ${gName}`; + const fDisplay = isFolded ? 
"display:none;" : ""; + const fold = ` ▼ ${gName}`; + return unfold + fold; + }) + .replace( + logGroupEnd, + " ▲▲▲ Log group end" + ); + parsedLines.push(lineWithHyperlinks); }); return { diff --git a/airflow/www/static/js/utils/index.test.ts b/airflow/www/static/js/utils/index.test.ts index 569d3af98b537..26b1fd6d84033 100644 --- a/airflow/www/static/js/utils/index.test.ts +++ b/airflow/www/static/js/utils/index.test.ts @@ -163,6 +163,7 @@ describe("Test highlightByKeywords", () => { const expected = `\x1b[1m\x1b[31mline with Error\x1b[39m\x1b[0m`; const highlightedLine = highlightByKeywords( originalLine, + "", ["error"], ["warn"], logGroupStart, @@ -175,6 +176,7 @@ describe("Test highlightByKeywords", () => { const expected = `\x1b[1m\x1b[33mline with Warning\x1b[39m\x1b[0m`; const highlightedLine = highlightByKeywords( originalLine, + "", ["error"], ["warn"], logGroupStart, @@ -187,6 +189,7 @@ describe("Test highlightByKeywords", () => { const expected = `\x1b[1m\x1b[31mline with error Warning\x1b[39m\x1b[0m`; const highlightedLine = highlightByKeywords( originalLine, + "", ["error"], ["warn"], logGroupStart, @@ -198,6 +201,7 @@ describe("Test highlightByKeywords", () => { const originalLine = " INFO - ::group::error"; const highlightedLine = highlightByKeywords( originalLine, + "", ["error"], ["warn"], logGroupStart, @@ -209,6 +213,7 @@ describe("Test highlightByKeywords", () => { const originalLine = " INFO - ::endgroup::"; const highlightedLine = highlightByKeywords( originalLine, + "", ["endgroup"], ["warn"], logGroupStart, @@ -220,6 +225,7 @@ describe("Test highlightByKeywords", () => { const originalLine = "sample line"; const highlightedLine = highlightByKeywords( originalLine, + "", ["error"], ["warn"], logGroupStart, diff --git a/airflow/www/static/js/utils/index.ts b/airflow/www/static/js/utils/index.ts index 8bef31a8582a9..2742612dc8a05 100644 --- a/airflow/www/static/js/utils/index.ts +++ b/airflow/www/static/js/utils/index.ts @@ -20,6 +20,7 @@ 
import Color from "color"; import type { DagRun, RunOrdering, Task, TaskInstance } from "src/types"; +import { LogLevel } from "src/dag/details/taskInstance/Logs/utils"; import useOffsetTop from "./useOffsetTop"; // Delay in ms for various hover actions @@ -187,6 +188,7 @@ const toSentenceCase = (camelCase: string): string => { const highlightByKeywords = ( parsedLine: string, + currentLogLevel: string, errorKeywords: string[], warningKeywords: string[], logGroupStart: RegExp, @@ -205,7 +207,7 @@ const highlightByKeywords = ( lowerParsedLine.includes(keyword) ); - if (containsError) { + if (containsError || currentLogLevel === (LogLevel.ERROR as string)) { return red(parsedLine); } @@ -213,7 +215,7 @@ const highlightByKeywords = ( lowerParsedLine.includes(keyword) ); - if (containsWarning) { + if (containsWarning || currentLogLevel === (LogLevel.WARNING as string)) { return yellow(parsedLine); } From 48d2a24d4de9390264fca9ad5c04c9d4f6992a55 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Tue, 3 Dec 2024 22:58:10 +0200 Subject: [PATCH 36/44] [v2-10-test] Fix tests badge in README.md (#44505) (#44587) * [v2-10-test] Fix tests badge in README.md (#44505) (cherry picked from commit a242ff6edb51db6e8ac1ced72b5bf327dd98c1b4) Co-authored-by: Shahar Epstein <60007259+shahar1@users.noreply.github.com> * Adjust badge to `v2-10-test` branch * Update README.md * Fix PYPI README.md --------- Co-authored-by: Shahar Epstein <60007259+shahar1@users.noreply.github.com> --- README.md | 2 +- generated/PYPI_README.md | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 2d3145ff6c706..181c193e6ab88 100644 --- a/README.md +++ b/README.md @@ -21,7 +21,7 @@ # Apache Airflow [![PyPI version](https://badge.fury.io/py/apache-airflow.svg)](https://badge.fury.io/py/apache-airflow) -[![GitHub 
Build](https://github.com/apache/airflow/workflows/Tests/badge.svg)](https://github.com/apache/airflow/actions) +[![GitHub Build](https://github.com/apache/airflow/actions/workflows/ci.yml/badge.svg?branch=v2-10-test)](https://github.com/apache/airflow/actions/workflows/ci.yml?query=branch%3Av2-10-test) [![Coverage Status](https://codecov.io/gh/apache/airflow/graph/badge.svg?token=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow) [![License](https://img.shields.io/:license-Apache%202-blue.svg)](https://www.apache.org/licenses/LICENSE-2.0.txt) [![PyPI - Python Version](https://img.shields.io/pypi/pyversions/apache-airflow.svg)](https://pypi.org/project/apache-airflow/) diff --git a/generated/PYPI_README.md b/generated/PYPI_README.md index a9f9ff42f0abb..a97566f4b68ee 100644 --- a/generated/PYPI_README.md +++ b/generated/PYPI_README.md @@ -23,7 +23,7 @@ PROJECT BY THE `generate-pypi-readme` PRE-COMMIT. YOUR CHANGES HERE WILL BE AUTO # Apache Airflow [![PyPI version](https://badge.fury.io/py/apache-airflow.svg)](https://badge.fury.io/py/apache-airflow) -[![GitHub Build](https://github.com/apache/airflow/workflows/Tests/badge.svg)](https://github.com/apache/airflow/actions) +[![GitHub Build](https://github.com/apache/airflow/actions/workflows/ci.yml/badge.svg?branch=v2-10-test)](https://github.com/apache/airflow/actions/workflows/ci.yml?query=branch%3Av2-10-test) [![Coverage Status](https://codecov.io/gh/apache/airflow/graph/badge.svg?token=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow) [![License](https://img.shields.io/:license-Apache%202-blue.svg)](https://www.apache.org/licenses/LICENSE-2.0.txt) [![PyPI - Python Version](https://img.shields.io/pypi/pyversions/apache-airflow.svg)](https://pypi.org/project/apache-airflow/) From 607682c33f81a31fd1458219957d559bb036e6e1 Mon Sep 17 00:00:00 2001 From: Utkarsh Sharma Date: Wed, 4 Dec 2024 19:04:46 +0530 Subject: [PATCH 37/44] Fix test_deprecated_options_with_new_section (#44647) (cherry picked from commit 
b4f4ba89dfc66ac9e7b44194af255ac031bf5a7a) --- tests/core/test_configuration.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tests/core/test_configuration.py b/tests/core/test_configuration.py index b200d16baad8a..58e1e029af84d 100644 --- a/tests/core/test_configuration.py +++ b/tests/core/test_configuration.py @@ -974,7 +974,7 @@ def test_deprecated_options_with_new_section(self): with mock.patch.dict("os.environ", AIRFLOW__CORE__LOGGING_LEVEL="VALUE"): assert conf.get("logging", "logging_level") == "VALUE" - with pytest.warns(FutureWarning, match="Please update your `conf.get"): + with pytest.warns(DeprecationWarning, match=r"The logging_level option in \[core\]"): with mock.patch.dict("os.environ", AIRFLOW__CORE__LOGGING_LEVEL="VALUE"): assert conf.get("core", "logging_level") == "VALUE" From 1dc0cc4aa3e03c3b7019b3a04981c6d0b4a2f1e6 Mon Sep 17 00:00:00 2001 From: Antony Southworth <81115196+antonysouthworth-halter@users.noreply.github.com> Date: Tue, 10 Dec 2024 03:49:31 +1300 Subject: [PATCH 38/44] Double-check TaskInstance state if it differs from the Executor state. (#43063) * Double-check TaskInstance state if it differs from Executor. 
* Update airflow/jobs/backfill_job_runner.py * Update airflow/jobs/backfill_job_runner.py * Update airflow/jobs/backfill_job_runner.py * Update airflow/jobs/backfill_job_runner.py * Update airflow/jobs/backfill_job_runner.py * Update airflow/jobs/backfill_job_runner.py --------- Co-authored-by: Utkarsh Sharma (cherry picked from commit 90d6332bd15521c0ecb7f1e760056ac317e2b940) --- airflow/jobs/backfill_job_runner.py | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/airflow/jobs/backfill_job_runner.py b/airflow/jobs/backfill_job_runner.py index 961c4b7e020b3..305eaff84be7d 100644 --- a/airflow/jobs/backfill_job_runner.py +++ b/airflow/jobs/backfill_job_runner.py @@ -309,6 +309,17 @@ def _manage_executor_state( self.log.debug("Executor state: %s task %s", state, ti) + if ( + state in (TaskInstanceState.FAILED, TaskInstanceState.SUCCESS) + and ti.state in self.STATES_COUNT_AS_RUNNING + ): + self.log.debug( + "In-memory TaskInstance state %s does not agree with executor state %s. Attempting to resolve by refreshing in-memory task instance from DB.", + ti, + state, + ) + ti.refresh_from_db(session=session) + if ( state in (TaskInstanceState.FAILED, TaskInstanceState.SUCCESS) and ti.state in self.STATES_COUNT_AS_RUNNING From 4f9cf5460980d24838811951c41d41fcc1fe8f4f Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Sat, 7 Dec 2024 12:14:14 +0200 Subject: [PATCH 39/44] [v2-10-test] Random doc typos (#44750) (#44758) * Random doc typos * Update contributing-docs/testing/unit_tests.rst * Update contributing-docs/testing/unit_tests.rst --------- (cherry picked from commit 909ff713a47d6217592f919cddbbb6967e0fddf0) Co-authored-by: D. 
Ferruzzi Co-authored-by: Shahar Epstein <60007259+shahar1@users.noreply.github.com> (cherry picked from commit b5f033a933d6bba2433a50a62b389b6546123ac4) --- contributing-docs/testing/unit_tests.rst | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/contributing-docs/testing/unit_tests.rst b/contributing-docs/testing/unit_tests.rst index c83f391e52817..ccd38250424df 100644 --- a/contributing-docs/testing/unit_tests.rst +++ b/contributing-docs/testing/unit_tests.rst @@ -378,10 +378,10 @@ If your test accesses the database but is not marked properly the Non-DB test in How to verify if DB test is correctly classified ................................................ -When you add if you want to see if your DB test is correctly classified, you can run the test or group +If you want to see if your DB test is correctly classified, you can run the test or group of tests with ``--skip-db-tests`` flag. -You can run the all (or subset of) test types if you want to make sure all ot the problems are fixed +You can run the all (or subset of) test types if you want to make sure all of the problems are fixed .. code-block:: bash @@ -502,8 +502,8 @@ Do this: Problems with Non-DB test collection ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -Sometimes, even if whole module is marked as ``@pytest.mark.db_test`` even parsing the file and collecting -tests will fail when ``--skip-db-tests`` is used because some of the imports od objects created in the +Sometimes, even if the whole module is marked as ``@pytest.mark.db_test``, parsing the file and collecting +tests will fail when ``--skip-db-tests`` is used because some of the imports or objects created in the module will read the database. Usually what helps is to move such initialization code to inside the tests or pytest fixtures (and pass @@ -1162,9 +1162,9 @@ directly to the container. 
Implementing compatibility for provider tests for older Airflow versions ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -When you implement tests for providers, you should make sure that they are compatible with older +When you implement tests for providers, you should make sure that they are compatible with older Airflow versions. -Note that some of the tests if written without taking care about the compatibility, might not work with older +Note that some of the tests, if written without taking care about the compatibility, might not work with older versions of Airflow - this is because of refactorings, renames, and tests relying on internals of Airflow that are not part of the public API. We deal with it in one of the following ways: From d8e93c2ee01ae6a58192b0f885ec42a9f6035356 Mon Sep 17 00:00:00 2001 From: Shahar Epstein <60007259+shahar1@users.noreply.github.com> Date: Fri, 8 Nov 2024 09:12:16 +0200 Subject: [PATCH 40/44] Prevent using trigger_rule="always" in a dynamic mapped task (#43810) (cherry picked from commit c753ca295d72d4e3dd74b9131d3ca4c47899cd96) (cherry picked from commit c9436b4dce756510ab2bdfebdfb4d119d55a9b29) --- airflow/utils/task_group.py | 22 +++++++++++++--- .../dynamic-task-mapping.rst | 5 ++++ tests/decorators/test_task_group.py | 25 ++++++++++++++++++- 3 files changed, 47 insertions(+), 5 deletions(-) diff --git a/airflow/utils/task_group.py b/airflow/utils/task_group.py index d1dd9822be222..f5e95bde1a840 100644 --- a/airflow/utils/task_group.py +++ b/airflow/utils/task_group.py @@ -37,6 +37,7 @@ from airflow.models.taskmixin import DAGNode from airflow.serialization.enums import DagAttributeTypes from airflow.utils.helpers import validate_group_key, validate_instance_args +from airflow.utils.trigger_rule import TriggerRule if TYPE_CHECKING: from sqlalchemy.orm import Session @@ -220,10 +221,15 @@ def parent_group(self) -> TaskGroup | None: def __iter__(self): for child in self.children.values(): - if 
isinstance(child, TaskGroup): - yield from child - else: - yield child + yield from self._iter_child(child) + + @staticmethod + def _iter_child(child): + """Iterate over the children of this TaskGroup.""" + if isinstance(child, TaskGroup): + yield from child + else: + yield child def add(self, task: DAGNode) -> DAGNode: """ @@ -599,6 +605,14 @@ def __init__(self, *, expand_input: ExpandInput, **kwargs: Any) -> None: super().__init__(**kwargs) self._expand_input = expand_input + def __iter__(self): + from airflow.models.abstractoperator import AbstractOperator + + for child in self.children.values(): + if isinstance(child, AbstractOperator) and child.trigger_rule == TriggerRule.ALWAYS: + raise ValueError("Tasks in a mapped task group cannot have trigger_rule set to 'ALWAYS'") + yield from self._iter_child(child) + def iter_mapped_dependencies(self) -> Iterator[Operator]: """Upstream dependencies that provide XComs used by this mapped task group.""" from airflow.models.xcom_arg import XComArg diff --git a/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst b/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst index fd7d570785434..df74038fd2c05 100644 --- a/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst +++ b/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst @@ -84,6 +84,11 @@ The grid view also provides visibility into your mapped tasks in the details pan Although we show a "reduce" task here (``sum_it``) you don't have to have one, the mapped tasks will still be executed even if they have no downstream tasks. +.. warning:: ``TriggerRule.ALWAYS`` cannot be utilized in expanded tasks + + Assigning ``trigger_rule=TriggerRule.ALWAYS`` in expanded tasks is forbidden, as expanded parameters will be undefined with the task's immediate execution. + This is enforced at the time of the DAG parsing, and will raise an error if you try to use it. 
+ Task-generated Mapping ---------------------- diff --git a/tests/decorators/test_task_group.py b/tests/decorators/test_task_group.py index 6120f94af3ac7..2dab23ca38fc7 100644 --- a/tests/decorators/test_task_group.py +++ b/tests/decorators/test_task_group.py @@ -22,10 +22,11 @@ import pendulum import pytest -from airflow.decorators import dag, task_group +from airflow.decorators import dag, task, task_group from airflow.models.expandinput import DictOfListsExpandInput, ListOfDictsExpandInput, MappedArgument from airflow.operators.empty import EmptyOperator from airflow.utils.task_group import MappedTaskGroup +from airflow.utils.trigger_rule import TriggerRule def test_task_group_with_overridden_kwargs(): @@ -133,6 +134,28 @@ def tg(): assert str(ctx.value) == "no arguments to expand against" +@pytest.mark.db_test +def test_expand_fail_trigger_rule_always(dag_maker, session): + @dag(schedule=None, start_date=pendulum.datetime(2022, 1, 1)) + def pipeline(): + @task + def get_param(): + return ["a", "b", "c"] + + @task(trigger_rule=TriggerRule.ALWAYS) + def t1(param): + return param + + @task_group() + def tg(param): + t1(param) + + with pytest.raises( + ValueError, match="Tasks in a mapped task group cannot have trigger_rule set to 'ALWAYS'" + ): + tg.expand(param=get_param()) + + def test_expand_create_mapped(): saved = {} From 9e3a976cfa27f27e6f88cdff6621395eb3f09827 Mon Sep 17 00:00:00 2001 From: Shahar Epstein <60007259+shahar1@users.noreply.github.com> Date: Sun, 8 Dec 2024 08:16:33 +0200 Subject: [PATCH 41/44] [BACKPORT] Prevent using `trigger_rule=TriggerRule.ALWAYS` in a task-generated mapping within bare tasks (#44751) (#44769) (cherry picked from commit 99e713efa398bef9d76eb4e8145538e828a05720) --- airflow/decorators/base.py | 21 ++++++++++ airflow/utils/task_group.py | 4 +- .../dynamic-task-mapping.rst | 10 +++-- newsfragments/44751.bugfix.rst | 1 + tests/decorators/test_mapped.py | 41 +++++++++++++++++++ tests/decorators/test_task_group.py | 5 ++- 6 
files changed, 75 insertions(+), 7 deletions(-) create mode 100644 newsfragments/44751.bugfix.rst diff --git a/airflow/decorators/base.py b/airflow/decorators/base.py index bcb64aaa6eb3c..c0d46df67f188 100644 --- a/airflow/decorators/base.py +++ b/airflow/decorators/base.py @@ -403,6 +403,12 @@ def _validate_arg_names(self, func: ValidationSource, kwargs: dict[str, Any]): super()._validate_arg_names(func, kwargs) def expand(self, **map_kwargs: OperatorExpandArgument) -> XComArg: + if self.kwargs.get("trigger_rule") == TriggerRule.ALWAYS and any( + [isinstance(expanded, XComArg) for expanded in map_kwargs.values()] + ): + raise ValueError( + "Task-generated mapping within a task using 'expand' is not allowed with trigger rule 'always'." + ) if not map_kwargs: raise TypeError("no arguments to expand against") self._validate_arg_names("expand", map_kwargs) @@ -416,6 +422,21 @@ def expand(self, **map_kwargs: OperatorExpandArgument) -> XComArg: return self._expand(DictOfListsExpandInput(map_kwargs), strict=False) def expand_kwargs(self, kwargs: OperatorExpandKwargsArgument, *, strict: bool = True) -> XComArg: + if ( + self.kwargs.get("trigger_rule") == TriggerRule.ALWAYS + and not isinstance(kwargs, XComArg) + and any( + [ + isinstance(v, XComArg) + for kwarg in kwargs + if not isinstance(kwarg, XComArg) + for v in kwarg.values() + ] + ) + ): + raise ValueError( + "Task-generated mapping within a task using 'expand_kwargs' is not allowed with trigger rule 'always'." 
+ ) if isinstance(kwargs, Sequence): for item in kwargs: if not isinstance(item, (XComArg, Mapping)): diff --git a/airflow/utils/task_group.py b/airflow/utils/task_group.py index f5e95bde1a840..2a4dadf5fd6ad 100644 --- a/airflow/utils/task_group.py +++ b/airflow/utils/task_group.py @@ -610,7 +610,9 @@ def __iter__(self): for child in self.children.values(): if isinstance(child, AbstractOperator) and child.trigger_rule == TriggerRule.ALWAYS: - raise ValueError("Tasks in a mapped task group cannot have trigger_rule set to 'ALWAYS'") + raise ValueError( + "Task-generated mapping within a mapped task group is not allowed with trigger rule 'always'" + ) yield from self._iter_child(child) def iter_mapped_dependencies(self) -> Iterator[Operator]: diff --git a/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst b/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst index df74038fd2c05..7607fd18a279e 100644 --- a/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst +++ b/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst @@ -84,10 +84,6 @@ The grid view also provides visibility into your mapped tasks in the details pan Although we show a "reduce" task here (``sum_it``) you don't have to have one, the mapped tasks will still be executed even if they have no downstream tasks. -.. warning:: ``TriggerRule.ALWAYS`` cannot be utilized in expanded tasks - - Assigning ``trigger_rule=TriggerRule.ALWAYS`` in expanded tasks is forbidden, as expanded parameters will be undefined with the task's immediate execution. - This is enforced at the time of the DAG parsing, and will raise an error if you try to use it. 
Task-generated Mapping ---------------------- @@ -113,6 +109,12 @@ The above examples we've shown could all be achieved with a ``for`` loop in the The ``make_list`` task runs as a normal task and must return a list or dict (see `What data types can be expanded?`_), and then the ``consumer`` task will be called four times, once with each value in the return of ``make_list``. +.. warning:: Task-generated mapping cannot be utilized with ``TriggerRule.ALWAYS`` + + Assigning ``trigger_rule=TriggerRule.ALWAYS`` in task-generated mapping is not allowed, as the expanded parameters are undefined at the time of the task's immediate execution. + This is enforced at DAG parsing time, for both tasks and mapped task groups, and an error is raised if you try to use it. + In the example above, setting ``trigger_rule=TriggerRule.ALWAYS`` in the ``consumer`` task would raise an error, since ``make_list`` is a task-generated mapping. + Repeated mapping ---------------- diff --git a/newsfragments/44751.bugfix.rst b/newsfragments/44751.bugfix.rst new file mode 100644 index 0000000000000..c85601d0fe13a --- /dev/null +++ b/newsfragments/44751.bugfix.rst @@ -0,0 +1 @@ +``TriggerRule.ALWAYS`` cannot be used within a task-generated mapping, either in bare tasks (fixed in this PR) or in mapped task groups (fixed in PR #44368). The problem is that the task is executed immediately, without waiting for the upstream's mapping results, which inevitably causes the task to fail. This fix avoids that by raising an exception when the situation is detected during DAG parsing. diff --git a/tests/decorators/test_mapped.py b/tests/decorators/test_mapped.py index 3812367425f8b..541d327a97570 100644 --- a/tests/decorators/test_mapped.py +++ b/tests/decorators/test_mapped.py @@ -17,6 +17,9 @@ # under the License.
from __future__ import annotations +import pytest + +from airflow.decorators import task from airflow.models.dag import DAG from airflow.utils.task_group import TaskGroup from tests.models import DEFAULT_DATE @@ -36,3 +39,41 @@ def f(z): dag.get_task("t1") == x1.operator dag.get_task("g.t2") == x2.operator + + +@pytest.mark.db_test +def test_fail_task_generated_mapping_with_trigger_rule_always__exapnd(dag_maker, session): + with DAG(dag_id="d", schedule=None, start_date=DEFAULT_DATE): + + @task + def get_input(): + return ["world", "moon"] + + @task(trigger_rule="always") + def hello(input): + print(f"Hello, {input}") + + with pytest.raises( + ValueError, + match="Task-generated mapping within a task using 'expand' is not allowed with trigger rule 'always'", + ): + hello.expand(input=get_input()) + + +@pytest.mark.db_test +def test_fail_task_generated_mapping_with_trigger_rule_always__exapnd_kwargs(dag_maker, session): + with DAG(dag_id="d", schedule=None, start_date=DEFAULT_DATE): + + @task + def get_input(): + return ["world", "moon"] + + @task(trigger_rule="always") + def hello(input, input2): + print(f"Hello, {input}, {input2}") + + with pytest.raises( + ValueError, + match="Task-generated mapping within a task using 'expand_kwargs' is not allowed with trigger rule 'always'", + ): + hello.expand_kwargs([{"input": get_input(), "input2": get_input()}]) diff --git a/tests/decorators/test_task_group.py b/tests/decorators/test_task_group.py index 2dab23ca38fc7..ce1b518a8ff59 100644 --- a/tests/decorators/test_task_group.py +++ b/tests/decorators/test_task_group.py @@ -135,7 +135,7 @@ def tg(): @pytest.mark.db_test -def test_expand_fail_trigger_rule_always(dag_maker, session): +def test_fail_task_generated_mapping_with_trigger_rule_always(dag_maker, session): @dag(schedule=None, start_date=pendulum.datetime(2022, 1, 1)) def pipeline(): @task @@ -151,7 +151,8 @@ def tg(param): t1(param) with pytest.raises( - ValueError, match="Tasks in a mapped task group cannot have 
trigger_rule set to 'ALWAYS'" + ValueError, + match="Task-generated mapping within a mapped task group is not allowed with trigger rule 'always'", ): tg.expand(param=get_param()) From 17495d79b12a17a0607ebef4767f3054e60f9905 Mon Sep 17 00:00:00 2001 From: Utkarsh Sharma Date: Tue, 10 Dec 2024 14:31:13 +0530 Subject: [PATCH 42/44] Fixing cli test failure in CI (#44679) (#44806) * Fixing cli test failure in CI * review comments (cherry picked from commit 98e0977a53ea3dc55987f5a2c512fb3b590d3d1c) Co-authored-by: Amogh Desai (cherry picked from commit c942b55f9f1d724d2c16f7129967e5dbff77cff3) --- tests/cli/test_cli_parser.py | 6 ++---- 1 file changed, 2 insertions(+), 4 deletions(-) diff --git a/tests/cli/test_cli_parser.py b/tests/cli/test_cli_parser.py index 2244b6dbd5860..b85c925cf17a8 100644 --- a/tests/cli/test_cli_parser.py +++ b/tests/cli/test_cli_parser.py @@ -417,10 +417,8 @@ def test_invalid_choice_raises_for_export_format_in_db_export_archived_command( ["db", "export-archived", "--export-format", export_format, "--output-path", "mydir"] ) error_msg = stderr.getvalue() - assert error_msg == ( - "\nairflow db export-archived command error: argument " - f"--export-format: invalid choice: '{export_format}' " - "(choose from 'csv'), see help above.\n" + assert ( + "airflow db export-archived command error: argument --export-format: invalid choice" in error_msg ) @pytest.mark.parametrize( From 0fc487d355856d3209ced13d0e5740ed56535bb0 Mon Sep 17 00:00:00 2001 From: utkarsh sharma Date: Wed, 4 Dec 2024 15:19:04 +0530 Subject: [PATCH 43/44] Update version to 2.10.4 --- README.md | 12 ++++++------ airflow/__init__.py | 2 +- airflow/api_connexion/openapi/v1.yaml | 2 +- .../installation/supported-versions.rst | 2 +- docs/docker-stack/README.md | 10 +++++----- .../extending/add-airflow-configuration/Dockerfile | 2 +- .../extending/add-apt-packages/Dockerfile | 2 +- .../add-build-essential-extend/Dockerfile | 2 +- .../extending/add-providers/Dockerfile | 2 +- 
.../add-pypi-packages-constraints/Dockerfile | 2 +- .../extending/add-pypi-packages-uv/Dockerfile | 2 +- .../extending/add-pypi-packages/Dockerfile | 2 +- .../extending/add-requirement-packages/Dockerfile | 2 +- .../extending/custom-providers/Dockerfile | 2 +- .../extending/embedding-dags/Dockerfile | 2 +- .../extending/writable-directory/Dockerfile | 2 +- docs/docker-stack/entrypoint.rst | 14 +++++++------- generated/PYPI_README.md | 10 +++++----- scripts/ci/pre_commit/supported_versions.py | 2 +- 19 files changed, 38 insertions(+), 38 deletions(-) diff --git a/README.md b/README.md index 181c193e6ab88..8da91a71f9ad8 100644 --- a/README.md +++ b/README.md @@ -97,9 +97,9 @@ Airflow is not a streaming solution, but it is often used to process real-time d Apache Airflow is tested with: -| | Main version (dev) | Stable version (2.10.3) | +| | Main version (dev) | Stable version (2.10.4) | |-------------|------------------------------|------------------------------| -| Python | 3.8, 3.9, 3.10, 3.11, 3.12 | 3.8, 3.9, 3.10, 3.11, 3.12 | +| Python | 3.9, 3.10, 3.11, 3.12 | 3.8, 3.9, 3.10, 3.11, 3.12 | | Platform | AMD64/ARM64(\*) | AMD64/ARM64(\*) | | Kubernetes | 1.26, 1.27, 1.28, 1.29, 1.30 | 1.26, 1.27, 1.28, 1.29, 1.30 | | PostgreSQL | 12, 13, 14, 15, 16 | 12, 13, 14, 15, 16 | @@ -175,15 +175,15 @@ them to the appropriate format and workflow that your tool requires. ```bash -pip install 'apache-airflow==2.10.3' \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.3/constraints-3.8.txt" +pip install 'apache-airflow==2.10.4' \ + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.4/constraints-3.8.txt" ``` 2. 
Installing with extras (i.e., postgres, google) ```bash pip install 'apache-airflow[postgres,google]==2.8.3' \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.3/constraints-3.8.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.4/constraints-3.8.txt" ``` For information on installing provider packages, check @@ -288,7 +288,7 @@ Apache Airflow version life cycle: | Version | Current Patch/Minor | State | First Release | Limited Support | EOL/Terminated | |-----------|-----------------------|-----------|-----------------|-------------------|------------------| -| 2 | 2.10.3 | Supported | Dec 17, 2020 | TBD | TBD | +| 2 | 2.10.4 | Supported | Dec 17, 2020 | TBD | TBD | | 1.10 | 1.10.15 | EOL | Aug 27, 2018 | Dec 17, 2020 | June 17, 2021 | | 1.9 | 1.9.0 | EOL | Jan 03, 2018 | Aug 27, 2018 | Aug 27, 2018 | | 1.8 | 1.8.2 | EOL | Mar 19, 2017 | Jan 03, 2018 | Jan 03, 2018 | diff --git a/airflow/__init__.py b/airflow/__init__.py index 1e04b0048bb3b..818bec887bf71 100644 --- a/airflow/__init__.py +++ b/airflow/__init__.py @@ -17,7 +17,7 @@ # under the License. from __future__ import annotations -__version__ = "2.10.3" +__version__ = "2.10.4" import os import sys diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml index ed350dd95d6fc..55a0b60c74d69 100644 --- a/airflow/api_connexion/openapi/v1.yaml +++ b/airflow/api_connexion/openapi/v1.yaml @@ -231,7 +231,7 @@ info: This means that the server encountered an unexpected condition that prevented it from fulfilling the request. 
- version: "2.10.3" + version: "2.10.4" license: name: Apache 2.0 url: http://www.apache.org/licenses/LICENSE-2.0.html diff --git a/docs/apache-airflow/installation/supported-versions.rst b/docs/apache-airflow/installation/supported-versions.rst index 006e3c509e789..c8fc9c8293ee1 100644 --- a/docs/apache-airflow/installation/supported-versions.rst +++ b/docs/apache-airflow/installation/supported-versions.rst @@ -29,7 +29,7 @@ Apache Airflow® version life cycle: ========= ===================== ========= =============== ================= ================ Version Current Patch/Minor State First Release Limited Support EOL/Terminated ========= ===================== ========= =============== ================= ================ -2 2.10.3 Supported Dec 17, 2020 TBD TBD +2 2.10.4 Supported Dec 17, 2020 TBD TBD 1.10 1.10.15 EOL Aug 27, 2018 Dec 17, 2020 June 17, 2021 1.9 1.9.0 EOL Jan 03, 2018 Aug 27, 2018 Aug 27, 2018 1.8 1.8.2 EOL Mar 19, 2017 Jan 03, 2018 Jan 03, 2018 diff --git a/docs/docker-stack/README.md b/docs/docker-stack/README.md index c56565c414573..63c7e80d343e0 100644 --- a/docs/docker-stack/README.md +++ b/docs/docker-stack/README.md @@ -31,12 +31,12 @@ Every time a new version of Airflow is released, the images are prepared in the [apache/airflow DockerHub](https://hub.docker.com/r/apache/airflow) for all the supported Python versions. 
-You can find the following images there (Assuming Airflow version `2.10.3`): +You can find the following images there (Assuming Airflow version `2.10.4`): * `apache/airflow:latest` - the latest released Airflow image with default Python version (3.8 currently) * `apache/airflow:latest-pythonX.Y` - the latest released Airflow image with specific Python version -* `apache/airflow:2.10.3` - the versioned Airflow image with default Python version (3.8 currently) -* `apache/airflow:2.10.3-pythonX.Y` - the versioned Airflow image with specific Python version +* `apache/airflow:2.10.4` - the versioned Airflow image with default Python version (3.8 currently) +* `apache/airflow:2.10.4-pythonX.Y` - the versioned Airflow image with specific Python version Those are "reference" regular images. They contain the most common set of extras, dependencies and providers that are often used by the users and they are good to "try-things-out" when you want to just take Airflow for a spin, @@ -47,8 +47,8 @@ via [Building the image](https://airflow.apache.org/docs/docker-stack/build.html * `apache/airflow:slim-latest` - the latest released Airflow image with default Python version (3.8 currently) * `apache/airflow:slim-latest-pythonX.Y` - the latest released Airflow image with specific Python version -* `apache/airflow:slim-2.10.3` - the versioned Airflow image with default Python version (3.8 currently) -* `apache/airflow:slim-2.10.3-pythonX.Y` - the versioned Airflow image with specific Python version +* `apache/airflow:slim-2.10.4` - the versioned Airflow image with default Python version (3.8 currently) +* `apache/airflow:slim-2.10.4-pythonX.Y` - the versioned Airflow image with specific Python version The Apache Airflow image provided as convenience package is optimized for size, and it provides just a bare minimal set of the extras and dependencies installed and in most cases diff --git a/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile 
b/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile index a697214dc1e1a..5fb16b7ced047 100644 --- a/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile @@ -15,7 +15,7 @@ # This is an example Dockerfile. It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 ENV AIRFLOW__CORE__LOAD_EXAMPLES=True ENV AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=my_conn_string # [END Dockerfile] diff --git a/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile index b3c2a4f5eab40..72346ed959730 100644 --- a/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile @@ -15,7 +15,7 @@ # This is an example Dockerfile. It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 USER root RUN apt-get update \ && apt-get install -y --no-install-recommends \ diff --git a/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile b/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile index ab80a444e72a6..2bb166deb0ba0 100644 --- a/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile @@ -15,7 +15,7 @@ # This is an example Dockerfile. 
It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 USER root RUN apt-get update \ && apt-get install -y --no-install-recommends \ diff --git a/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile b/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile index 8e5a29a6fdcca..46584a972c3ef 100644 --- a/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile @@ -15,7 +15,7 @@ # This is an example Dockerfile. It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 USER root RUN apt-get update \ && apt-get install -y --no-install-recommends \ diff --git a/docs/docker-stack/docker-examples/extending/add-pypi-packages-constraints/Dockerfile b/docs/docker-stack/docker-examples/extending/add-pypi-packages-constraints/Dockerfile index d7f5931e91434..d0a73412945d0 100644 --- a/docs/docker-stack/docker-examples/extending/add-pypi-packages-constraints/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/add-pypi-packages-constraints/Dockerfile @@ -15,6 +15,6 @@ # This is an example Dockerfile. It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 RUN pip install --no-cache-dir "apache-airflow==${AIRFLOW_VERSION}" lxml --constraint "${HOME}/constraints.txt" # [END Dockerfile] diff --git a/docs/docker-stack/docker-examples/extending/add-pypi-packages-uv/Dockerfile b/docs/docker-stack/docker-examples/extending/add-pypi-packages-uv/Dockerfile index 7f4e049c907f6..06082308dc483 100644 --- a/docs/docker-stack/docker-examples/extending/add-pypi-packages-uv/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/add-pypi-packages-uv/Dockerfile @@ -15,7 +15,7 @@ # This is an example Dockerfile. 
It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 # The `uv` tools is Rust packaging tool that is much faster than `pip` and other installer # Support for uv as installation tool is experimental diff --git a/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile index 1a016a47e5e61..fe19bacf174cc 100644 --- a/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile @@ -15,6 +15,6 @@ # This is an example Dockerfile. It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 RUN pip install --no-cache-dir "apache-airflow==${AIRFLOW_VERSION}" lxml # [END Dockerfile] diff --git a/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile index b323e9512e7cc..ecbbf1984bbce 100644 --- a/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile @@ -15,7 +15,7 @@ # This is an example Dockerfile. It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 COPY requirements.txt / RUN pip install --no-cache-dir "apache-airflow==${AIRFLOW_VERSION}" -r /requirements.txt # [END Dockerfile] diff --git a/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile b/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile index a980559c4ba90..363debd15c43e 100644 --- a/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile @@ -15,6 +15,6 @@ # This is an example Dockerfile. 
It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 RUN pip install "apache-airflow==${AIRFLOW_VERSION}" --no-cache-dir apache-airflow-providers-docker==2.5.1 # [END Dockerfile] diff --git a/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile b/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile index 8adcc60ff7007..59f395d1728df 100644 --- a/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile @@ -15,7 +15,7 @@ # This is an example Dockerfile. It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 COPY --chown=airflow:root test_dag.py /opt/airflow/dags diff --git a/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile b/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile index 6a9f0afd9105e..7e3cee6464585 100644 --- a/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile +++ b/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile @@ -15,7 +15,7 @@ # This is an example Dockerfile. It is not intended for PRODUCTION use # [START Dockerfile] -FROM apache/airflow:2.10.3 +FROM apache/airflow:2.10.4 RUN umask 0002; \ mkdir -p ~/writeable-directory # [END Dockerfile] diff --git a/docs/docker-stack/entrypoint.rst b/docs/docker-stack/entrypoint.rst index 30a6e769dbffc..5c0d0d0a432a6 100644 --- a/docs/docker-stack/entrypoint.rst +++ b/docs/docker-stack/entrypoint.rst @@ -132,7 +132,7 @@ if you specify extra arguments. For example: .. code-block:: bash - docker run -it apache/airflow:2.10.3-python3.8 bash -c "ls -la" + docker run -it apache/airflow:2.10.4-python3.8 bash -c "ls -la" total 16 drwxr-xr-x 4 airflow root 4096 Jun 5 18:12 . drwxr-xr-x 1 root root 4096 Jun 5 18:12 .. @@ -144,7 +144,7 @@ you pass extra parameters. For example: .. 
code-block:: bash - > docker run -it apache/airflow:2.10.3-python3.8 python -c "print('test')" + > docker run -it apache/airflow:2.10.4-python3.8 python -c "print('test')" test If first argument equals to "airflow" - the rest of the arguments is treated as an airflow command @@ -152,13 +152,13 @@ to execute. Example: .. code-block:: bash - docker run -it apache/airflow:2.10.3-python3.8 airflow webserver + docker run -it apache/airflow:2.10.4-python3.8 airflow webserver If there are any other arguments - they are simply passed to the "airflow" command .. code-block:: bash - > docker run -it apache/airflow:2.10.3-python3.8 help + > docker run -it apache/airflow:2.10.4-python3.8 help usage: airflow [-h] GROUP_OR_COMMAND ... positional arguments: @@ -363,7 +363,7 @@ database and creating an ``admin/admin`` Admin user with the following command: --env "_AIRFLOW_DB_MIGRATE=true" \ --env "_AIRFLOW_WWW_USER_CREATE=true" \ --env "_AIRFLOW_WWW_USER_PASSWORD=admin" \ - apache/airflow:2.10.3-python3.8 webserver + apache/airflow:2.10.4-python3.8 webserver .. code-block:: bash @@ -372,7 +372,7 @@ database and creating an ``admin/admin`` Admin user with the following command: --env "_AIRFLOW_DB_MIGRATE=true" \ --env "_AIRFLOW_WWW_USER_CREATE=true" \ --env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \ - apache/airflow:2.10.3-python3.8 webserver + apache/airflow:2.10.4-python3.8 webserver The commands above perform initialization of the SQLite database, create admin user with admin password and Admin role. They also forward local port ``8080`` to the webserver port and finally start the webserver. @@ -412,6 +412,6 @@ Example: --env "_AIRFLOW_DB_MIGRATE=true" \ --env "_AIRFLOW_WWW_USER_CREATE=true" \ --env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \ - apache/airflow:2.10.3-python3.8 webserver + apache/airflow:2.10.4-python3.8 webserver This method is only available starting from Docker image of Airflow 2.1.1 and above. 
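The version-bump patch above replaces the ``2.10.3`` string with ``2.10.4`` across README tables, Dockerfiles, and entrypoint docs. A minimal sketch of how such a blanket replacement can be done safely with a boundary-aware substitution (the ``bump_version`` helper is hypothetical, illustrating the idea rather than the project's actual pre-commit script):

```python
import re


def bump_version(text: str, old: str, new: str) -> str:
    # Replace `old` only where it is not embedded inside a longer
    # version-like token, so "2.10.3" matches but "12.10.30" does not.
    pattern = rf"(?<![\d.]){re.escape(old)}(?![\d.])"
    return re.sub(pattern, new, text)


print(bump_version("FROM apache/airflow:2.10.3", "2.10.3", "2.10.4"))
# FROM apache/airflow:2.10.4
```

The lookaround guards matter for files like the README constraints URLs, where ``constraints-2.10.3/constraints-3.8.txt`` must change while the trailing Python version ``3.8`` stays untouched.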
diff --git a/generated/PYPI_README.md b/generated/PYPI_README.md index a97566f4b68ee..aa59ed15c576b 100644 --- a/generated/PYPI_README.md +++ b/generated/PYPI_README.md @@ -54,9 +54,9 @@ Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Apache Airflow is tested with: -| | Main version (dev) | Stable version (2.10.3) | +| | Main version (dev) | Stable version (2.10.4) | |-------------|------------------------------|------------------------------| -| Python | 3.8, 3.9, 3.10, 3.11, 3.12 | 3.8, 3.9, 3.10, 3.11, 3.12 | +| Python | 3.9, 3.10, 3.11, 3.12 | 3.8, 3.9, 3.10, 3.11, 3.12 | | Platform | AMD64/ARM64(\*) | AMD64/ARM64(\*) | | Kubernetes | 1.26, 1.27, 1.28, 1.29, 1.30 | 1.26, 1.27, 1.28, 1.29, 1.30 | | PostgreSQL | 12, 13, 14, 15, 16 | 12, 13, 14, 15, 16 | @@ -128,15 +128,15 @@ them to the appropriate format and workflow that your tool requires. ```bash -pip install 'apache-airflow==2.10.3' \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.3/constraints-3.8.txt" +pip install 'apache-airflow==2.10.4' \ + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.4/constraints-3.8.txt" ``` 2. 
Installing with extras (i.e., postgres, google) ```bash pip install 'apache-airflow[postgres,google]==2.8.3' \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.3/constraints-3.8.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.4/constraints-3.8.txt" ``` For information on installing provider packages, check diff --git a/scripts/ci/pre_commit/supported_versions.py b/scripts/ci/pre_commit/supported_versions.py index 93e5245d98012..8524f237dc993 100755 --- a/scripts/ci/pre_commit/supported_versions.py +++ b/scripts/ci/pre_commit/supported_versions.py @@ -27,7 +27,7 @@ HEADERS = ("Version", "Current Patch/Minor", "State", "First Release", "Limited Support", "EOL/Terminated") SUPPORTED_VERSIONS = ( - ("2", "2.10.3", "Supported", "Dec 17, 2020", "TBD", "TBD"), + ("2", "2.10.4", "Supported", "Dec 17, 2020", "TBD", "TBD"), ("1.10", "1.10.15", "EOL", "Aug 27, 2018", "Dec 17, 2020", "June 17, 2021"), ("1.9", "1.9.0", "EOL", "Jan 03, 2018", "Aug 27, 2018", "Aug 27, 2018"), ("1.8", "1.8.2", "EOL", "Mar 19, 2017", "Jan 03, 2018", "Jan 03, 2018"), From c083e456fa02c6cb32cdbe0c9ed3c3b2380beccd Mon Sep 17 00:00:00 2001 From: utkarsh sharma Date: Wed, 4 Dec 2024 15:34:11 +0530 Subject: [PATCH 44/44] Update RELEASE_NOTES.rst --- RELEASE_NOTES.rst | 55 ++++++++++++++++++++++++++++- airflow/reproducible_build.yaml | 4 +-- newsfragments/43191.improvement.rst | 1 - newsfragments/43611.significant.rst | 6 ---- newsfragments/44300.bugfix.rst | 1 - 5 files changed, 56 insertions(+), 11 deletions(-) delete mode 100644 newsfragments/43191.improvement.rst delete mode 100644 newsfragments/43611.significant.rst delete mode 100644 newsfragments/44300.bugfix.rst diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst index 0d27630b70800..146c661c84ffa 100644 --- a/RELEASE_NOTES.rst +++ b/RELEASE_NOTES.rst @@ -21,13 +21,65 @@ .. 
towncrier release notes start +Airflow 2.10.4 (2024-12-09) +--------------------------- + +Significant Changes +^^^^^^^^^^^^^^^^^^^ + +TaskInstance ``priority_weight`` is capped in 32-bit signed integer ranges (#43611) +""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" + +Some database engines are limited to 32-bit integer values. As some users reported errors with +weights rolling over to negative values, we decided to cap the value to the 32-bit range. Even +though Python internally supports smaller and larger 64-bit values, ``priority_weight`` is +capped and only stores values from -2147483648 to 2147483647. + +Bug Fixes +^^^^^^^^^ + +- Fix stats of dynamic mapped tasks after automatic retries of failed tasks (#44300) +- Fix wrong display of multi-line messages in the log after filtering (#44457) +- Allow "/" in metrics validator (#42934) (#44515) +- Fix gantt flickering (#44488) (#44517) +- Fix problem with inability to remove fields from Connection form (#40421) (#44442) +- Check pool_slots on partial task import instead of execution (#39724) (#42693) +- Avoid grouping task instance stats by try_number for dynamic mapped tasks (#44300) (#44319) +- Re-queue tasks when they are stuck in queued (#43520) (#44158) +- Suppress the warnings where we check for sensitive values (#44148) (#44167) +- Fix get_task_instance_try_details to return appropriate schema (#43830) (#44133) +- Log message source details are grouped (#43681) (#44070) +- Fix duplication of Task tries in the UI (#43891) (#43950) +- Add correct mime-type in OpenAPI spec (#43879) (#43901) +- Disable extra links button if link is null or empty (#43844) (#43851) +- Disable XCom list ordering by execution_date (#43680) (#43696) +- Fix venv numpy example which needs to be 1.26 at least to be working in Python 3.12 (#43659) +- Fix Try Selector in Mapped Tasks also on Index 0 (#43590) (#43591) +- Prevent using ``trigger_rule="always"`` in a dynamic mapped task (#43810) +- 
Prevent using ``trigger_rule=TriggerRule.ALWAYS`` in a task-generated mapping within bare tasks (#44751) + +Doc Only Changes +"""""""""""""""" +- Update XCom docs around containers/helm (#44570) (#44573) + +Miscellaneous +""""""""""""" +- Raise deprecation warning when accessing inlet or outlet events through str (#43922) + + Airflow 2.10.3 (2024-11-04) --------------------------- Significant Changes ^^^^^^^^^^^^^^^^^^^ -No significant changes. +Enhancing BashOperator to Execute Templated Bash Scripts as Temporary Files (#44641) +""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" + +Bash script files (``.sh`` and ``.bash``) with Jinja templating enabled (without the space after the file +extension) are now rendered into a temporary file and then executed, instead of being executed directly +as an inline command. + Bug Fixes """"""""" @@ -62,6 +114,7 @@ Bug Fixes - Ensure total_entries in /api/v1/dags (#43377) (#43429) - Include limit and offset in request body schema for List task instances (batch) endpoint (#43479) - Don't raise a warning in ExecutorSafeguard when execute is called from an extended operator (#42849) (#43577) +- Double-check TaskInstance state if it differs from the Executor state (#43063) Miscellaneous """"""""""""" diff --git a/airflow/reproducible_build.yaml b/airflow/reproducible_build.yaml index 921cef1c89adf..253e4e793cff1 100644 --- a/airflow/reproducible_build.yaml +++ b/airflow/reproducible_build.yaml @@ -1,2 +1,2 @@ -release-notes-hash: 6aa54b840e9fc2e48cf7046507e6930b -source-date-epoch: 1730460817 +release-notes-hash: 0867869dba7304e7ead28dd0800c5c4b +source-date-epoch: 1733822937 diff --git a/newsfragments/43191.improvement.rst b/newsfragments/43191.improvement.rst deleted file mode 100644 index eb6a2181bd06f..0000000000000 --- a/newsfragments/43191.improvement.rst +++ /dev/null @@ -1 +0,0 @@ -Bash script files (``.sh`` and ``.bash``) with Jinja templating enabled (without the space after the file
extension) are now rendered into a temporary file, and then executed. Instead of being directly executed as inline command. diff --git a/newsfragments/43611.significant.rst b/newsfragments/43611.significant.rst deleted file mode 100644 index e25fb2a5bba4b..0000000000000 --- a/newsfragments/43611.significant.rst +++ /dev/null @@ -1,6 +0,0 @@ -TaskInstance ``priority_weight`` is capped in 32-bit signed integer ranges. - -Some database engines are limited to 32-bit integer values. As some users reported errors in -weight rolled-over to negative values, we decided to cap the value to the 32-bit integer. Even -if internally in python smaller or larger values to 64 bit are supported, ``priority_weight`` is -capped and only storing values from -2147483648 to 2147483647. diff --git a/newsfragments/44300.bugfix.rst b/newsfragments/44300.bugfix.rst deleted file mode 100644 index ffd4b07b2ab0d..0000000000000 --- a/newsfragments/44300.bugfix.rst +++ /dev/null @@ -1 +0,0 @@ -Fix stats of dynamic mapped tasks after automatic retries of failed tasks
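The removed ``43191.improvement.rst`` fragment above describes templated ``.sh``/``.bash`` scripts now being written to a temporary file and executed from there, rather than passed inline to the shell. A rough standalone sketch of that execution pattern using plain ``tempfile`` and ``subprocess`` (not the actual ``BashOperator`` implementation, and assuming ``bash`` is on ``PATH``):

```python
import subprocess
import tempfile
from pathlib import Path


def run_rendered_script(rendered: str) -> str:
    # Write the already Jinja-rendered script body to a temporary .sh file
    # and execute that file, instead of running the body inline via `bash -c`.
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as handle:
        handle.write(rendered)
        script_path = handle.name
    try:
        result = subprocess.run(
            ["bash", script_path], capture_output=True, text=True, check=True
        )
        return result.stdout
    finally:
        Path(script_path).unlink()


print(run_rendered_script("echo rendered-ok"))
# prints: rendered-ok
```

Executing from a file avoids the shell-quoting pitfalls that arise when a multi-line rendered script is passed as a single inline command string.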