diff --git a/airflow-core/docs/best-practices.rst b/airflow-core/docs/best-practices.rst
index 268f3e7150f63..28c3285339ac2 100644
--- a/airflow-core/docs/best-practices.rst
+++ b/airflow-core/docs/best-practices.rst
@@ -296,8 +296,6 @@ When you execute that code you will see:
 
 This means that the ``get_array`` is not executed as top-level code, but ``get_task_id`` is.
 
-.. _best_practices/dynamic_dag_generation:
-
 Code Quality and Linting
 ------------------------
@@ -351,6 +349,7 @@ By integrating ``ruff`` into your development workflow, you can proactively addr
 For more information on ``ruff`` and its integration with Airflow, refer to the `official Airflow documentation `_.
 
+.. _best_practices/dynamic_dag_generation:
 
 Dynamic DAG Generation
 ----------------------
diff --git a/airflow-core/docs/howto/docker-compose/index.rst b/airflow-core/docs/howto/docker-compose/index.rst
index df46066a0bc0a..0d5e2a22bb62a 100644
--- a/airflow-core/docs/howto/docker-compose/index.rst
+++ b/airflow-core/docs/howto/docker-compose/index.rst
@@ -307,11 +307,13 @@ Examples of how you can extend the image with custom providers, python packages,
 apt packages and more can be found in :doc:`Building the image `.
 
 .. note::
-   Creating custom images means that you need to maintain also a level of automation as you need to re-create the images
-   when either the packages you want to install or Airflow is upgraded. Please do not forget about keeping these scripts.
-   Also keep in mind, that in cases when you run pure Python tasks, you can use the
-   `Python Virtualenv functions <_howto/operator:PythonVirtualenvOperator>`_ which will
-   dynamically source and install python dependencies during runtime. With Airflow 2.8.0 Virtualenvs can also be cached.
+   Creating custom images means that you also need to maintain a level of
+   automation, as you need to re-create the images when either the packages you
+   want to install or Airflow itself is upgraded. Do not forget to keep
+   these scripts up to date. Also keep in mind that when you run pure Python
+   tasks, you can use :ref:`Python Virtualenv functions `,
+   which will dynamically source and install python dependencies during runtime.
+   With Airflow 2.8.0, virtualenvs can also be cached.
 
 Special case - adding dependencies via requirements.txt file
 ============================================================
diff --git a/airflow-core/docs/howto/dynamic-dag-generation.rst b/airflow-core/docs/howto/dynamic-dag-generation.rst
index 814b620ea719b..734e89f5d805d 100644
--- a/airflow-core/docs/howto/dynamic-dag-generation.rst
+++ b/airflow-core/docs/howto/dynamic-dag-generation.rst
@@ -40,7 +40,8 @@ If you want to use variables to configure your code, you should always use
 `environment variables `_ in your top-level code
 rather than :doc:`Airflow Variables `. Using Airflow Variables in top-level code creates a connection
 to the metadata DB of Airflow to fetch the value, which can slow
-down parsing and place extra load on the DB. See the `best practices on Airflow Variables `_
+down parsing and place extra load on the DB. See
+:ref:`best practices on Airflow Variables `
 to make the best use of Airflow Variables in your dags using Jinja templates.
 
 For example you could set ``DEPLOYMENT`` variable differently for your production and development
diff --git a/airflow-core/docs/installation/upgrading_to_airflow3.rst b/airflow-core/docs/installation/upgrading_to_airflow3.rst
index f8f932f54765b..3f9b3e399d4e7 100644
--- a/airflow-core/docs/installation/upgrading_to_airflow3.rst
+++ b/airflow-core/docs/installation/upgrading_to_airflow3.rst
@@ -71,7 +71,7 @@ Some changes can be automatically fixed. To do so, run the following command:
 
     ruff check dag/ --select AIR301 --fix --preview
 
-You can also configure these flags through configuration files. See `Configuring Ruff `_ for details.
+You can also configure these flags through configuration files. See `Configuring Ruff `_ for details.
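The hunk above mentions moving the ``ruff`` flags into configuration files. As a minimal sketch of what that could look like in ``pyproject.toml`` (option names follow Ruff's configuration schema; double-check them against the Configuring Ruff page referenced in the hunk):

```toml
# Rough equivalent of: ruff check dag/ --select AIR301 --fix --preview
[tool.ruff]
fix = true        # apply safe autofixes, like --fix
preview = true    # enable preview-mode rules, like --preview

[tool.ruff.lint]
select = ["AIR301"]   # only the Airflow 3 migration rule
```

With settings like these in place, a plain ``ruff check dag/`` picks up the same behavior without repeating the flags on the command line.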
 Step 4: Install the Standard Providers
 --------------------------------------
diff --git a/airflow-core/docs/public-airflow-interface.rst b/airflow-core/docs/public-airflow-interface.rst
index 337413ce802a7..88eed605eefa4 100644
--- a/airflow-core/docs/public-airflow-interface.rst
+++ b/airflow-core/docs/public-airflow-interface.rst
@@ -46,9 +46,9 @@ MAJOR version of Airflow. On the other hand, classes and methods starting with `
 as protected Python methods) and ``__`` (also known as private Python methods) are not
 part of the Public Airflow Interface and might change at any time.
 
-You can also use Airflow's Public Interface via the `Stable REST API `_ (based on the
+You can also use Airflow's Public Interface via the :doc:`Stable REST API ` (based on the
 OpenAPI specification). For specific needs you can also use the
-`Airflow Command Line Interface (CLI) `_ though its behaviour might change
+:doc:`Airflow Command Line Interface (CLI) ` though its behaviour might change
 in details (such as output format and available flags) so if you want to rely on those
 in programmatic way, the Stable REST API is recommended.
@@ -407,11 +407,12 @@ Everything not mentioned in this document should be considered as non-Public Int
 Sometimes in other applications those components could be relied on to keep backwards
 compatibility, but in Airflow they are not parts of the Public Interface and might change any time:
 
-* `Database structure `_ is considered to be an internal implementation
+* :doc:`Database structure ` is considered to be an internal implementation
   detail and you should not assume the structure is going to be maintained in a
   backwards-compatible way.
-* `Web UI `_ is continuously evolving and there are no backwards compatibility guarantees on HTML elements.
+* :doc:`Web UI ` is continuously evolving and there are no backwards
+  compatibility guarantees on HTML elements.
 * Python classes except those explicitly mentioned in this document,
   are considered an internal implementation detail and you should not assume they will be maintained
diff --git a/chart/docs/index.rst b/chart/docs/index.rst
index 3a9d1ad4e4620..77ba613e66ed4 100644
--- a/chart/docs/index.rst
+++ b/chart/docs/index.rst
@@ -81,6 +81,8 @@ Features
 * Kerberos secure configuration
 * One-command deployment for any type of executor. You don't need to provide other services e.g. Redis/Database to test the Airflow.
 
+.. _helm_chart_install:
+
 Installing the Chart
 --------------------
diff --git a/chart/docs/installing-helm-chart-from-sources.rst b/chart/docs/installing-helm-chart-from-sources.rst
index 15af0a4487b3f..66b9a912a688f 100644
--- a/chart/docs/installing-helm-chart-from-sources.rst
+++ b/chart/docs/installing-helm-chart-from-sources.rst
@@ -16,7 +16,7 @@
     under the License.
 
 Installing Helm Chart from sources
-----------------------------------
+==================================
 
 Released packages
 '''''''''''''''''
@@ -24,9 +24,8 @@ Released packages
 .. jinja:: official_download_page
 
     This page describes downloading and verifying ``Apache Airflow Official Helm Chart`` version
-    ``{{ package_version}}`` using officially released source packages. You can also install the chart
-    directly from the ``airflow.apache.org`` repo as described in
-    `Installing the chart `_.
+    ``{{ package_version }}`` using officially released source packages. You can also install the chart
+    directly from the ``airflow.apache.org`` repo as described in :ref:`helm_chart_install`.
 
     You can choose different version of the chart by selecting different version from the drop-down at the
     top-left of the page.
diff --git a/providers/amazon/docs/operators/athena/index.rst b/providers/amazon/docs/operators/athena/index.rst
index 85130aba62e2c..8641cd715e07e 100644
--- a/providers/amazon/docs/operators/athena/index.rst
+++ b/providers/amazon/docs/operators/athena/index.rst
@@ -38,7 +38,8 @@ Airflow offers two ways to query data using Amazon Athena.
 
 **Amazon Athena SQL (DB API Connection):** Opt for this if you need to execute multiple queries in the same operator and it's essential to retrieve and process query results directly in Airflow, such as for sensing values or further data manipulation.
 
 .. note::
-    Both connection methods uses `Amazon Web Services Connection <../../connections/aws>`_ under the hood for authentication.
+    Both connection methods use the :doc:`Amazon Web Services Connection <../../connections/aws>`
+    under the hood for authentication.
 
 You should decide which connection method to use based on your use case.
 
 .. toctree::
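Looping back to the dynamic DAG generation hunk earlier in this patch, which recommends environment variables over Airflow Variables in top-level code: a minimal pure-Python sketch of that pattern. The ``DEPLOYMENT`` name comes from the docs themselves, while the helper function and the concrete settings are hypothetical, not Airflow API:

```python
import os

def deployment_settings() -> dict:
    """Hypothetical helper: derive per-environment dag settings from an
    environment variable. Reading os.environ at parse time is cheap and,
    unlike Variable.get() in top-level code, never touches Airflow's
    metadata DB."""
    deployment = os.environ.get("DEPLOYMENT", "DEV")  # e.g. "PROD" in production
    if deployment == "PROD":
        return {"schedule": "@daily", "retries": 3}
    return {"schedule": None, "retries": 0}

# Top-level code in a dag file could then use:
settings = deployment_settings()
```

Because this is an ordinary dictionary-style environment read, it stays fast even though it runs on every dag-file parse.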