diff --git a/airflow-core/docs/howto/docker-compose/index.rst b/airflow-core/docs/howto/docker-compose/index.rst
index df46066a0bc0a..6d160819d2a93 100644
--- a/airflow-core/docs/howto/docker-compose/index.rst
+++ b/airflow-core/docs/howto/docker-compose/index.rst
@@ -310,7 +310,7 @@ apt packages and more can be found in :doc:`Building the image `
-   `Python Virtualenv functions `_ which will
+   :ref:`Python Virtualenv functions <howto/operator:PythonVirtualenvOperator>` which will
    dynamically source and install python dependencies during runtime. With Airflow 2.8.0
    Virtualenvs can also be cached.

 Special case - adding dependencies via requirements.txt file
diff --git a/airflow-core/docs/howto/dynamic-dag-generation.rst b/airflow-core/docs/howto/dynamic-dag-generation.rst
index 814b620ea719b..57f5705002033 100644
--- a/airflow-core/docs/howto/dynamic-dag-generation.rst
+++ b/airflow-core/docs/howto/dynamic-dag-generation.rst
@@ -40,7 +40,7 @@ If you want to use variables to configure your code, you should always use
 `environment variables `_ in your top-level code rather than
 :doc:`Airflow Variables `. Using Airflow Variables in top-level code creates a connection
 to the metadata DB of Airflow to fetch the value, which can slow
-down parsing and place extra load on the DB. See the `best practices on Airflow Variables `_
+down parsing and place extra load on the DB. See the :ref:`best practices on Airflow Variables `
 to make the best use of Airflow Variables in your dags using Jinja templates.

 For example you could set ``DEPLOYMENT`` variable differently for your production and development
diff --git a/airflow-core/docs/installation/upgrading_to_airflow3.rst b/airflow-core/docs/installation/upgrading_to_airflow3.rst
index f8f932f54765b..36d118cb56a53 100644
--- a/airflow-core/docs/installation/upgrading_to_airflow3.rst
+++ b/airflow-core/docs/installation/upgrading_to_airflow3.rst
@@ -71,7 +71,7 @@ Some changes can be automatically fixed.
 To do so, run the following command::

     ruff check dag/ --select AIR301 --fix --preview

-You can also configure these flags through configuration files. See `Configuring Ruff `_ for details.
+You can also configure these flags through configuration files. See :doc:`Configuring Ruff ` for details.

 Step 4: Install the Standard Providers
 --------------------------------------
diff --git a/airflow-core/docs/public-airflow-interface.rst b/airflow-core/docs/public-airflow-interface.rst
index 337413ce802a7..5f924305c4f38 100644
--- a/airflow-core/docs/public-airflow-interface.rst
+++ b/airflow-core/docs/public-airflow-interface.rst
@@ -46,9 +46,9 @@ MAJOR version of Airflow. On the other hand, classes and methods starting with `
 as protected Python methods) and ``__`` (also known as private Python methods) are not part of the
 Public Airflow Interface and might change at any time.

-You can also use Airflow's Public Interface via the `Stable REST API `_ (based on the
+You can also use Airflow's Public Interface via the :doc:`Stable REST API ` (based on the
 OpenAPI specification). For specific needs you can also use the
-`Airflow Command Line Interface (CLI) `_ though its behaviour might change
+:doc:`Airflow Command Line Interface (CLI) ` though its behaviour might change
 in details (such as output format and available flags) so if you want to rely on those in
 programmatic way, the Stable REST API is recommended.

@@ -407,11 +407,11 @@ Everything not mentioned in this document should be considered as non-Public Int
 Sometimes in other applications those components could be relied on to keep backwards compatibility, but
 in Airflow they are not parts of the Public Interface and might change any time:

-* `Database structure `_ is considered to be an internal implementation
+* :doc:`Database structure ` is considered to be an internal implementation
   detail and you should not assume the structure is going to be maintained in a
   backwards-compatible way.
-* `Web UI `_ is continuously evolving and there are no backwards compatibility guarantees on HTML elements.
+* :doc:`Web UI ` is continuously evolving and there are no backwards compatibility guarantees on HTML elements.

 * Python classes except those explicitly mentioned in this document, are considered an
   internal implementation detail and you should not assume they will be maintained
diff --git a/chart/docs/installing-helm-chart-from-sources.rst b/chart/docs/installing-helm-chart-from-sources.rst
index 15af0a4487b3f..f80a00b624c79 100644
--- a/chart/docs/installing-helm-chart-from-sources.rst
+++ b/chart/docs/installing-helm-chart-from-sources.rst
@@ -26,7 +26,7 @@ Released packages
 This page describes downloading and verifying ``Apache Airflow Official Helm Chart`` version
 ``{{ package_version}}`` using officially released source packages. You can also install the chart
 directly from the ``airflow.apache.org`` repo as described in
-`Installing the chart `_.
+:doc:`Installing the chart `.

 You can choose different version of the chart by selecting different version from the drop-down
 at the top-left of the page.
diff --git a/providers/amazon/docs/operators/athena/index.rst b/providers/amazon/docs/operators/athena/index.rst
index 85130aba62e2c..13b8b2780bf2d 100644
--- a/providers/amazon/docs/operators/athena/index.rst
+++ b/providers/amazon/docs/operators/athena/index.rst
@@ -38,7 +38,7 @@ Airflow offers two ways to query data using Amazon Athena.
 **Amazon Athena SQL (DB API Connection):** Opt for this if you need to execute multiple queries in the same operator and it's essential to retrieve and process query results directly in Airflow, such as for sensing values or further data manipulation.

 .. note::
-    Both connection methods uses `Amazon Web Services Connection <../../connections/aws>`_ under the hood for authentication.
+    Both connection methods use :doc:`Amazon Web Services Connection <../../connections/aws>` under the hood for authentication.
 You should decide which connection method to use based on your use case.

 .. toctree::
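
Every hunk in this diff applies the same transformation: an external ``\`text <url>\`_`` hyperlink is replaced by a Sphinx cross-reference role, so the target is resolved and validated at documentation build time instead of silently rotting. A minimal sketch of the three link styles involved (the URL, document path, and label name below are hypothetical, not taken from the diff):

.. code-block:: rst

    .. An external hyperlink: the URL is opaque to Sphinx and never checked.
    See the `best practices <https://example.com/best-practices.html>`_ guide.

    .. A :doc: role: targets another document by path, without the .rst
    .. extension; a missing document produces a warning from sphinx-build.
    See the :doc:`best practices <../best-practices>` guide.

    .. A :ref: role: targets an explicit label. Note the label is *defined*
    .. with a leading underscore but *referenced* without it.
    .. _airflow-variables-best-practices:

    Using Airflow Variables
    -----------------------

    See the :ref:`best practices <airflow-variables-best-practices>` guide.

This underscore asymmetry is why the corrected ``+`` line in the first hunk references ``<howto/operator:PythonVirtualenvOperator>`` without the leading ``_`` that appears where the label itself is defined.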