Description
Apache Airflow version
2.1.3 (latest released)
Operating System
Debian GNU/Linux 10 (buster)
Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.1.0
apache-airflow-providers-celery==2.0.0
apache-airflow-providers-cncf-kubernetes==2.0.2
apache-airflow-providers-docker==2.1.0
apache-airflow-providers-elasticsearch==2.0.2
apache-airflow-providers-ftp==2.0.0
apache-airflow-providers-google==5.0.0
apache-airflow-providers-grpc==2.0.0
apache-airflow-providers-hashicorp==2.0.0
apache-airflow-providers-http==2.0.0
apache-airflow-providers-imap==2.0.0
apache-airflow-providers-microsoft-azure==3.1.0
apache-airflow-providers-mysql==2.1.0
apache-airflow-providers-postgres==2.0.0
apache-airflow-providers-redis==2.0.0
apache-airflow-providers-sendgrid==2.0.0
apache-airflow-providers-sftp==2.1.0
apache-airflow-providers-slack==4.0.0
apache-airflow-providers-sqlite==2.0.0
apache-airflow-providers-ssh==2.1.0
Deployment
Other Docker-based deployment
Deployment details
Base docker image- apache/airflow:2.1.3-python3.7
What happened
Last Runs for the DAGs on the listing page are not loading because the /last_dagruns endpoint is returning a 500 error.
What you expected to happen
The Last Run column for the DAGs should show the latest execution date of each DAG.
How to reproduce
- Open the DAG listing page
- Look at the Last Run column, which continuously shows the loader for each DAG
- If you open the Network tab, you will observe a 500 error for the /last_dagruns endpoint
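A quick way to confirm the failure without opening the browser dev tools is to hit the endpoint directly. This is a minimal sketch assuming the webserver runs on the default `localhost:8080` and that a session cookie from a logged-in browser session is pasted in (the cookie name and value here are placeholders, not real values):

```shell
# Probe /last_dagruns and print only the HTTP status code.
# Replace the cookie value with one copied from your browser session;
# "session=PASTE_SESSION_COOKIE_HERE" is a placeholder.
curl -s -o /dev/null -w "%{http_code}\n" \
  -H "Cookie: session=PASTE_SESSION_COOKIE_HERE" \
  "http://localhost:8080/last_dagruns"
```

On an affected deployment this prints `500`; the corresponding stack trace should appear in the webserver logs.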
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct