Conversation

@potiuk
Member

@potiuk potiuk commented May 22, 2025

The celery tests hang intermittently, and it's rather difficult to pinpoint the root cause. This PR attempts to isolate the tests and make them fail faster when the problem occurs.

Currently, after some recent refactoring, none of the tests usually runs longer than 18-19 minutes, so we can set much lower timeouts for the test job: a 30-minute "soft" timeout (SIGTERM sent to stop the container and dump logs) and a 35-minute "hard" timeout after which the GitHub Actions job fails.

If the tests still hang despite the isolation, we can later introduce more debug logging for just the celery container run.
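For illustration only, here is a minimal sketch of how such a soft/hard timeout split might look in a GitHub Actions workflow. The workflow, job name, and test script below are hypothetical and not the actual Airflow CI configuration:

```yaml
name: Celery tests (sketch)       # hypothetical workflow, for illustration only
on: [workflow_dispatch]

jobs:
  celery-tests:
    runs-on: ubuntu-22.04
    timeout-minutes: 35           # "hard" limit: GitHub Actions fails the job outright
    steps:
      - name: Run isolated celery tests
        # "soft" limit: SIGTERM is sent after 30 minutes so the container can
        # stop gracefully and dump its logs before the hard limit is reached
        run: timeout --signal=TERM 30m ./run-celery-tests.sh   # hypothetical script
```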



@potiuk potiuk added this to the Airflow 3.0.2 milestone May 22, 2025
@potiuk potiuk added the backport-to-v3-1-test Mark PR with this label to backport to v3-1-test branch label May 22, 2025
Contributor

@amoghrajesh amoghrajesh left a comment


Yeah, let's isolate it and try to fix it

Member

@gopidesupavan gopidesupavan left a comment


cool, hope to see celery happy now :)

@potiuk
Member Author

potiuk commented May 22, 2025

cool, hope to see celery happy now :)

I am really hopeful here. I've looked through about 40 failed scheduled runs and I never saw the celery issue when running "lowest deps" tests - it seems to fail only when we run the regular tests together with other providers, so it really looks like a side effect of some other tests. Hopefully, when we run the celery tests in isolation, the problem will be gone (but it might be that the side effect will affect other tests instead ... who knows).

@potiuk potiuk merged commit fb8c877 into apache:main May 22, 2025
98 checks passed
@potiuk potiuk deleted the isolate-celery-tests-to-separate-container branch May 22, 2025 12:34
@potiuk
Member Author

potiuk commented May 22, 2025

Ok. So far so good - the first pass was fully green, which is a good sign.

@github-actions

Backport failed to create: v3-0-test. View the failure log (Run details).

Status | Branch | Result
Failed | v3-0-test | Commit Link

You can attempt to backport this manually by running:

cherry_picker fb8c877 v3-0-test

This should apply the commit to the v3-0-test branch and leave the commit in a conflicted state,
marking the files that need manual conflict resolution.

After you have resolved the conflicts, you can continue the backport process by running:

cherry_picker --continue

potiuk added a commit to potiuk/airflow that referenced this pull request May 22, 2025
potiuk added a commit that referenced this pull request May 22, 2025
dadonnelly316 pushed a commit to dadonnelly316/airflow that referenced this pull request May 26, 2025
kaxil pushed a commit that referenced this pull request Jun 3, 2025
sanederchik pushed a commit to sanederchik/airflow that referenced this pull request Jun 7, 2025