
Fix issue #39191 respect accelerate config to disable torch.dynamo compilation#39683

Open
bonpiedlaroute wants to merge 5 commits into huggingface:main from bonpiedlaroute:fix/respect-accelerate-dynamo-config

Conversation


@bonpiedlaroute bonpiedlaroute commented Jul 25, 2025

Description

Fixes #39191 where transformers ignored accelerate configuration to disable torch.dynamo, leading to unexpected compilation and FailOnRecompileLimitHit errors.

Problem

When users configure accelerate to disable torch.dynamo, transformers' automatic compilation in _valid_auto_compile_criteria() was not respecting this setting, causing:

  • Unwanted torch.compile activation
  • Excessive recompilations
  • FailOnRecompileLimitHit crashes in distributed training scenarios

Solution

  • Added _is_dynamo_compilation_disabled() method that checks standard environment variables
  • Modified _valid_auto_compile_criteria() to respect these environment variables before enabling compilation

Environment Variables Supported

  • TORCHDYNAMO_DISABLE=1

Tests done

  • Compilation disabled when env var set
  • Normal behavior preserved when no env var
  • Backward compatible - no breaking changes

Usage

Users experiencing the issue can now use:

```shell
export TORCHDYNAMO_DISABLE=1
python training_script.py
```
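
Equivalently, the variable can be set from within the script itself. A small sketch: setting it before the first `torch` import is the conservative choice, since the flag may be read once during import/compile setup.

```python
import os

# Inline alternative to the shell export: disable dynamo before
# torch (and transformers) are imported, since the flag may be
# read once at import time.
os.environ["TORCHDYNAMO_DISABLE"] = "1"

# import torch  # subsequent torch.compile paths are now disabled
```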

Who can review?

@SunMarc @gante @zach-huggingface and @qgallouedec

Contributor

@gante gante left a comment


Thank you for opening the PR!

LGTM, added a question/small request to clear out before merging :)

Comment thread: src/transformers/generation/utils.py (Outdated)

```python
disable_env_vars = [
    "ACCELERATE_DISABLE_TORCH_DYNAMO",  # Accelerate-specific
    "TORCH_DYNAMO_DISABLE",  # PyTorch standard
    "DISABLE_TORCH_DYNAMO",  # Common alternative
]
```
Contributor


This is not a standard env var, correct? If not, then let's remove it 🤗 (let's stick to the known flags and avoid adding more noise, which may have unintended consequences)

Author


You're absolutely right! After double-checking the PyTorch source code, the official flag is TORCHDYNAMO_DISABLE.
PR updated.
Thanks,
Noël

@bonpiedlaroute force-pushed the fix/respect-accelerate-dynamo-config branch from a6943a8 to 4872958 on August 6, 2025 10:13
@bonpiedlaroute
Author

Please note that the failing test tests/models/perceiver/test_modeling_perceiver.py::PerceiverModelTest::test_pipeline_image_classification_fp16 (PIL.UnidentifiedImageError: cannot identify image file) is unrelated to this change. It appears to be an image file corruption or download issue, with no link to the torch.dynamo compilation fix.
All 148 other pipeline tests passed successfully.
Thanks,
Noël

@SunMarc
Member

SunMarc commented Aug 6, 2025

> Please note that the failing test tests/models/perceiver/test_modeling_perceiver.py::PerceiverModelTest::test_pipeline_image_classification_fp16 - PIL.UnidentifiedImageError: cannot identify image file is unrelated to this change. It seems to be an image file corruption or download issue, no link with torch.dynamo compilation fix.

Yeah, don't worry! I'll rerun the CI to merge this.

Member

@SunMarc SunMarc left a comment


Thanks!

@SunMarc enabled auto-merge (squash) on August 6, 2025 13:05
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

auto-merge was automatically disabled January 14, 2026 15:56

Head branch was pushed to by a user without write access

@SunMarc
Member

SunMarc commented Jan 14, 2026

@bot /style

@github-actions
Contributor

Style fix is beginning... View the workflow run here.

@github-actions
Contributor

View the CircleCI Test Summary for this PR:

https://huggingface.co/spaces/transformers-community/circle-ci-viz?pr=39683&sha=824bdc



Development

Successfully merging this pull request may close these issues.

🐛 Bug Report: Accelerate config to disable torch dynamo is ignored by transformers automatic compilation

4 participants