
Update Torch version check for flex attention #45445

Closed
ZSLsherly wants to merge 1 commit into huggingface:main from ZSLsherly:ZSLsherly-patch-1

Conversation


@ZSLsherly ZSLsherly commented Apr 15, 2026

This commit corrects the PyTorch version check for importing AuxRequest from torch.nn.attention.flex_attention (line 51). The AuxRequest class was actually introduced in PyTorch 2.9.1, not 2.9.0. The current code attempts to import it for any version >= 2.9.0, which causes an ImportError in PyTorch 2.9.0 environments.
The version in which AuxRequest was introduced can be seen at https://docs.pytorch.org/docs/2.9/nn.attention.flex_attention.html.
Fixes #45446

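For reference, the gating pattern under discussion looks roughly like this (a sketch based on the diff below; the import path for is_torch_greater_or_equal is an assumption, and the surrounding transformers code may differ):

# Sketch of the version-gated import this PR proposes to change.
# Assumption: is_torch_greater_or_equal is importable from transformers.utils.
from transformers.utils import is_torch_greater_or_equal

# The PR bumps this threshold from "2.9.0" to "2.9.1".
_TORCH_FLEX_USE_AUX = is_torch_greater_or_equal("2.9.1")

if _TORCH_FLEX_USE_AUX:
    # Only attempt the import when the installed torch is believed to provide AuxRequest.
    from torch.nn.attention.flex_attention import AuxRequest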
Copilot AI review requested due to automatic review settings April 15, 2026 03:09
Contributor

Copilot AI left a comment


Pull request overview

Adjusts the flex-attention integration’s PyTorch feature-gating to avoid importing AuxRequest on unsupported PyTorch versions, preventing ImportError in PyTorch 2.9.0 environments.

Changes:

  • Update _TORCH_FLEX_USE_AUX threshold from torch>=2.9.0 to torch>=2.9.1 for AuxRequest import gating.



- _TORCH_FLEX_USE_AUX = is_torch_greater_or_equal("2.9.0")
+ _TORCH_FLEX_USE_AUX = is_torch_greater_or_equal("2.9.1")

Copilot AI Apr 15, 2026


After updating the AuxRequest availability check to 2.9.1, the docstring in get_flex_attention_lse_kwargs still says the behavior changes in torch 2.9 and refers to the wrong argument names (mentions aux_request / “python version”, but the code uses return_aux and the decision is based on the torch version). Please update that docstring to match the new 2.9.1 threshold and the actual kwargs.
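For concreteness, a docstring matching that request might read as follows (a sketch only, not the actual transformers source; the return_lse fallback for older torch versions is an assumption):

def get_flex_attention_lse_kwargs() -> dict:
    """Build the kwargs that request the log-sum-exp (LSE) from flex_attention.

    On torch >= 2.9.1 (the threshold proposed in this PR), the LSE is requested
    via ``return_aux=AuxRequest(lse=True)``; on earlier torch versions the
    legacy ``return_lse=True`` kwarg is assumed instead. The branch is chosen
    from the installed torch version, not the Python version.
    """
    if _TORCH_FLEX_USE_AUX:
        from torch.nn.attention.flex_attention import AuxRequest
        return {"return_aux": AuxRequest(lse=True)}
    return {"return_lse": True}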

Member

@Rocketknight1 Rocketknight1 left a comment


Yep, sounds good to me!

@Rocketknight1 Rocketknight1 enabled auto-merge April 15, 2026 10:52
Contributor

vasqu commented Apr 15, 2026

No, not correct

@vasqu vasqu disabled auto-merge April 15, 2026 10:52
@Rocketknight1
Member

Wait, is the issue purely hallucinated?

Contributor

vasqu commented Apr 15, 2026

I've checked locally:

$ uv pip freeze | grep torch
torch==2.9.0
...
$ python -c "from torch.nn.attention.flex_attention import AuxRequest"

runs fine without errors; the docs just start at 2.9.1 (I guess that's the latest patch release for that version)
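Incidentally, a runtime feature probe avoids docs-vs-reality mismatches like this one (a sketch; transformers may prefer explicit version gates for other reasons):

# Probe the actual import instead of trusting a documented version number;
# the gate then reflects what the installed torch build really exposes.
try:
    from torch.nn.attention.flex_attention import AuxRequest
    _TORCH_FLEX_USE_AUX = True
except ImportError:
    AuxRequest = None
    _TORCH_FLEX_USE_AUX = False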

Contributor

vasqu commented Apr 15, 2026

I suspect it was based purely on the docs and was never actually run by the contributor.

@Rocketknight1
Member

My fault, I was also trusting the docs link!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@ZSLsherly
Author

Sorry, this one's on me. Retested with a clean environment and you're right: torch 2.9.0 works fine. Looks like my old env was just borked. Thanks for the spot!

@ZSLsherly ZSLsherly closed this Apr 15, 2026
Contributor

vasqu commented Apr 15, 2026

No worries, glad it got resolved



Development

Successfully merging this pull request may close these issues.

Incorrect PyTorch version check for AuxRequest import in flex_attention

5 participants