System Info
In src/transformers/integrations/flex_attention.py, the code gates the import of AuxRequest from torch.nn.attention.flex_attention on PyTorch >= 2.9.0. However, AuxRequest was only introduced in PyTorch 2.9.1, so the import fails on 2.9.0.
According to the official PyTorch documentation, AuxRequest is available starting from version 2.9.1: https://docs.pytorch.org/docs/2.9/nn.attention.flex_attention.html.
Who can help?
No response
Information
Tasks
Reproduction
import torch
# This import fails on PyTorch 2.9.0
from torch.nn.attention.flex_attention import AuxRequest
Expected behavior
The version check should be updated to >= 2.9.1 so that the import is not attempted on PyTorch 2.9.0.