
Restore is_torch_fx_available for trust_remote_code backwards compatibility#44615

Open
MaybeSam05 wants to merge 1 commit into huggingface:main from MaybeSam05:fix-restore-is-torch-fx-available

Conversation

@MaybeSam05

What does this PR do?

Restores is_torch_fx_available in transformers.utils.import_utils as a backwards-compatibility shim so that Hub models loaded with trust_remote_code=True that still import this symbol no longer raise ImportError on Transformers v5+.

Context: In v5, is_torch_fx_available was removed from import_utils. Many community models (e.g. deepseek-ai/deepseek-moe-16b-base) ship their own modeling_*.py and still do from transformers.utils.import_utils import is_torch_fx_available, which breaks at import time when using v5.

Changes:

  • Re-add is_torch_fx_available() in src/transformers/utils/import_utils.py.
  • The function emits a one-time DeprecationWarning (via @lru_cache), returns False if PyTorch is not available, and otherwise checks that torch.fx can be imported and returns True/False.
  • Under the v5 requirement (PyTorch >= 2.4), torch.fx is always available, so behavior for existing callers is unchanged.
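The shim described in the bullets above could look roughly like the following. This is a minimal sketch based on the PR description, not the actual diff; the exact warning message and placement in `import_utils.py` may differ.

```python
import warnings
from functools import lru_cache


@lru_cache
def is_torch_fx_available():
    """Backwards-compatibility shim for remote code that still imports this symbol.

    The @lru_cache decorator memoizes the result, so the DeprecationWarning
    below is emitted at most once per process.
    """
    warnings.warn(
        "`is_torch_fx_available` is deprecated and will be removed in a future "
        "release; `torch.fx` is always available on the PyTorch versions "
        "supported by Transformers v5.",
        DeprecationWarning,
    )
    try:
        # Fails with ImportError both when PyTorch is missing entirely and
        # when torch.fx cannot be imported, covering both cases at once.
        import torch.fx  # noqa: F401
    except ImportError:
        return False
    return True
```

Since v5 requires PyTorch >= 2.4, where `torch.fx` always imports, existing callers keep seeing `True` whenever PyTorch is installed.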

Fixes #44561

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@ArthurZucker @Rocketknight1 — this restores a symbol removed in v5 that breaks trust_remote_code models (see #44561). Optional: @Cyrilvallez for model-loading/import-utils context.

@Cyrilvallez
Member

cc @ArthurZucker wdyt? We could consider reintroducing it, but IMO it's a bit pointless, as again we cannot guarantee BC for remote models, and it's bound to break in another way soon (if it isn't already broken in some other fashion).



Development

Successfully merging this pull request may close these issues.

Removal of is_torch_fx_available in v5.0 breaks trust_remote_code models

2 participants