🚨🚨🚨🚨🚨🚨🚨🚨🚨 default to "auto" dtype#34919

Merged

ArthurZucker merged 9 commits into main from default-auto on Dec 11, 2025
Conversation

@ArthurZucker (Collaborator)

What does this PR do?

Fixes #34743 by defaulting to the `"auto"` dtype (i.e. the dtype saved in the model's config) instead of always loading weights in `torch.float32`.

This is a hugely breaking change.
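As a rough sketch of the memory impact behind #34743 (a hypothetical illustration, not the actual `from_pretrained` internals): a checkpoint saved in bfloat16 doubles in size once its weights are upcast to float32, which is what the old `torch.float32` default did on load.

```python
import torch

# Hypothetical illustration of the issue this PR fixes (not transformers
# code itself): weights stored in bfloat16 double in memory when they are
# silently upcast to float32 at load time.
weights_bf16 = torch.zeros(1024, 1024, dtype=torch.bfloat16)
weights_fp32 = weights_bf16.to(torch.float32)  # the old default behavior

bytes_bf16 = weights_bf16.element_size() * weights_bf16.nelement()
bytes_fp32 = weights_fp32.element_size() * weights_fp32.nelement()

print(bytes_bf16)  # 2 MiB: 2 bytes per bfloat16 element
print(bytes_fp32)  # 4 MiB: 4 bytes per float32 element
```

Per the PR description, loading a model without an explicit dtype now behaves like passing `torch_dtype="auto"` (the dtype is read from the config); explicitly passing `torch_dtype=torch.float32` should restore the previous behavior.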

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@github-actions (Contributor)

View the CircleCI Test Summary for this PR:

https://huggingface.co/spaces/transformers-community/circle-ci-viz?pr=34919&sha=b40250

@ArthurZucker ArthurZucker merged commit f5aa90d into main Dec 11, 2025
23 of 26 checks passed
@ArthurZucker ArthurZucker deleted the default-auto branch December 11, 2025 10:25
LysandreJik added a commit that referenced this pull request Dec 11, 2025
SangbumChoi pushed a commit to SangbumChoi/transformers that referenced this pull request Jan 23, 2026
* default to `"auto"` dtype

* the actual change?

* up?

* style

* up?

* only sam models were broken with this


Development

Successfully merging this pull request may close these issues.

High cpu memory usage as bf16 model is auto loaded as fp32

2 participants