[Quantization] Misc tests fixes #42940

Merged

MekkCyber merged 2 commits into main from fix-remainings-tests on Dec 18, 2025
Conversation

@MekkCyber (Contributor)

What does this PR do?

Fixes various quantization tests.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

matching_group_name = next(name for name, val in match_object.groupdict().items() if val is not None)
source_pattern_that_matched = self.source_patterns[int(matching_group_name[1:])]
# If we matched, we always replace with the first target pattern, in case we have several (one to many transform)
replacement = self.target_patterns[0]
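The excerpt above dispatches on which of several alternated source patterns actually matched. A minimal, self-contained sketch of that mechanism (the pattern values here are assumed for illustration, not taken from the diff): the source patterns are joined into one alternation of named groups `g0`, `g1`, ..., and `groupdict()` reveals the index of the pattern that fired.

```python
import re

# Sketch of the dispatch shown above (pattern values are illustrative):
# join all source patterns into one alternation of named groups, then use
# groupdict() to recover which source pattern actually matched.
source_patterns = [r"\.weight_g$", r"\.weight_v$"]
combined = re.compile(
    "|".join(f"(?P<g{i}>{p})" for i, p in enumerate(source_patterns))
)

match_object = combined.search("conv.weight_v")
matching_group_name = next(
    name for name, val in match_object.groupdict().items() if val is not None
)
source_pattern_that_matched = source_patterns[int(matching_group_name[1:])]
print(source_pattern_that_matched)  # \.weight_v$
```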
@MekkCyber (Contributor, Author) Dec 18, 2025

In the reverse ops, I think we should remove the `$` and `^` characters if they are used as regex anchors in `source_patterns` (which become target patterns during saving) @Cyrilvallez
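The concern here can be sketched as follows (a hypothetical helper, not the actual implementation): when a conversion is reversed for saving, the former `source_patterns` become the replacement targets, so regex anchors like `^` and `$` must be stripped first or they end up as literal characters in the saved key names.

```python
# Hypothetical sketch of the suggestion: strip regex anchors from a source
# pattern before reusing it as a plain replacement string during saving.
def strip_anchors(source_pattern: str) -> str:
    # Drop a leading ^ and a trailing $ so neither survives as a
    # literal character in the rewritten key name.
    return source_pattern.removeprefix("^").removesuffix("$")

print(strip_anchors(".weight_g$"))    # .weight_g
print(strip_anchors("^model.embed"))  # model.embed
```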

Member
I see! The `^` is already being removed! The correct place to do it is https://github.com/huggingface/transformers/blob/main/src/transformers/core_model_loading.py#L302-L316 (those lines 🤗), so we keep all those transforms in the same location.

@MekkCyber (Contributor, Author)

oh indeed thanks! updated that 👍

Comment on lines +145 to +146
source_patterns=".weight_g$",
target_patterns=".parametrizations.weight.original0",
@MekkCyber (Contributor, Author)

We need this for fp-quant, since it has a param called `weight_global_scale` that would get wrongly replaced if we didn't anchor the regex with `$`.
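This can be verified with a small sketch (key names here are illustrative, following the thread): without a trailing `$` anchor, the weight-norm pattern `.weight_g` also matches inside a `weight_global_scale` parameter name and would rename it by mistake.

```python
import re

# Without the $ anchor, ".weight_g" matches a prefix of
# ".weight_global_scale"; with the anchor, only exact suffixes match.
unanchored = re.compile(r"\.weight_g")
anchored = re.compile(r"\.weight_g$")

key = "proj.weight_global_scale"
print(bool(unanchored.search(key)))  # True  -> would be rewritten by mistake
print(bool(anchored.search(key)))    # False -> left untouched

# The intended weight-norm key still matches with the anchor in place:
print(bool(anchored.search("conv.weight_g")))  # True
```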

@github-actions

[For maintainers] Suggested jobs to run (before merge)

run-slow: gptq, mxfp4

@MekkCyber MekkCyber merged commit d7dd443 into main Dec 18, 2025
26 checks passed
@MekkCyber MekkCyber deleted the fix-remainings-tests branch December 18, 2025 10:57
SangbumChoi pushed a commit to SangbumChoi/transformers that referenced this pull request Jan 23, 2026


4 participants