
qa: re-run modular converter when the script itself is modified #45528

Merged
tarekziade merged 1 commit into main from tarekziade-consistency
Apr 20, 2026

Conversation

@tarekziade (Collaborator) commented Apr 20, 2026

What does this PR do?

  • When changing the checker itself, we want to ignore the `guaranteed_no_diff` shortcut, since changing the checker might impact models that are not part of the PR.
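The shortcut override described above can be sketched as follows. `CONVERTER_FILES` and `converter_changed_in_diff` appear later in this PR's diff, while `models_to_check` and all the signatures here are hypothetical, for illustration only:

```python
# Files that can alter the generated output for every modular model.
CONVERTER_FILES = {
    "utils/modular_model_converter.py",
    "utils/create_dependency_mapping.py",
}


def converter_changed_in_diff(modified_files):
    """True if the diff touches the converter machinery itself."""
    return any(path in CONVERTER_FILES for path in modified_files)


def models_to_check(all_models, changed_models, modified_files):
    """Pick which models the consistency check must re-convert.

    Normally only models changed in the PR are re-checked (the
    `guaranteed_no_diff` shortcut); if the checker itself changed,
    every model must be re-checked, because the converter's output
    may differ even for untouched models.
    """
    if converter_changed_in_diff(modified_files):
        return list(all_models)  # shortcut disabled: full re-check
    return list(changed_models)
```

For example, a diff touching only `utils/modular_model_converter.py` would make `models_to_check` return every model, not just those changed in the PR.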

@tarekziade tarekziade requested a review from zucchini-nlp April 20, 2026 10:08
@zucchini-nlp (Member) left a comment


Approving to unblock CI, but I would love for you to verify that it is indeed a different issue before merging. The PR I merged just this morning affects exactly these classes in modular.

Comment on lines -61 to -64
high_res_size: dict
high_res_resample: Union["PILImageResampling", int]
high_res_image_mean: float | list[float] | tuple[float, ...]
high_res_image_std: float | list[float] | tuple[float, ...]
@zucchini-nlp (Member)

Actually, this very much looks like it was caused by #45492. Could you verify that it is really unrelated?

@tarekziade (Collaborator, Author)

What happened is that the PR changed the modular conversion process, and that impacted other models as well. They were not checked in that PR because the check is skipped for models not changed in the PR.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@tarekziade tarekziade requested a review from Cyrilvallez April 20, 2026 12:03
@github-actions (Contributor)

[For maintainers] Suggested jobs to run (before merge)

run-slow: conditional_detr, deepseek_vl, deepseek_vl_hybrid, deformable_detr, ernie4_5_vl_moe, grounding_dino, mask2former, paddleocr_vl, rt_detr, segformer, smolvlm, video_llama_3

@tarekziade force-pushed the tarekziade-consistency branch from 53c0123 to 4efaa53 on April 20, 2026 12:15
@tarekziade changed the title from "chore: fix consistency" to "qa: re-run fill modular converter when the script itself is modified" on Apr 20, 2026
Comment on lines +109 to +115
# Changes to any of these files can alter the generated output for every modular model,
# so touching them must force a full re-check (see `converter_changed_in_diff`).
CONVERTER_FILES = {
"utils/modular_model_converter.py",
"utils/create_dependency_mapping.py",
}
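One way `converter_changed_in_diff` could gather its input is by listing the files changed relative to the base branch. The `git diff --name-only` call below is an assumption about the mechanism, not the PR's actual implementation; `CONVERTER_FILES` is repeated from the excerpt above so the sketch is self-contained:

```python
import subprocess

CONVERTER_FILES = {
    "utils/modular_model_converter.py",
    "utils/create_dependency_mapping.py",
}


def modified_files_in_diff(base_ref="main"):
    """List files changed relative to `base_ref` (hypothetical helper)."""
    out = subprocess.check_output(
        ["git", "diff", "--name-only", base_ref], text=True
    )
    return [line.strip() for line in out.splitlines() if line.strip()]


def converter_changed_in_diff(base_ref="main"):
    """True when the converter machinery itself appears in the diff."""
    return any(path in CONVERTER_FILES for path in modified_files_in_diff(base_ref))
```

Any PR whose diff includes one of the two converter files would then disable the guaranteed-no-diff shortcut and trigger a full re-check.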

@zucchini-nlp (Member)

great!

class ConverterChangedInDiffTest(unittest.TestCase):
"""Regression guard for PR #45492: changes to the converter alone must force a full check."""

def _patch_modified(self, files):
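The excerpt cuts off at `_patch_modified`. A self-contained sketch of what such a regression test might look like, assuming a diff helper named `get_modified_files` (the real helper names and module paths in the PR may differ):

```python
import unittest
from unittest import mock

CONVERTER_FILES = {
    "utils/modular_model_converter.py",
    "utils/create_dependency_mapping.py",
}


def get_modified_files():
    """Hypothetical stand-in for the checker's real diff helper."""
    return []


def converter_changed_in_diff():
    return any(path in CONVERTER_FILES for path in get_modified_files())


class ConverterChangedInDiffTest(unittest.TestCase):
    """Regression guard for PR #45492: changes to the converter alone
    must force a full check."""

    def _patch_modified(self, files):
        # Simulate a diff containing `files` without touching the repository.
        return mock.patch(f"{__name__}.get_modified_files", return_value=files)

    def test_converter_change_forces_full_check(self):
        with self._patch_modified(["utils/modular_model_converter.py"]):
            self.assertTrue(converter_changed_in_diff())

    def test_model_only_change_keeps_shortcut(self):
        with self._patch_modified(["src/transformers/models/foo/modular_foo.py"]):
            self.assertFalse(converter_changed_in_diff())
```

Run with `python -m unittest` from the file's directory; the point of the test is exactly the scenario discussed below: it simulates a diff on the converter script, it does not delete anything.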
@zucchini-nlp (Member)

Hmm, not sure we need unit tests though; I don't think anyone will delete modular_model_converter.py from the checker.

@tarekziade (Collaborator, Author)

This is not deleting the file; it just simulates a change to modular_model_converter.py in the diff to make sure we detect it, so that we run the modular converter on all models, as on main, instead of partially skipping. That is what happened in #45492.

@zucchini-nlp (Member)

Yeah, I got it. I meant: delete the test you added, and mark the "diff on modular_model_converter.py" handling with a comment explaining why it is there.

@tarekziade force-pushed the tarekziade-consistency branch from 4efaa53 to cb3ed62 on April 20, 2026 13:39
@tarekziade changed the title from "qa: re-run fill modular converter when the script itself is modified" to "qa: re-run modular converter when the script itself is modified" on Apr 20, 2026
@tarekziade added this pull request to the merge queue Apr 20, 2026
Merged via the queue into main with commit ef97a75 on Apr 20, 2026
19 checks passed
@tarekziade deleted the tarekziade-consistency branch April 20, 2026 16:29
lvliang-intel pushed a commit to lvliang-intel/transformers that referenced this pull request on Apr 21, 2026:

qa: re-run modular converter when the script itself is modified (huggingface#45528)

when changing the checker itself, we want to ignore the `guaranteed_no_diff` shortcut since changing the checker might impact models not in the PR

artem-spector pushed a commit with the same message to artem-spector/transformers on Apr 21, 2026.


3 participants