
Fix failing tests due to no attribute pad_token_id #43453

Merged
zucchini-nlp merged 17 commits into huggingface:main from Sai-Suraj-27:fix_missing_attr
Jan 28, 2026

Conversation

Contributor

@Sai-Suraj-27 Sai-Suraj-27 commented Jan 23, 2026

What does this PR do?

Fixes these failing tests: PhiIntegrationTest, Glm46VIntegrationTest, Qwen3VLMoeIntegrationTest, and Qwen2_5OmniModelIntegrationTest.

(Screenshot of the failing tests)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@zucchini-nlp (Continuation to #41541 #43398) @Rocketknight1

Member

@zucchini-nlp zucchini-nlp left a comment


Thanks a lot for fixing these, let's fix all pad_token_id's in this PR!

Overall lgtm, we just need to update modular files and run make fix-repo

Dictionary containing the configuration parameters for the RoPE embeddings. The dictionary should contain
a value for `rope_theta` and optionally parameters used for scaling in case you want to use RoPE
with longer `max_position_embeddings`.
pad_token_id (`int | None`, *optional*): <fill_docstring>
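
For reference, transformers config docstrings usually describe this argument along the following lines; the exact wording below is illustrative, not taken from the merged diff:

```
pad_token_id (`int | None`, *optional*):
    The id of the padding token.
```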
Member


Could you fix the docstring and add pad_token_id to the modular file?

Contributor Author


Sorry for the delay. Fixed the docstring. Can you take a look now? The failing quality check in CI is asking me to add a duplicate line, self.pad_token_id = pad_token_id, so I am not sure if that's correct.

(Screenshot from 2026-01-28 17-18-03: CI quality-check output)

self.seconds_per_chunk = seconds_per_chunk # zf
self.audio_start_token_id = audio_start_token_id # zf
self.audio_end_token_id = audio_end_token_id # zf
self.pad_token_id = pad_token_id
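
The snippet above can be sketched as a minimal, self-contained toy (class and attribute names here are stand-ins, not the real transformers code): the config subclass stores pad_token_id itself and also forwards it to the base config, which is the pattern the quality check enforces — and, per the review, the change must originate in the modular file, since the generated files are derived from it.

```python
class BaseConfig:
    """Stand-in for a transformers-style base config that owns pad_token_id."""

    def __init__(self, pad_token_id=None, **kwargs):
        self.pad_token_id = pad_token_id


class ToyOmniConfig(BaseConfig):
    """Toy analogue of the omni-style config being patched in this PR."""

    def __init__(
        self,
        seconds_per_chunk=2,
        audio_start_token_id=None,
        audio_end_token_id=None,
        pad_token_id=None,
        **kwargs,
    ):
        self.seconds_per_chunk = seconds_per_chunk
        self.audio_start_token_id = audio_start_token_id
        self.audio_end_token_id = audio_end_token_id
        # The assignment the CI check flags as missing: keep the attribute
        # on the subclass and forward it to the parent as well.
        self.pad_token_id = pad_token_id
        super().__init__(pad_token_id=pad_token_id, **kwargs)


cfg = ToyOmniConfig(pad_token_id=151643)
print(cfg.pad_token_id)  # → 151643
```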
Member


It has to go in the modular file first.

@github-actions
Contributor

[For maintainers] Suggested jobs to run (before merge)

run-slow: glm4v, glm_image, glm_ocr, phi, qwen2_5_omni, qwen3_vl_moe

Member

@zucchini-nlp zucchini-nlp left a comment


Let's merge! There might be more models left, but we'll have a PR with @Rocketknight1 on that

Thanks for your efforts @Sai-Suraj-27 !

@zucchini-nlp zucchini-nlp enabled auto-merge (squash) January 28, 2026 13:00
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@zucchini-nlp zucchini-nlp merged commit 2864600 into huggingface:main Jan 28, 2026
19 checks passed
@hmellor hmellor added the for patch label (tag issues / labels that should be included in the next patch) on Jan 29, 2026