Skip test_eager_matches sdpa generate and update an integration test for blip-like models #39248

Merged

ydshieh merged 2 commits into main from fix_test_eager_matches_sdpa_generate on Jul 8, 2025

Conversation

@ydshieh
Collaborator

@ydshieh ydshieh commented Jul 7, 2025

What does this PR do?

For the reason behind the skip, see this comment. Let's keep the CI report clean for now, though.

@ydshieh ydshieh changed the title Fix test_eager_matches sdpa generate Skip test_eager_matches sdpa generate and update an integration test for blip-like models Jul 7, 2025
@github-actions
Contributor

github-actions Bot commented Jul 7, 2025

[For maintainers] Suggested jobs to run (before merge)

run-slow: blip_2, instructblip, instructblipvideo

@ydshieh
Collaborator Author

ydshieh commented Jul 7, 2025

run-slow: blip_2, instructblip, instructblipvideo

@github-actions
Contributor

github-actions Bot commented Jul 7, 2025

This comment contains run-slow, running the specified jobs:

models: ['models/blip_2', 'models/instructblip', 'models/instructblipvideo']
quantizations: [] ...

generated_text = processor.batch_decode(outputs, skip_special_tokens=True)[0]

expected_outputs = [0, 37, 7225, 1023, 9850, 7, 3, 9, 388, 3575, 53, 4954, 30, 8, 223, 13, 3, 9, 4459, 4049, 16, 8, 2214, 13, 3, 9, 3164, 690, 2815, 5, 37, 388, 19, 5119, 3, 9, 4459, 8677, 28, 46, 3575, 53, 1476, 5223, 12, 34, 6, 15495, 24, 3, 88, 19, 692, 112, 293, 10428, 44, 234, 1066, 145, 338, 3, 9, 50, 1106, 3522, 144, 42, 2192, 7919, 31, 7, 5, 37, 1023, 92, 1267, 3, 9, 381, 13, 119, 3203, 16, 8, 2458, 6, 379, 14264, 6, 9256, 7, 6, 11, 11718, 7, 5, 1] # fmt: skip
expected_outputs = [0, 37, 1023, 9850, 7, 3, 9, 388, 3575, 53, 4954, 30, 8, 223, 13, 3, 9, 4459, 4049, 16, 8, 2214, 13, 3, 9, 3164, 690, 2815, 5, 37, 388, 19, 5119, 3, 9, 4459, 8677, 28, 3, 9, 4459, 6177, 6, 11, 3, 88, 19, 338, 46, 3575, 53, 1476, 5223, 12, 8, 223, 13, 8, 4049, 5, 37, 1023, 19, 7225, 16, 24, 34, 1267, 3, 9, 388, 3575, 53, 4954, 30, 8, 223, 13, 3, 9, 4049, 16, 8, 2214, 13, 3, 9, 3164, 690, 2815, 5, 37, 388, 19, 338, 46, 3575, 53, 1476, 5223, 12, 8, 223, 13, 3, 9, 4049, 16, 8, 2214, 13, 3, 9, 3164, 690, 2815, 5, 37, 388, 19, 338, 46, 3575, 53, 1476, 5223, 12, 8, 223, 13, 3, 9, 4049, 16, 8, 2214, 13, 3, 9, 3164, 690, 2815, 5, 37, 1023, 19, 7225, 16, 24, 34, 1267, 3, 9, 388, 3575, 53, 4954, 30, 8, 223, 13, 3, 9, 4049, 16, 8, 2214, 13, 3, 9, 3164, 690, 2815, 5, 37, 388, 19, 338, 46, 3575, 53, 1476, 5223, 12, 8, 223, 13, 3, 9, 4049, 16, 8, 2214, 13, 3, 9, 3164, 690, 2815, 5, 1] # fmt: skip
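The two `expected_outputs` lines above appear to be the old and new hard-coded token IDs for the test. As a rough sketch of the assertion pattern such an integration test relies on (the function name and token values here are illustrative assumptions, not the actual test code from this PR):

```python
# Sketch of the exact-match assertion used by such integration tests.
# `generate_ids` stands in for the output of model.generate(...); the
# token IDs below are illustrative, not the real values from the test.
import torch

def matches_expected(generate_ids: torch.Tensor, expected_ids: list) -> bool:
    # The comparison is exact: any numerical drift (dtype handling,
    # kernels, a different GPU generation) can flip the argmax at one
    # decoding step and shift every token after it, which is why the
    # hard-coded expected values had to be refreshed here.
    return generate_ids[0].tolist() == expected_ids

expected = [0, 37, 1023, 9850, 7, 1]
ids = torch.tensor([[0, 37, 1023, 9850, 7, 1]])
print(matches_expected(ids, expected))  # True
```

This brittleness is also why a single upstream change such as the `from_pretrained` refactor can invalidate long lists of expected tokens at once.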
Collaborator Author

[core] Large/full refactor of from_pretrained (#36033) changed the values here on T4.

The commit just before it passes with the previous values on T4, but we have also switched to A10 runners, and that earlier commit fails on A10 as well.

Let's just update the expected values.

Collaborator

I don't know how much this is expected 😅

Collaborator Author

The output is not far off from the previous one, but it is indeed a bit strange that that PR would change the output values.

Collaborator Author

cc @Cyrilvallez if you want to dive into this (if so, remember to check on T4)

Member

It could also be the dtype; I remember having some issues in previous versions before the refactor. For example, if a backbone needed to be loaded in fp32, it was somehow still converted to fp16 first and then cast back to fp32. I can't remember the reason.
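The fp16 round trip described in that hypothesis is lossy, which a tiny sketch makes concrete (the weight values below are illustrative, not taken from any model in the PR):

```python
# Illustrative sketch of the dtype hypothesis: weights cast
# fp32 -> fp16 -> fp32 do not come back bit-identical, so a backbone
# that took this round trip can produce slightly different logits and
# hence different greedy generations than one loaded directly in fp32.
import torch

w_fp32 = torch.tensor([0.1234567, 1.0e-5, 3.1415927], dtype=torch.float32)
w_roundtrip = w_fp32.half().float()  # fp32 -> fp16 -> fp32

print(torch.equal(w_fp32, w_roundtrip))  # False: precision was lost
print((w_fp32 - w_roundtrip).abs().max().item() > 0.0)  # True
```

A drift of this size in the weights is usually invisible in the decoded text, but it is enough to change exact token IDs in a pinned integration test.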

Collaborator Author

Yeah, that's indeed a likely reason!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@ydshieh ydshieh requested a review from zucchini-nlp July 7, 2025 11:39
@ydshieh
Collaborator Author

ydshieh commented Jul 7, 2025

The failing slow CI jobs are all on multi-GPU runners, which we are not handling at this moment.

Member

@zucchini-nlp left a comment

Thanks, these have to be skipped in the end anyway, unless we make it multimodal-friendly. I am adding a check for all backbones in #38974 :)

@ydshieh ydshieh merged commit a21557f into main Jul 8, 2025
20 of 21 checks passed
@ydshieh ydshieh deleted the fix_test_eager_matches_sdpa_generate branch July 8, 2025 08:38
rjgleaton pushed a commit to rjgleaton/transformers that referenced this pull request Jul 17, 2025
…t for blip-like models (huggingface#39248)

* skip

* skip

---------

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
zaristei pushed a commit to zaristei/transformers that referenced this pull request Sep 9, 2025
…t for blip-like models (huggingface#39248)

* skip

* skip

---------

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>