Fix cache-related tests #39676
Changes from all commits: ef1ebd9, 00d9b76, 3df3f95, daeb4d8, fb9534b, 56758c9, 86fafef, 008d93f, 6d34e6f
```diff
@@ -403,7 +403,6 @@ def attention_mask_padding_matches_padding_free_with_position_ids(
         logits_padded = res_padded.logits[inputs_dict["attention_mask"].bool()]
         logits_padfree = res_padfree.logits[0]

-        torch.testing.assert_close(logits_padded.argmax(-1), logits_padfree.argmax(-1), rtol=0, atol=0)
```
Collaborator: ok, can't find it in the common test file, so fine.

Author (Member): this one doesn't make sense because we want to check logits, not sampled argmax tokens. Even a tiny diff in logits can give different tokens, and the next line already checks the logits with a tolerance.
```diff
         # acceptable numerical instability
         tol = torch.finfo(torch.bfloat16).eps
         torch.testing.assert_close(logits_padded, logits_padfree, rtol=tol, atol=tol)
```
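For context, here is a standalone sketch of why the exact argmax check was brittle while the tolerance-based comparison is stable; the tensors and values are illustrative, not the test's real inputs:

```python
import torch

# Two logit tensors that agree up to roughly bfloat16 rounding noise
# (illustrative values, not the model outputs from the test).
logits_a = torch.randn(4, 32)
logits_b = logits_a + 0.5 * torch.finfo(torch.bfloat16).eps

# An exact argmax comparison is brittle: a near-tie between two logits can
# flip the winning token index even when the logits match within noise.
# torch.testing.assert_close(logits_a.argmax(-1), logits_b.argmax(-1), rtol=0, atol=0)

# Comparing the logits themselves with a bfloat16-eps tolerance is stable.
tol = torch.finfo(torch.bfloat16).eps
torch.testing.assert_close(logits_a, logits_b, rtol=tol, atol=tol)
```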
Collaborator: do you know what causes these changes?

Author (Member): yep, after #39374 we started using a higher default max length. The text-only generation pipeline already uses it, so it's fine.
Collaborator: was it deleted at some point, and here you just add it back?

Author (Member): yep, accidentally deleted RoPE 🙈
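For readers unfamiliar with the acronym, a minimal sketch of rotary position embeddings (RoPE) follows; it is illustrative only, not the implementation that was restored in this PR:

```python
import torch

def apply_rope(x: torch.Tensor, positions: torch.Tensor) -> torch.Tensor:
    # Rotate pairs of channels by a position-dependent angle, using the
    # standard RoPE frequencies theta_i = 10000^(-2i/dim).
    # Illustrative only, not transformers' implementation.
    half = x.shape[-1] // 2
    inv_freq = 1.0 / (10000 ** (torch.arange(half, dtype=torch.float32) / half))
    angles = positions[:, None].float() * inv_freq[None, :]  # (seq_len, half)
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., :half], x[..., half:]
    # Each (x1, x2) channel pair is rotated by its position's angle.
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)

q = torch.randn(8, 64)                  # (seq_len, head_dim)
q_rot = apply_rope(q, torch.arange(8))  # position-encoded queries
```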