LlamaAttention forward function type hint is incorrect from new Branch #38998

Merged
qubvel merged 7 commits into huggingface:main from ArkVex:main
Jul 1, 2025
Conversation

@ArkVex
Contributor

@ArkVex ArkVex commented Jun 24, 2025

Hi, this PR fixes a small issue in the LlamaAttention class. The return type in the forward method currently shows three values, but the function actually returns only two. This seems to have been missed during the attention refactor (possibly in PR #35235).

I’ve updated the type hint to reflect the actual return values, just to avoid confusion for anyone reading or using the code. Let me know if any other changes are needed. Happy to help!
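For readers skimming the thread, the mismatch can be sketched roughly like this. This is a simplified illustration, not the actual transformers source: `forward_sketch`, its body, and the `Tensor` alias are placeholders standing in for the real `LlamaAttention.forward`.

```python
from typing import Optional, Tuple

# Stand-in for torch.Tensor so this sketch runs without torch installed.
Tensor = list

# Before the fix, the hint promised three return values, roughly:
#   -> Tuple[Tensor, Optional[Tensor], Optional[Tuple[Tensor]]]
# After the attention refactor the function returns only two, so the
# corrected hint drops the third element:
def forward_sketch(hidden_states: Tensor) -> Tuple[Tensor, Optional[Tensor]]:
    attn_output = hidden_states  # stand-in for the real attention computation
    attn_weights = None          # only populated when weights are requested
    return attn_output, attn_weights
```

The hint only documents what the function already does; no runtime behavior changes, which is why the fix is a one-line edit plus the generated modular files.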

@ArkVex
Contributor Author

ArkVex commented Jun 24, 2025

@Rocketknight1 trying again

@ArkVex
Contributor Author

ArkVex commented Jun 24, 2025

I think I shall close the fork and refork it again

@qubvel
Contributor

qubvel commented Jun 24, 2025

Hey @ArkVex, thanks for fixing the type hints! To fix the repo-consistency issue, can you please run the following command and commit the changes?

python utils/check_modular_conversion.py --fix_and_overwrite

@ArkVex
Contributor Author

ArkVex commented Jun 25, 2025

@qubvel it's still throwing the same error

@ArkVex
Contributor Author

ArkVex commented Jun 25, 2025

I don't know what's wrong here... I made such a minute change and it's causing so much trouble

@qubvel qubvel self-requested a review June 27, 2025 14:24
@qubvel
Contributor

qubvel commented Jun 27, 2025

@ArkVex, I hope you don't mind, I've applied the command above and pushed the changes

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@qubvel qubvel merged commit 20901f1 into huggingface:main Jul 1, 2025
20 checks passed
zaristei pushed a commit to zaristei/transformers that referenced this pull request Sep 9, 2025
* helo llama

* helo llama

* helo llama

* apply modular

* fix dia

---------

Co-authored-by: qubvel <qubvel@gmail.com>

3 participants