LlamaAttention forward function type hint is incorrect #38739 (PR #38795)
ArkVex wants to merge 5 commits into huggingface:main from ArkVex:main
Conversation
Hi @ArkVex can you run …
I didn't get that... could you please explain?
Hi @ArkVex, if you look at the tests on this PR, "check_repository_consistency" is failing. The reason is that some other models copy from Llama, and those copies don't match after this PR. You should run …
Thanks for pointing it out... I will do that.
@Rocketknight1, are you there?
Seems like …
Hi @ArkVex, the reason it's failing is that the PR is on your fork's …
Okay, okay.

Hi, this PR fixes a small issue in the LlamaAttention class. The return type hint on the forward method currently promises three values, but the function actually returns only two. This seems to have been missed during the attention refactor (possibly in PR #35235).
I've updated the type hint to reflect the actual return values, to avoid confusion for anyone reading or using the code. Let me know if any other changes are needed. Happy to help!
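For context, a minimal sketch of the kind of mismatch this PR fixes. The function body and `Tensor` alias below are illustrative stand-ins, not the actual transformers source; they only show a two-element return value annotated correctly:

```python
from typing import Optional, Tuple

# Stand-in for torch.Tensor so this sketch runs without torch installed.
Tensor = str

# Old (incorrect) hint, paraphrased: the signature promised a third element,
# roughly -> Tuple[Tensor, Optional[Tensor], Optional[Tuple[Tensor]]].
# Corrected hint: the function actually returns (attn_output, attn_weights).
def forward(hidden_states: Tensor) -> Tuple[Tensor, Optional[Tensor]]:
    attn_output = hidden_states  # stand-in for the real attention computation
    attn_weights = None          # weights are only populated when requested
    return attn_output, attn_weights
```

The point of the change is purely documentation-level: callers unpacking the result as three values based on the old hint would hit a runtime error, since only two values are ever returned.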