Fix transformers/torch errors on nn.RMSNorm by pinning transformers.#6458
Conversation
This should be fixed by the latest PR to HF Transformers - we won't need to complete this PR, but I'm leaving it open until the transformers PR is merged.
@loadams in 1.18.0 we will upgrade to PyTorch 2.4.0.
Thanks @nelyahu - it turns out this is a bug in transformers, and we will need that fixed regardless, but good to know, thanks!
The PR in transformers is here: huggingface/transformers#33177
@loadams it seems that the Gaudi2 pipeline is still broken.
The PR has been merged into transformers (it took longer than anticipated), and the pipeline looks to be running again, so there is no need to pin anything.
The HPU pipeline was failing due to an updated transformers package that appears to depend on torch 2.4: https://github.com/microsoft/DeepSpeed/actions/runs/10586654075/job/29335873815
I opened an issue with transformers to investigate: huggingface/transformers#33176
CC: @nelyahu - we can also fix this when/if there is a package with torch 2.4
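For context on the breakage discussed above: `torch.nn.RMSNorm` is a module that only exists in PyTorch 2.4+, so transformers code referencing it fails on older torch builds (such as the HPU pipeline's). The operation itself is simple; a minimal pure-Python sketch of the standard RMSNorm formula (not DeepSpeed's or transformers' actual implementation, and the `eps` default here is just a common choice):

```python
import math

def rms_norm(x, weight, eps=1e-6):
    # RMSNorm: divide each element by the root-mean-square of the
    # vector (plus eps for stability), then apply a learned per-element scale.
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    return [w * v / rms for w, v in zip(weight, x)]
```

Because the op is this simple, code that must run on torch < 2.4 can guard on `hasattr(torch.nn, "RMSNorm")` and fall back to an equivalent implementation, which is another way this could have been handled instead of pinning.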