Fix PEFT x MoEs #43261
Merged
ArthurZucker merged 67 commits into main on Jan 24, 2026
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request on Jan 14, 2026
It was necessary to flatten the LoRA weights for the 3D MoE, as LoRA always expects 2D weights (being nn.Linear).
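A minimal sketch of what that flattening looks like, assuming fused expert weights of shape (num_experts, in_features, out_features); the shapes and names are illustrative, not the actual transformers/PEFT code:

```python
import torch

# Illustrative shapes only: a fused MoE parameter stacks all expert
# weight matrices into one 3D tensor.
num_experts, in_features, out_features = 8, 1024, 4096
expert_weight = torch.randn(num_experts, in_features, out_features)

# LoRA was written against nn.Linear and therefore expects a 2D weight,
# so the expert dimension is folded into the input dimension.
flat_weight = expert_weight.reshape(num_experts * in_features, out_features)
assert flat_weight.ndim == 2
```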
Member
Just leaving some general comments on this PR:

* the rank needed to be set to 2*r for the concatenated gate-up projection parameter so that PEFT allocates 2*r and matches the converted weights (using `rank_pattern`; see the sketch after this list)
* the weights needed to be transposed to match their counterparts
* MoE in PEFT assumes (experts, in, out), but Mixtral MoE is transposed, so we need to patch this assumption in PEFT for now
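A hedged sketch of the `rank_pattern` workaround from the first bullet, assuming the fused parameter is named `gate_up_proj` (the real module names may differ):

```python
from peft import LoraConfig

r = 8
config = LoraConfig(
    r=r,
    target_modules=["gate_up_proj", "down_proj"],  # assumed module names
    # The gate and up projections are concatenated into one parameter, so
    # PEFT needs to allocate 2*r there for the converted weights to fit.
    rank_pattern={"gate_up_proj": 2 * r},
)
```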
Contributor
View the CircleCI Test Summary for this PR: https://huggingface.co/spaces/transformers-community/circle-ci-viz?pr=43261&sha=92d0fa
The latter does not exist in released PEFT versions and therefore is not an ideal target for this PR :)
githubnemo pushed a commit to githubnemo/peft that referenced this pull request on Feb 27, 2026
Continuation of PR huggingface#2995. Background: huggingface/transformers#42491 and huggingface/transformers#43261.

This change implements conversion operations for converting some existing PEFT checkpoints, mainly dealing with the fusing of MoE layers in transformers v5. The code added here is currently a copy of the code that exists in transformers, which is supposed to be gated as soon as PEFT v0.19 is released so that it uses the code in this PR. The copying makes testing a bit difficult, since there is currently no routing on the PEFT version in transformers; older transformers versions therefore need patching to forcefully use the PEFT implementation of the conversion. As soon as the routing is implemented in transformers, we can conditionally disable the patching.
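A minimal sketch of the version routing the commit message asks for; the import paths and names are hypothetical, not the actual transformers or PEFT API:

```python
import importlib.metadata

from packaging import version


def resolve_conversion_impl():
    """Pick the PEFT-side conversion ops once a release ships them,
    otherwise fall back to the copy kept in transformers."""
    peft_version = version.parse(importlib.metadata.version("peft"))
    if peft_version >= version.parse("0.19.0"):
        # Hypothetical import path for the upstreamed conversion ops.
        from peft.utils import moe_checkpoint_conversion as impl
    else:
        # Hypothetical fallback: the copied implementation in transformers.
        from transformers.integrations import moe_checkpoint_conversion as impl
    return impl
```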
What does this PR do?
Fixes #42491
This should serve as an example of how the weight loader can be re-used in other projects.

The content is probably going to be upstreamed to peft!
Current status:
What to expect:
a. we check the weight conversion mapping
b. if there are ops, we replace them with mapped peft ops
c. we collect lora_A and lora_B together and process them like so:
Cf. the conversion diagram (credits to @BenjaminBossan for the pic).
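In lieu of the pic, a minimal sketch of steps a–c above; every name here (`conversion_mapping`, `peft_ops`, `process_lora_pair`) is hypothetical, not the actual weight-loader API:

```python
def convert_state_dict(state_dict, conversion_mapping, peft_ops):
    converted = {}
    lora_pairs = {}  # module prefix -> {"lora_A": tensor, "lora_B": tensor}
    for key, tensor in state_dict.items():
        # (a) check the weight conversion mapping for this key
        op = conversion_mapping.get(key)
        # (b) if there is an op, replace it with the mapped PEFT op
        if op is not None:
            op = peft_ops.get(op, op)
        # (c) collect lora_A and lora_B together ...
        if "lora_A" in key or "lora_B" in key:
            prefix, _, suffix = key.rpartition(".lora_")
            which = "lora_A" if suffix.startswith("A") else "lora_B"
            lora_pairs.setdefault(prefix, {})[which] = tensor
        else:
            converted[key] = op(tensor) if callable(op) else tensor
    # ... and process each (lora_A, lora_B) pair jointly, e.g. transposing
    # or flattening to match the fused MoE layout (hypothetical helper).
    for prefix, pair in lora_pairs.items():
        converted.update(process_lora_pair(prefix, pair))
    return converted
```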
