Conversation

@takuma104 (Contributor) commented May 25, 2023

Discussed in #3551. This fixes the issue where xFormers gets deactivated when LoRA weights are loaded into a pipeline that has xFormers enabled. As a fix, the LoRAXFormersAttnProcessor is now used instead of the LoRAAttnProcessor when xFormers is enabled.
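For illustration, here is a minimal sketch of the selection logic this PR introduces, assuming diffusers ~0.17; the helper name `choose_lora_processor` and its argument handling are illustrative, not the actual code from this PR:

```python
# Minimal sketch, not the exact diffusers code: pick the LoRA attention
# processor so that xFormers stays active if it was already enabled.
from diffusers.models.attention_processor import (
    LoRAAttnProcessor,
    LoRAXFormersAttnProcessor,
    XFormersAttnProcessor,
)

def choose_lora_processor(current_processor, hidden_size, cross_attention_dim, rank=4):
    # If this attention block is already running the xFormers processor,
    # keep xFormers active by using the LoRA-aware xFormers variant.
    if isinstance(current_processor, XFormersAttnProcessor):
        return LoRAXFormersAttnProcessor(
            hidden_size=hidden_size,
            cross_attention_dim=cross_attention_dim,
            rank=rank,
        )
    # Otherwise fall back to the plain LoRA processor, as before this fix.
    return LoRAAttnProcessor(
        hidden_size=hidden_size,
        cross_attention_dim=cross_attention_dim,
        rank=rank,
    )
```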

Todo:

  • Add test code

@HuggingFaceDocBuilderDev commented May 25, 2023

The documentation is not available anymore as the PR was closed or merged.

@takuma104 marked this pull request as ready for review May 25, 2023 16:05
@patrickvonplaten (Contributor) left a comment

Great!

@patrickvonplaten (Contributor) commented

@sayakpaul can you take a look? This looks good to merge to me.

@sayakpaul (Member) left a comment

Thank you!

@sayakpaul merged commit 67cf044 into huggingface:main May 26, 2023
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
…hen xFormers is enabled (huggingface#3556)

* fix to use LoRAXFormersAttnProcessor

* add test

* using new LoraLoaderMixin.save_lora_weights

* add test_lora_save_load_with_xformers
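
For reference, a hedged usage sketch of the scenario this fix (and the added test_lora_save_load_with_xformers) covers; the model ID and LoRA path below are placeholders, not taken from the PR, and running it requires a CUDA GPU with xformers installed:

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder model ID
    torch_dtype=torch.float16,
).to("cuda")

# Enable xFormers first...
pipe.enable_xformers_memory_efficient_attention()

# ...then load LoRA weights. Before this fix, this call replaced the
# xFormers attention processors with plain LoRAAttnProcessor, silently
# disabling xFormers; after the fix, LoRAXFormersAttnProcessor is used.
pipe.load_lora_weights("path/to/lora_weights")  # placeholder path
```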
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024
…hen xFormers is enabled (huggingface#3556)

* fix to use LoRAXFormersAttnProcessor

* add test

* using new LoraLoaderMixin.save_lora_weights

* add test_lora_save_load_with_xformers