Merged

76 commits
ba24f2a
more fixes
younesbelkada Sep 14, 2023
c17634c
up
younesbelkada Sep 15, 2023
2a6e535
up
younesbelkada Sep 15, 2023
01f6d1d
style
younesbelkada Sep 15, 2023
5a150b2
add in setup
younesbelkada Sep 15, 2023
961e776
oops
younesbelkada Sep 15, 2023
cdbe739
more changes
younesbelkada Sep 15, 2023
691368b
v1 rzfactor CI
younesbelkada Sep 18, 2023
7918851
Apply suggestions from code review
younesbelkada Sep 18, 2023
14db139
few todos
younesbelkada Sep 18, 2023
c06c40b
Merge branch 'main' into peftpart-1
younesbelkada Sep 18, 2023
d56a14d
protect torch import
younesbelkada Sep 18, 2023
ec87c19
style
younesbelkada Sep 18, 2023
40a6028
fix fuse text encoder
younesbelkada Sep 18, 2023
0c62ef3
Merge remote-tracking branch 'upstream/main' into peftpart-1
younesbelkada Sep 18, 2023
c4295c9
Update src/diffusers/loaders.py
younesbelkada Sep 19, 2023
4162ddf
replace with `recurse_replace_peft_layers`
younesbelkada Sep 19, 2023
1d13f40
keep old modules for BC
younesbelkada Sep 19, 2023
78a860d
adjustments on `adjust_lora_scale_text_encoder`
younesbelkada Sep 19, 2023
78a01d5
Merge branch 'main' into peftpart-1
younesbelkada Sep 19, 2023
ecbc714
Merge remote-tracking branch 'upstream/main' into peftpart-1
younesbelkada Sep 19, 2023
9d650c9
Merge branch 'peftpart-1' of https://github.com/younesbelkada/diffuse…
younesbelkada Sep 19, 2023
6f1adcd
nit
younesbelkada Sep 19, 2023
f890906
move tests
younesbelkada Sep 19, 2023
f8e87f6
add conversion utils
younesbelkada Sep 19, 2023
3ba2d4e
Merge remote-tracking branch 'upstream/main' into peftpart-1
younesbelkada Sep 19, 2023
dc83fa0
remove unneeded methods
younesbelkada Sep 19, 2023
b83fcba
use class method instead
younesbelkada Sep 19, 2023
74e33a9
oops
younesbelkada Sep 19, 2023
9cb8563
use `base_version`
younesbelkada Sep 19, 2023
c90f85d
fix examples
younesbelkada Sep 19, 2023
40a4894
fix CI
younesbelkada Sep 19, 2023
ea05959
fix weird error with python 3.8
younesbelkada Sep 19, 2023
27e3da6
fix
younesbelkada Sep 19, 2023
3d7c567
better fix
younesbelkada Sep 19, 2023
d01a292
style
younesbelkada Sep 19, 2023
e836b14
Apply suggestions from code review
younesbelkada Sep 20, 2023
cb48405
Apply suggestions from code review
younesbelkada Sep 20, 2023
325462d
add comment
younesbelkada Sep 20, 2023
b412adc
Apply suggestions from code review
younesbelkada Sep 20, 2023
b72ef23
conv2d support for recurse remove
younesbelkada Sep 20, 2023
e072655
added docstrings
younesbelkada Sep 20, 2023
bd46ae9
more docstring
younesbelkada Sep 20, 2023
724b52b
add deprecate
younesbelkada Sep 20, 2023
5e6f343
revert
younesbelkada Sep 20, 2023
71650d4
try to fix merge conflicts
younesbelkada Sep 20, 2023
920333f
Merge remote-tracking branch 'upstream/main' into peftpart-1
younesbelkada Sep 20, 2023
c7f2099
v1 tests
younesbelkada Sep 20, 2023
7fd5295
add new decorator
younesbelkada Sep 20, 2023
e65daa7
add saving utilities test
younesbelkada Sep 20, 2023
209081b
adapt tests a bit
younesbelkada Sep 20, 2023
43b237e
add save / from_pretrained tests
younesbelkada Sep 20, 2023
7e01caf
add saving tests
younesbelkada Sep 20, 2023
fc16c44
add scale tests
younesbelkada Sep 20, 2023
5805c02
Merge branch 'main' into peftpart-1
sayakpaul Sep 20, 2023
831bbaa
fix deps tests
younesbelkada Sep 21, 2023
d1d1f22
Merge branch 'peftpart-1' of https://github.com/younesbelkada/diffuse…
younesbelkada Sep 21, 2023
24b38a8
fix lora CI
younesbelkada Sep 21, 2023
651de85
fix tests
younesbelkada Sep 21, 2023
26af3ea
add comment
younesbelkada Sep 21, 2023
6e3d33b
fix
younesbelkada Sep 21, 2023
6903825
style
younesbelkada Sep 21, 2023
7aadd30
add slow tests
younesbelkada Sep 21, 2023
aedffbf
slow tests pass
younesbelkada Sep 21, 2023
2a64c88
style
younesbelkada Sep 21, 2023
557404f
Update src/diffusers/utils/import_utils.py
younesbelkada Sep 21, 2023
5de8506
Apply suggestions from code review
younesbelkada Sep 21, 2023
65fe519
circumvents pattern finding issue
younesbelkada Sep 21, 2023
9303d04
left a todo
younesbelkada Sep 21, 2023
29bfc56
Merge branch 'main' into peftpart-1
patrickvonplaten Sep 21, 2023
90974ea
Merge remote-tracking branch 'upstream/main' into peftpart-1
younesbelkada Sep 22, 2023
053c827
Merge branch 'peftpart-1' of https://github.com/younesbelkada/diffuse…
younesbelkada Sep 22, 2023
03f7431
Apply suggestions from code review
younesbelkada Sep 22, 2023
d4637f7
update hub path
younesbelkada Sep 22, 2023
17ac967
add lora workflow
younesbelkada Sep 22, 2023
5295aa2
fix
younesbelkada Sep 22, 2023
67 changes: 67 additions & 0 deletions .github/workflows/pr_test_peft_backend.yml
@@ -0,0 +1,67 @@
name: Fast tests for PRs - PEFT backend

on:
  pull_request:
    branches:
      - main

concurrency:
  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
  cancel-in-progress: true

env:
  DIFFUSERS_IS_CI: yes
  OMP_NUM_THREADS: 4
  MKL_NUM_THREADS: 4
  PYTEST_TIMEOUT: 60

jobs:
  run_fast_tests:
    strategy:
      fail-fast: false
      matrix:
        config:
          - name: LoRA
            framework: lora
            runner: docker-cpu
            image: diffusers/diffusers-pytorch-cpu
            report: torch_cpu_lora

    name: ${{ matrix.config.name }}

    runs-on: ${{ matrix.config.runner }}

    container:
      image: ${{ matrix.config.image }}
      options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/

    defaults:
      run:
        shell: bash

    steps:
      - name: Checkout diffusers
        uses: actions/checkout@v3
        with:
          fetch-depth: 2

      - name: Install dependencies
        run: |
          apt-get update && apt-get install libsndfile1-dev libgl1 -y
          python -m pip install -e .[quality,test]
          python -m pip install git+https://github.com/huggingface/accelerate.git
          python -m pip install -U git+https://github.com/huggingface/transformers.git
          python -m pip install -U git+https://github.com/huggingface/peft.git

      - name: Environment
        run: |
          python utils/print_env.py

      - name: Run fast PyTorch LoRA CPU tests with PEFT backend
        if: ${{ matrix.config.framework == 'lora' }}
        run: |
          python -m pytest -n 2 --max-worker-restart=0 --dist=loadfile \
            -s -v \
            --make-reports=tests_${{ matrix.config.report }} \
            tests/lora/test_lora_layers_peft.py
316 changes: 198 additions & 118 deletions src/diffusers/loaders.py

Large diffs are not rendered by default.

31 changes: 19 additions & 12 deletions src/diffusers/models/lora.py
@@ -25,18 +25,25 @@
logger = logging.get_logger(__name__) # pylint: disable=invalid-name


- def adjust_lora_scale_text_encoder(text_encoder, lora_scale: float = 1.0):
-     for _, attn_module in text_encoder_attn_modules(text_encoder):
-         if isinstance(attn_module.q_proj, PatchedLoraProjection):
-             attn_module.q_proj.lora_scale = lora_scale
-             attn_module.k_proj.lora_scale = lora_scale
-             attn_module.v_proj.lora_scale = lora_scale
-             attn_module.out_proj.lora_scale = lora_scale
-
-     for _, mlp_module in text_encoder_mlp_modules(text_encoder):
-         if isinstance(mlp_module.fc1, PatchedLoraProjection):
-             mlp_module.fc1.lora_scale = lora_scale
-             mlp_module.fc2.lora_scale = lora_scale
+ def adjust_lora_scale_text_encoder(text_encoder, lora_scale: float = 1.0, use_peft_backend: bool = False):
+     if use_peft_backend:
+         from peft.tuners.lora import LoraLayer
+
+         for module in text_encoder.modules():
+             if isinstance(module, LoraLayer):
+                 module.scaling[module.active_adapter] = lora_scale
Review comment on lines +29 to +34 (Member): 💯
+     else:
+         for _, attn_module in text_encoder_attn_modules(text_encoder):
+             if isinstance(attn_module.q_proj, PatchedLoraProjection):
+                 attn_module.q_proj.lora_scale = lora_scale
+                 attn_module.k_proj.lora_scale = lora_scale
+                 attn_module.v_proj.lora_scale = lora_scale
+                 attn_module.out_proj.lora_scale = lora_scale
+
+         for _, mlp_module in text_encoder_mlp_modules(text_encoder):
+             if isinstance(mlp_module.fc1, PatchedLoraProjection):
+                 mlp_module.fc1.lora_scale = lora_scale
+                 mlp_module.fc2.lora_scale = lora_scale


class LoRALinearLayer(nn.Module):
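The PEFT branch above works because a LoRA layer computes y = Wx + scale · B(Ax): overwriting `scaling[active_adapter]` rescales only the low-rank update while leaving the base weights untouched. A minimal sketch with hypothetical toy classes (scalars standing in for weight matrices, not the actual diffusers or PEFT API):

```python
# Toy model of LoRA scaling (hypothetical names, scalars instead of matrices).
# A LoRA layer computes y = w*x + scale * b*(a*x); adjusting `scale` changes
# only the low-rank contribution, mirroring what the PEFT branch does with
# module.scaling[module.active_adapter].

class ToyLoraLinear:
    def __init__(self, w, a, b):
        self.w = w  # frozen base weight
        self.a = a  # LoRA down-projection
        self.b = b  # LoRA up-projection
        self.active_adapter = "default"
        self.scaling = {"default": 1.0}  # same dict shape as peft's LoraLayer

    def forward(self, x):
        scale = self.scaling[self.active_adapter]
        return self.w * x + scale * (self.b * (self.a * x))


def adjust_lora_scale(module, lora_scale):
    # Mirrors the peft code path: overwrite the active adapter's scale in place.
    module.scaling[module.active_adapter] = lora_scale


layer = ToyLoraLinear(w=2.0, a=0.5, b=4.0)
print(layer.forward(3.0))  # base 6.0 + LoRA 6.0 -> 12.0
adjust_lora_scale(layer, 0.0)
print(layer.forward(3.0))  # LoRA contribution disabled -> 6.0
```

Setting the scale to 0.0 disables the adapter entirely; intermediate values blend its contribution in.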
@@ -303,7 +303,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)
Review comment (Member): Excellent use of reusing subclass variable (use_peft_backend) here!

Reply (Contributor Author): Thanks!

if prompt is not None and isinstance(prompt, str):
batch_size = 1
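The same one-line change recurs in every pipeline's `encode_prompt`: the call now forwards the pipeline-level `use_peft_backend` flag so a single helper dispatches to either backend. A hypothetical toy sketch of that pattern (illustrative names only, not the diffusers API):

```python
# Toy dispatch (hypothetical names): one helper routed by a backend flag,
# as the pipelines do by forwarding self.use_peft_backend.

def adjust_scale(encoder_name, lora_scale, use_peft_backend=False):
    # The real helper mutates LoRA layers; here we just report the path taken.
    backend = "peft" if use_peft_backend else "patched"
    return (backend, encoder_name, lora_scale)


class ToyPipeline:
    def __init__(self, use_peft_backend):
        self.use_peft_backend = use_peft_backend  # set once on the pipeline

    def encode_prompt(self, lora_scale):
        # Mirrors: adjust_lora_scale_text_encoder(..., self.use_peft_backend)
        return adjust_scale("text_encoder", lora_scale, self.use_peft_backend)


print(ToyPipeline(True).encode_prompt(0.7))   # peft path selected
print(ToyPipeline(False).encode_prompt(0.7))  # patched-projection path selected
```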
@@ -301,7 +301,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
2 changes: 1 addition & 1 deletion src/diffusers/pipelines/controlnet/pipeline_controlnet.py
@@ -291,7 +291,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -315,7 +315,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -442,7 +442,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -315,8 +315,8 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
- adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)
+ adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale, self.use_peft_backend)

prompt = [prompt] if isinstance(prompt, str) else prompt

@@ -288,8 +288,8 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
- adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)
+ adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale, self.use_peft_backend)

prompt = [prompt] if isinstance(prompt, str) else prompt

@@ -326,8 +326,8 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
- adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)
+ adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale, self.use_peft_backend)

prompt = [prompt] if isinstance(prompt, str) else prompt

@@ -308,7 +308,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -301,7 +301,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -332,7 +332,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -213,7 +213,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -481,7 +481,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -278,7 +278,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -309,7 +309,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -302,7 +302,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -375,7 +375,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -297,7 +297,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -211,7 +211,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -272,7 +272,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -244,7 +244,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -221,7 +221,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -256,7 +256,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -446,7 +446,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -244,7 +244,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -240,7 +240,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -346,7 +346,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -296,7 +296,7 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)

if prompt is not None and isinstance(prompt, str):
batch_size = 1
@@ -264,8 +264,8 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
- adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)
+ adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale, self.use_peft_backend)

prompt = [prompt] if isinstance(prompt, str) else prompt

@@ -271,8 +271,8 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
- adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)
+ adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale, self.use_peft_backend)

prompt = [prompt] if isinstance(prompt, str) else prompt

@@ -420,8 +420,8 @@ def encode_prompt(
self._lora_scale = lora_scale

# dynamically adjust the LoRA scale
- adjust_lora_scale_text_encoder(self.text_encoder, lora_scale)
- adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale)
+ adjust_lora_scale_text_encoder(self.text_encoder, lora_scale, self.use_peft_backend)
+ adjust_lora_scale_text_encoder(self.text_encoder_2, lora_scale, self.use_peft_backend)

prompt = [prompt] if isinstance(prompt, str) else prompt
