Status: Closed
Labels: bug (Something isn't working)
Description
Describe the bug
I got the following error message while running the train_dreambooth.py script:
AttributeError: 'DistributedDataParallel' object has no attribute 'config'
It only occurs when I use a multi-GPU setup.
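Not from the original report, but a minimal sketch of the likely failure mode: under a multi-GPU Accelerate config, the model gets wrapped in `torch.nn.parallel.DistributedDataParallel`, which does not forward plain Python attributes such as `config` to the inner module; they must be read via `.module` (or `accelerator.unwrap_model(model)`). The `TinyModel` class and its `config` contents below are hypothetical stand-ins, not the actual `train_dreambooth.py` code.

```python
import os
import torch.distributed as dist
import torch.nn as nn

# Single-process gloo group so DDP can be constructed on CPU for the demo.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Hypothetical stand-in for a diffusers model's `config` attribute.
        self.config = {"sample_size": 64}
        self.linear = nn.Linear(4, 4)

wrapped = nn.parallel.DistributedDataParallel(TinyModel())

try:
    wrapped.config  # reproduces the reported AttributeError
except AttributeError as e:
    print(e)  # 'DistributedDataParallel' object has no attribute 'config'

print(wrapped.module.config)  # unwrapping the DDP module restores access

dist.destroy_process_group()
```

The same idea applies inside an Accelerate training loop: unwrap before touching model-specific attributes, e.g. `accelerator.unwrap_model(unet).config`.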
Reproduction
Follow the Finetune section of the tutorial, except when configuring accelerate, choose multi-GPU instead.
Logs
No response
System Info
- diffusers version: 0.17.0.dev0
- Platform: Linux-4.15.0-48-generic-x86_64-with-glibc2.17
- Python version: 3.8.16
- PyTorch version (GPU?): 2.0.1+cu117 (True)
- Huggingface_hub version: 0.14.1
- Transformers version: 4.29.2
- Accelerate version: 0.19.0
- xFormers version: not installed
- Using GPU in script?: YES
- Using distributed or parallel set-up in script?: YES
Assignees: patrickvonplaten