Could not run train_dreambooth with multi-GPU setting #3651

@dangkhoasdc

Description

Describe the bug

I got the following error message while running the train_dreambooth.py script:

AttributeError: 'DistributedDataParallel' object has no attribute 'config'

It only occurs when I use the multi-GPU setting.
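A likely cause (not confirmed in this issue): `DistributedDataParallel` wraps the model and does not forward custom attributes such as `.config` to the underlying module, so attribute access that works on the bare model fails once DDP is involved. The usual workaround is to unwrap the model first, e.g. via `.module` or `accelerator.unwrap_model(...)`. A minimal torch-free sketch of the pattern, with stand-in classes (hypothetical names, not the actual diffusers/torch code):

```python
class Config:
    """Stands in for a diffusers model config."""
    sample_size = 64

class UNet:
    """Stands in for a diffusers model that carries a `.config` attribute."""
    config = Config()

class DDPWrapper:
    """Stands in for torch.nn.parallel.DistributedDataParallel: it keeps the
    original model on `.module` and does not forward custom attributes."""
    def __init__(self, module):
        self.module = module

def unwrap(model):
    # DDP stores the wrapped model on `.module`; a plain model passes through.
    # (accelerate's accelerator.unwrap_model serves the same purpose in the
    # training script.)
    return getattr(model, "module", model)

# Works whether or not the model is wrapped:
wrapped = DDPWrapper(UNet())
print(unwrap(wrapped).config.sample_size)  # → 64
print(unwrap(UNet()).config.sample_size)   # → 64
```

Accessing `wrapped.config` directly would raise `AttributeError`, which matches the traceback above.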

Reproduction

Follow the Finetune section in the tutorial, except when configuring accelerate, choose multi-GPU instead of single-GPU.

Logs

No response

System Info

  • diffusers version: 0.17.0.dev0
  • Platform: Linux-4.15.0-48-generic-x86_64-with-glibc2.17
  • Python version: 3.8.16
  • PyTorch version (GPU?): 2.0.1+cu117 (True)
  • Huggingface_hub version: 0.14.1
  • Transformers version: 4.29.2
  • Accelerate version: 0.19.0
  • xFormers version: not installed
  • Using GPU in script?: YES
  • Using distributed or parallel set-up in script?: YES

Labels

bug (Something isn't working)
