
Update transformer_flux.py. Change float64 to float32#1

Merged
jsmidt merged 1 commit into main from jsmidt-patch-1 on Aug 9, 2024

Conversation

@jsmidt (Owner) commented Aug 9, 2024

dtype=torch.float64 is overkill, and float64 is not defined for certain devices such as Apple Silicon mps.

What does this PR do?

Enables the Flux transformer to run on devices such as Apple Silicon mps by replacing float64 with float32, which does not negatively affect the output of the pipeline.

@jsmidt jsmidt merged commit c990396 into main Aug 9, 2024
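The kind of change this PR describes can be sketched as follows. This is an illustrative rotary-embedding-style helper, not the exact code in transformer_flux.py; the function name and signature are hypothetical. The point is the dtype: computing the frequency table in float32 instead of float64 keeps the code working on backends like mps, which do not support float64, at a precision cost that is negligible for this use.

```python
import torch

def rope(pos: torch.Tensor, dim: int, theta: float = 10000.0) -> torch.Tensor:
    # Hypothetical sketch of a rotary positional-embedding helper.
    # float32 here instead of float64: float64 is not implemented on
    # Apple Silicon's mps backend, and float32 is plenty of precision
    # for these frequency tables.
    scale = torch.arange(0, dim, 2, dtype=torch.float32, device=pos.device) / dim
    omega = 1.0 / (theta ** scale)
    # Outer product of positions and frequencies: (..., n, dim // 2)
    out = torch.einsum("...n,d->...nd", pos, omega)
    # Stack the 2x2 rotation entries along the last axis.
    return torch.stack([out.cos(), -out.sin(), out.sin(), out.cos()], dim=-1)
```

On a machine with an mps device, the same call works with `pos` on `device="mps"`, which would raise an error if the `arange` above used `dtype=torch.float64`.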