This repository was archived by the owner on Feb 7, 2025. It is now read-only.

Conversation

@Warvito (Collaborator) commented on Mar 5, 2023

Fix #283

Warvito added 2 commits on March 5, 2023, both signed off by Walter Hugo Lopez Pinaya <ianonimato@hotmail.com>.
Warvito linked an issue on Mar 5, 2023 that may be closed by this pull request.
@Warvito (Collaborator, Author) commented on Mar 5, 2023

Found a problem with the flag in the diffusion model as well; fixed it here.
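
For anyone picking this up later, here is a rough sketch of how the new option might be exercised once this change is merged. This is not code from the PR: the constructor argument names (including `use_flash_attention`) and the example shapes are assumptions about the GenerativeModels API at the time and should be checked against the merged code.

```python
# Minimal usage sketch (not part of this PR): enabling the flash-attention option
# on AutoencoderKL and the corresponding flag on DiffusionModelUNet.
# Argument names and values are assumptions and may not match the merged code exactly.
import torch

from generative.networks.nets import AutoencoderKL, DiffusionModelUNet

autoencoder = AutoencoderKL(
    spatial_dims=2,
    in_channels=1,
    out_channels=1,
    num_channels=(64, 128, 128),
    latent_channels=3,
    num_res_blocks=1,
    attention_levels=(False, False, True),
    use_flash_attention=True,  # option added for the autoencoder by this PR (assumed name)
)

unet = DiffusionModelUNet(
    spatial_dims=2,
    in_channels=3,
    out_channels=3,
    num_res_blocks=1,
    num_channels=(128, 256, 256),
    attention_levels=(False, True, True),
    use_flash_attention=True,  # flag whose handling in the diffusion model is also fixed here
)

# Flash attention generally requires a CUDA device (and a memory-efficient
# attention backend installed), so only run the smoke test when a GPU is available.
if torch.cuda.is_available():
    autoencoder = autoencoder.to("cuda")
    x = torch.randn(1, 1, 64, 64, device="cuda")
    reconstruction, z_mu, z_sigma = autoencoder(x)
```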

Warvito merged commit 37d2953 into main on Mar 5, 2023.
Warvito deleted the 283-add-flash-attention-option-to-autoencoderkl branch on March 18, 2023.


Development

Successfully merging this pull request may close these issues.

Add flash attention option to AutoencoderKL

2 participants