This repository was archived by the owner on Feb 7, 2025. It is now read-only.

Add flash attention option to AutoencoderKL #283

@Warvito

Description

Similar to #210, add an option to enable or disable efficient (flash) attention in AutoencoderKL.
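Such an option would presumably be a flag on the attention block that switches between standard attention and a memory-efficient kernel. The sketch below is a pure-Python illustration (not the AutoencoderKL implementation); the function and parameter names (`attend`, `use_efficient_attention`) are hypothetical. It contrasts naive attention, which materializes the full score row per query, with the blocked online-softmax accumulation that flash attention uses; a real implementation would instead dispatch to a fused kernel such as xFormers' or PyTorch's scaled dot-product attention.

```python
import math

def _softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_naive(q, k, v):
    """Standard attention: full score row per query, then softmax-weighted sum."""
    d = len(q[0])
    out = []
    for qi in q:
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        w = _softmax(scores)
        out.append([sum(wj * vj[t] for wj, vj in zip(w, v)) for t in range(len(v[0]))])
    return out

def attention_flash(q, k, v, block=2):
    """Flash-style attention: process keys in blocks with an online softmax,
    keeping only a running max, normalizer, and weighted sum per query."""
    d = len(q[0])
    out = []
    for qi in q:
        m = -math.inf          # running max of scores seen so far
        l = 0.0                # running softmax normalizer
        acc = [0.0] * len(v[0])  # running weighted sum of values
        for start in range(0, len(k), block):
            kb, vb = k[start:start + block], v[start:start + block]
            scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in kb]
            m_new = max(m, max(scores))
            # rescale previous accumulators to the new max
            scale = math.exp(m - m_new) if m != -math.inf else 0.0
            l *= scale
            acc = [a * scale for a in acc]
            for s, vj in zip(scores, vb):
                p = math.exp(s - m_new)
                l += p
                acc = [a + p * x for a, x in zip(acc, vj)]
            m = m_new
        out.append([a / l for a in acc])
    return out

def attend(q, k, v, use_efficient_attention=False):
    # hypothetical toggle mirroring the option requested for AutoencoderKL
    return attention_flash(q, k, v) if use_efficient_attention else attention_naive(q, k, v)
```

Both paths compute the same result; the blocked version simply never holds the full score matrix, which is what makes the efficient path attractive for the large spatial attention maps inside an autoencoder.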
