This repository was archived by the owner on Feb 7, 2025. It is now read-only.

Add option to use flash attention #222

Merged
Warvito merged 5 commits into main from
210-add-option-to-use-memory-efficient-attention-to-the-diffusion-unet
on Feb 27, 2023
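This page only shows the merge metadata, but as a rough illustration of what an "option to use flash / memory-efficient attention" in a diffusion UNet typically looks like, here is a minimal sketch. It is not taken from this PR: the class name `SelfAttentionBlock`, the `use_flash_attention` flag, and the choice of PyTorch's `scaled_dot_product_attention` as the fused backend are all assumptions for illustration only.

```python
# Hypothetical sketch (not this PR's code): an attention block whose
# `use_flash_attention` flag switches between PyTorch's fused
# scaled_dot_product_attention kernel and a plain softmax(QK^T / sqrt(d)) V.
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttentionBlock(nn.Module):
    def __init__(self, channels: int, num_heads: int = 8, use_flash_attention: bool = False):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = channels // num_heads
        self.use_flash_attention = use_flash_attention
        self.to_qkv = nn.Linear(channels, channels * 3)
        self.to_out = nn.Linear(channels, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, sequence, channels)
        b, n, c = x.shape
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        # Reshape each projection to (batch, heads, sequence, head_dim).
        q, k, v = (t.view(b, n, self.num_heads, self.head_dim).transpose(1, 2) for t in (q, k, v))

        if self.use_flash_attention:
            # Fused, memory-efficient attention (available in PyTorch >= 2.0).
            out = F.scaled_dot_product_attention(q, k, v)
        else:
            # Plain attention: materialises the full (n x n) attention matrix.
            attn = torch.softmax(q @ k.transpose(-2, -1) / math.sqrt(self.head_dim), dim=-1)
            out = attn @ v

        out = out.transpose(1, 2).reshape(b, n, c)
        return self.to_out(out)


if __name__ == "__main__":
    x = torch.randn(2, 64, 128)
    block = SelfAttentionBlock(channels=128, num_heads=8, use_flash_attention=True)
    print(block(x).shape)  # torch.Size([2, 64, 128])
```

The design point such a flag illustrates is that the fused path avoids materialising the quadratic attention matrix, which matters for the long token sequences produced by high-resolution feature maps in a diffusion UNet, while the fallback keeps the model usable where a fused kernel is unavailable.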

Commits

Commits on Feb 4, 2023

Commits on Feb 26, 2023