diffusers currently supports the following PT 2.0 variants of attention processors:
- AttnProcessor => AttnProcessor2_0
- AttnAddedKVProcessor => AttnAddedKVProcessor2_0
The following processors do not yet have PT 2.0 variants:
- SlicedAttnProcessor
- SlicedAttnAddedKVProcessor
- LoRAAttnProcessor
- CustomDiffusionAttnProcessor
We should add SDPA versions of the processors above. This would essentially eliminate the need to use xformers.
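For context, the PT 2.0 variants are built around `torch.nn.functional.scaled_dot_product_attention`, which dispatches to fused (e.g. Flash/memory-efficient) kernels when available. A minimal sketch, independent of the diffusers processor classes, showing that SDPA computes the same result as the manual attention math the non-SDPA processors implement:

```python
import torch
import torch.nn.functional as F

def manual_attention(q, k, v):
    # Classic attention: softmax(Q K^T / sqrt(d)) V, materializing
    # the full attention matrix (what the non-SDPA processors do).
    scale = q.shape[-1] ** -0.5
    attn = (q @ k.transpose(-2, -1) * scale).softmax(dim=-1)
    return attn @ v

torch.manual_seed(0)
# (batch, heads, seq_len, head_dim) — illustrative shapes only
q = torch.randn(2, 8, 64, 40)
k = torch.randn(2, 8, 64, 40)
v = torch.randn(2, 8, 64, 40)

out_sdpa = F.scaled_dot_product_attention(q, k, v)
out_manual = manual_attention(q, k, v)
assert torch.allclose(out_sdpa, out_manual, atol=1e-5)
```

An SDPA version of each listed processor would replace its manual softmax-attention (or sliced/xformers) computation with this single call, keeping the surrounding projection and LoRA/custom-diffusion logic unchanged.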