[shardformer] llama support flash attention #4185
Merged: FrankLeeeee merged 10 commits into hpcaitech:feature/flash-attention-shardformer on Jul 6, 2023.
Commits
Commits on Jul 4, 2023
Commits on Jul 5, 2023
Commits on Jul 6, 2023
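The PR above adds flash attention support for LLaMA in ShardFormer. The PR's own implementation is not shown here; as a minimal sketch of the underlying idea, PyTorch's fused `scaled_dot_product_attention` kernel (available since PyTorch 2.0) can dispatch to a FlashAttention backend, replacing the naive attention computation in a decoder-only model:

```python
import torch
import torch.nn.functional as F

# Hypothetical illustration, not the PR's actual code: flash-style
# attention via PyTorch's fused scaled_dot_product_attention kernel.
batch, heads, seq, dim = 2, 4, 16, 8
q = torch.randn(batch, heads, seq, dim)
k = torch.randn(batch, heads, seq, dim)
v = torch.randn(batch, heads, seq, dim)

# is_causal=True applies the causal mask used by decoder-only models
# such as LLaMA; the kernel never materializes the full attention matrix.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)
```

The fused call is numerically equivalent to the textbook formulation `softmax(QK^T / sqrt(d) + causal_mask) V`, but avoids allocating the `seq x seq` score matrix, which is the memory saving flash attention provides.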