Pull requests: Dao-AILab/flash-attention

[Cute, Bwd, Sm100] Add varlen for sm100 bwd
#2150 opened Jan 8, 2026 by jayhshah

[Cute] Fix two tests that were failing
#2149 opened Jan 8, 2026 by henrylhtsang

[Cute] Update deprecated cute DSL APIs
#2148 opened Jan 7, 2026 by henrylhtsang

[CUTE][SM90] Enable pack-gqa with broadcasted maskmods
#2145 opened Jan 7, 2026 by drisspg

[Cute] Clarify and fix subtle cachekey bug
#2143 opened Jan 6, 2026 by drisspg

score-mod backward SM90
#2137 opened Jan 5, 2026 by drisspg

block-sparse backward SM90
#2136 opened Jan 5, 2026 by drisspg

[Cute,Fwd,Sm120] FA Cute DSL sm12x
#2113 opened Dec 31, 2025 by johnnynunez (Draft)

[Cute,Fwd,Sm100] fp8 e4m3 and e5m2 support
#2109 opened Dec 29, 2025 by dcw02

refactor llama test
#2107 opened Dec 29, 2025 by m3ngyang

[Fix typos] remove redundant double semicolons
#2106 opened Dec 29, 2025 by kisseternity

[Cute] Fix: arg pass in cute flash-attn inferface
#2101 opened Dec 27, 2025 by SeanLi-OI

Fix softmax incorrect row_max issue
#2083 opened Dec 17, 2025 by imbr92

Fix TypeError when ColumnParallelLinear is None
#2080 opened Dec 17, 2025 by ailuntz

Reduce Chance of Build OOM
#2079 opened Dec 17, 2025 by Qubitium

Add missing code highlighting to the README
#2061 opened Dec 10, 2025 by bryant1410

Update README.md
#2058 opened Dec 10, 2025 by eduardoruiz1999