Following [flash-attention](https://github.com/Dao-AILab/flash-attention), I modified the code to support flash-attn v2.0.1.
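As a rough illustration of the kind of change this involves (not the repo's actual diff): flash-attn v2 renamed its main entry points, so v1-style call sites need updating. A minimal sketch, assuming flash-attn >= 2.0.1 is installed and the tensors are fp16/bf16 on a CUDA device:

```python
import torch
from flash_attn import flash_attn_func  # v2 API; v1 exposed unpadded variants instead

batch, seqlen, nheads, headdim = 2, 128, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# flash_attn_func takes q/k/v in (batch, seqlen, nheads, headdim) layout
# and returns the attention output in the same layout.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # torch.Size([2, 128, 8, 64])
```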