Qixin Chang

FastAttention: Extend FlashAttention2 to NPUs and Low-resource GPUs

Oct 22, 2024