Weigao Sun

MoM: Linear Sequence Modeling with Mixture-of-Memories

Feb 19, 2025

LASP-2: Rethinking Sequence Parallelism for Linear Attention and Its Hybrid

Feb 11, 2025

MiniMax-01: Scaling Foundation Models with Lightning Attention

Jan 14, 2025

LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training

Nov 24, 2024

Scaling Laws for Linear Complexity Language Models

Jun 24, 2024

Various Lengths, Constant Speed: Efficient Language Modeling with Lightning Attention

May 27, 2024

Unlocking the Secrets of Linear Complexity Sequence Model from A Unified Perspective

May 27, 2024

HGRN2: Gated Linear RNNs with State Expansion

Apr 11, 2024

Linear Attention Sequence Parallelism

Apr 03, 2024

MS-Net: A Multi-Path Sparse Model for Motion Prediction in Multi-Scenes

Mar 01, 2024