
Jingqi Tong

LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training

Jun 24, 2024

Exploring the Compositional Deficiency of Large Language Models in Mathematical Reasoning

May 05, 2024