
Jingqi Tong

LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training

Jun 24, 2024

Exploring the Compositional Deficiency of Large Language Models in Mathematical Reasoning

May 05, 2024