Weiwei Lü

Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models

Jun 03, 2024

Skywork: A More Open Bilingual Foundation Model

Oct 30, 2023