Maciej Pióro

Joint MoE Scaling Laws: Mixture of Experts Can Be Memory Efficient
Feb 07, 2025

State Soup: In-Context Skill Learning, Retrieval and Mixing
Jun 12, 2024

Scaling Laws for Fine-Grained Mixture of Experts
Feb 12, 2024

MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts
Jan 08, 2024

Mixture of Tokens: Efficient LLMs through Cross-Example Aggregation
Oct 24, 2023