Anjali Sridhar

Towards MoE Deployment: Mitigating Inefficiencies in Mixture-of-Expert (MoE) Inference

Mar 10, 2023

OPT: Open Pre-trained Transformer Language Models

May 05, 2022