Thomas Gueudre

Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks

Oct 11, 2022

Alexa Teacher Model: Pretraining and Distilling Multi-Billion-Parameter Encoders for Natural Language Understanding Systems

Jun 15, 2022