Self-Distillation for Model Stacking Unlocks Cross-Lingual NLU in 200+ Languages

Jun 18, 2024

View paper on arXiv
