Meta-KD: A Meta Knowledge Distillation Framework for Language Model Compression across Domains

Dec 02, 2020
[Figures 1–4 from the paper]

View paper on arXiv