Attentive Student Meets Multi-Task Teacher: Improved Knowledge Distillation for Pretrained Models

Nov 09, 2019
