A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training

May 03, 2023


View paper on arXiv
