Haoyu Sheng

Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization

Apr 08, 2019