Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization

Apr 08, 2019
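
The paper applies knowledge distillation to recurrent neural network language modeling and augments it with a trust regularization term (see the paper for that term's definition). For orientation only, below is a minimal sketch of the standard Hinton-style distillation objective for a recurrent language model: a student LSTM is trained on an interpolation of the hard-label cross entropy and the KL divergence to a larger teacher's temperature-softened predictions. It assumes PyTorch; the model sizes, `alpha`, and `temperature` are illustrative, and the trust regularization term itself is not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RNNLM(nn.Module):
    """A small LSTM language model: embedding -> LSTM -> vocabulary logits."""
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):                 # tokens: (batch, seq_len)
        hidden, _ = self.lstm(self.embed(tokens))
        return self.proj(hidden)               # (batch, seq_len, vocab_size)

def distillation_loss(student_logits, teacher_logits, targets,
                      alpha=0.5, temperature=2.0):
    """Interpolate hard-label cross entropy with KL to the teacher.

    `alpha` and `temperature` are illustrative defaults, not values
    taken from the paper.
    """
    vocab = student_logits.size(-1)
    s = student_logits.reshape(-1, vocab)
    t = teacher_logits.reshape(-1, vocab)
    hard = F.cross_entropy(s, targets.reshape(-1))
    soft = F.kl_div(F.log_softmax(s / temperature, dim=-1),
                    F.softmax(t / temperature, dim=-1),
                    reduction="batchmean") * temperature ** 2
    return alpha * hard + (1.0 - alpha) * soft

# Usage: distill a larger teacher into a smaller student on random data.
vocab_size = 10000
teacher = RNNLM(vocab_size, embed_dim=512, hidden_dim=1024).eval()
student = RNNLM(vocab_size, embed_dim=128, hidden_dim=256)

tokens  = torch.randint(0, vocab_size, (8, 35))   # input token ids
targets = torch.randint(0, vocab_size, (8, 35))   # next-token labels

with torch.no_grad():                             # teacher stays frozen
    teacher_logits = teacher(tokens)
loss = distillation_loss(student(tokens), teacher_logits, targets)
loss.backward()
```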

View paper on arXiv
