LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding

Dec 14, 2020


View paper on arXiv
