Kichun Lee

Knowledge Distillation for BERT Unsupervised Domain Adaptation

Oct 23, 2020
[Figures 1–4 for Knowledge Distillation for BERT Unsupervised Domain Adaptation]