MiniPLM: Knowledge Distillation for Pre-Training Language Models

Oct 22, 2024

View paper on arXiv