On the importance of pre-training data volume for compact language models

Oct 09, 2020
