SHUOWEN-JIEZI: Linguistically Informed Tokenizers For Chinese Language Model Pretraining

Jun 01, 2021

View paper on arXiv