TwHIN-BERT: A Socially-Enriched Pre-trained Language Model for Multilingual Tweet Representations

Sep 15, 2022


View paper on arXiv