Laura Pérez-Mayos

How much pretraining data do language models need to learn syntax?

Sep 09, 2021

Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models

May 10, 2021

On the Evolution of Syntactic Information Encoded by BERT's Contextualized Representations

Feb 10, 2021