
Cheyn Shmuel Shmidman

Do Pretrained Contextual Language Models Distinguish between Hebrew Homograph Analyses?

May 11, 2024

Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All

Nov 28, 2022

Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language

Aug 03, 2022