
Joshua Guedalia

Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All

Nov 28, 2022

Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language

Aug 03, 2022

A Novel Challenge Set for Hebrew Morphological Disambiguation and Diacritics Restoration

Oct 06, 2020