Gertjan van Noord

Groningen University

Are Character-level Translations Worth the Wait? An Extensive Comparison of Character- and Subword-level Models for Machine Translation

Feb 28, 2023

Subword-Delimited Downsampling for Better Character-Level Translation

Dec 02, 2022

Patching Leaks in the Charformer for Efficient Character-Level Generation

May 27, 2022

Hyper-X: A Unified Hypernetwork for Multi-Task Multilingual Transfer

May 24, 2022

The Importance of Context in Very Low Resource Language Modeling

May 10, 2022

Unsupervised Translation of German-Lower Sorbian: Exploring Training and Novel Transfer Methods on a Low-Resource Language

Sep 24, 2021

UDapter: Language Adaptation for Truly Universal Dependency Parsing

Apr 29, 2020

BERTje: A Dutch BERT Model

Dec 19, 2019

MoNoise: Modeling Noise Using a Modular Normalization System

Oct 10, 2017

Bilingual Learning of Multi-sense Embeddings with Discrete Autoencoders

Mar 30, 2016