Dario Stojanovski

Language-Family Adapters for Multilingual Neural Machine Translation

Sep 30, 2022

Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation

Apr 14, 2021

The LMU Munich System for the WMT 2020 Unsupervised Machine Translation Shared Task

Oct 25, 2020

Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT

Oct 06, 2020

Addressing Zero-Resource Domains Using Document-Level Context in Neural Machine Translation

Apr 30, 2020