Nadezhda Chirkova

HSE University, Russia

BERGEN: A Benchmarking Library for Retrieval-Augmented Generation

Jul 01, 2024

Investigating the potential of Sparse Mixtures-of-Experts for multi-domain neural machine translation

Jul 01, 2024

Retrieval-augmented generation in multilingual settings

Jul 01, 2024

Zero-shot cross-lingual transfer in instruction tuning of large language models

Feb 22, 2024

Key ingredients for effective zero-shot cross-lingual knowledge transfer in generative tasks

Feb 19, 2024

Empirical study of pretrained multilingual language models for zero-shot cross-lingual generation

Oct 15, 2023

CodeBPE: Investigating Subtokenization Options for Large Language Model Pretraining on Source Code

Aug 01, 2023

Should you marginalize over possible tokenizations?

Jun 30, 2023

Parameter-Efficient Finetuning of Transformers for Source Code

Dec 12, 2022

Probing Pretrained Models of Source Code

Feb 16, 2022