
Mehdi Ben Amor

Mixture of Modular Experts: Distilling Knowledge from a Multilingual Teacher into Specialized Modular Language Models

Jul 28, 2024

Towards Measuring Representational Similarity of Large Language Models

Dec 05, 2023

Impact of Position Bias on Language Models in Token Classification

Apr 26, 2023