Young Jin Kim

Division of Cardiovascular Medicine, Radcliffe Department of Medicine, University of Oxford; Department of Radiology, Severance Hospital, South Korea

GRIN: GRadient-INformed MoE

Sep 18, 2024

Contrastive Preference Optimization: Pushing the Boundaries of LLM Performance in Machine Translation

Feb 02, 2024

PEMA: Plug-in External Memory Adaptation for Language Models

Nov 14, 2023

Mixture of Quantized Experts (MoQE): Complementary Effect of Low-bit Quantization and Robustness

Oct 03, 2023

A Paradigm Shift in Machine Translation: Boosting Translation Performance of Large Language Models

Sep 20, 2023

Task-Based MoE for Multitask Multilingual Machine Translation

Sep 11, 2023

FineQuant: Unlocking Efficiency with Fine-Grained Weight-Only Quantization for LLMs

Aug 16, 2023

How Good Are GPT Models at Machine Translation? A Comprehensive Evaluation

Feb 18, 2023

Who Says Elephants Can't Run: Bringing Large Scale MoE Models into Cloud Scale Production

Nov 18, 2022

AutoMoE: Neural Architecture Search for Efficient Sparsely Activated Transformers

Oct 14, 2022