Shih-Yang Liu

EoRA: Training-free Compensation for Compressed LLM with Eigenspace Low-Rank Approximation

Oct 28, 2024

RoLoRA: Fine-tuning Rotated Outlier-free LLMs for Effective Weight-Activation Quantization

Jul 10, 2024

Genetic Quantization-Aware Approximation for Non-Linear Operations in Transformers

Mar 29, 2024

DoRA: Weight-Decomposed Low-Rank Adaptation

Feb 14, 2024 (see the sketch after this list)

Efficient Quantization-aware Training with Adaptive Coreset Selection

Jun 12, 2023

Oscillation-free Quantization for Low-bit Vision Transformers

Feb 04, 2023
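
The DoRA entry above names weight-decomposed low-rank adaptation. As a rough illustration of that idea, the sketch below recasts a frozen pretrained linear weight as a learnable column-wise magnitude times a direction, and trains only the magnitude plus a low-rank update to the direction. This is a minimal PyTorch sketch of the technique as the title describes it, not code from the paper; the class name, rank, and initialization details are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DoRALinearSketch(nn.Module):
    """Illustrative weight-decomposed low-rank adaptation of a frozen linear layer.

    The pretrained weight W0 is kept frozen; trainable parameters are a
    per-column magnitude vector and a low-rank update B @ A applied to the
    directional component. Hypothetical sketch, not the reference code.
    """

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        out_features, in_features = base.weight.shape
        # Frozen pretrained weight W0 (out_features x in_features).
        self.weight = nn.Parameter(base.weight.detach().clone(), requires_grad=False)
        self.bias = base.bias
        # Low-rank update: delta_W = B @ A. B starts at zero so the adapted
        # layer initially reproduces the pretrained weights exactly.
        self.A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, rank))
        # Learnable magnitude, initialized to the column-wise norm of W0.
        self.magnitude = nn.Parameter(self.weight.norm(p=2, dim=0, keepdim=True))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize the updated weight column-wise to get a direction,
        # then rescale each column by the learned magnitude.
        updated = self.weight + self.B @ self.A
        direction = updated / updated.norm(p=2, dim=0, keepdim=True)
        return F.linear(x, self.magnitude * direction, self.bias)


# Example usage: wrap a linear layer and fine-tune only magnitude, A, and B.
layer = DoRALinearSketch(nn.Linear(512, 512), rank=8)
y = layer(torch.randn(2, 512))
```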