
Jianxin Wu

All You Need in Knowledge Distillation Is a Tailored Coordinate System

Dec 12, 2024

Quantization without Tears

Nov 22, 2024

Diffusion Product Quantization

Nov 19, 2024

Minimal Interaction Edge Tuning: A New Paradigm for Visual Adaptation

Jun 26, 2024

Effectively Compress KV Heads for LLM

Jun 11, 2024

Unified Low-rank Compression Framework for Click-through Rate Prediction

May 28, 2024

On Improving the Algorithm-, Model-, and Data- Efficiency of Self-Supervised Learning

Apr 30, 2024

Dense Vision Transformer Compression with Few Samples

Mar 27, 2024

DiffuLT: How to Make Diffusion Model Useful for Long-tail Recognition

Mar 08, 2024

Low-rank Attention Side-Tuning for Parameter-Efficient Fine-Tuning

Feb 06, 2024