Taiqiang Wu

MCUBERT: Memory-Efficient BERT Inference on Commodity Microcontrollers

Oct 23, 2024

A Survey on the Honesty of Large Language Models

Sep 27, 2024

LoCa: Logit Calibration for Knowledge Distillation

Sep 07, 2024

Mixture-of-Subspaces in Low-Rank Adaptation

Jun 16, 2024
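
For orientation only: this paper extends low-rank adaptation (LoRA). The sketch below is a vanilla LoRA linear layer in PyTorch, a frozen base weight plus a trainable low-rank update scaled by alpha/r; the class name LoRALinear and the default rank are illustrative choices, and the mixture-of-subspaces mechanism proposed in the paper is not shown.

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    # y = x W^T + (alpha / r) * x A^T B^T, with W frozen and only A, B trained.
    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)                      # pretrained weight stays frozen
        self.A = nn.Parameter(torch.randn(r, in_features) * 0.01)   # down-projection
        self.B = nn.Parameter(torch.zeros(out_features, r))         # up-projection, zero-initialized
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * (x @ self.A.T @ self.B.T)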

Unchosen Experts Can Contribute Too: Unleashing MoE Models' Power by Self-Contrast

May 23, 2024

Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large Language Models

Apr 03, 2024
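
For orientation only: this paper revisits the Kullback-Leibler objective commonly used to distill large language models. The sketch below shows the standard forward-KL distillation loss that such work builds on, in PyTorch; the function name kd_kl_loss and the default temperature are illustrative, and this is not the variant proposed in the paper.

import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits: torch.Tensor,
               teacher_logits: torch.Tensor,
               temperature: float = 2.0) -> torch.Tensor:
    # Standard forward KL(p_teacher || p_student) on temperature-softened
    # distributions, rescaled by T^2 so gradient magnitudes stay comparable.
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Typical use inside a training step (teacher outputs detached):
# loss = kd_kl_loss(student(input_ids).logits, teacher(input_ids).logits.detach())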

Recouple Event Field via Probabilistic Bias for Event Extraction

May 19, 2023

Weight-Inherited Distillation for Task-Agnostic BERT Compression

May 16, 2023

RIFormer: Keep Your Vision Backbone Effective While Removing Token Mixer

Apr 12, 2023

Edge-free but Structure-aware: Prototype-Guided Knowledge Distillation from GNNs to MLPs

Mar 27, 2023