Zaida Zhou

InternLM2 Technical Report

Mar 26, 2024
Channel Distillation: Channel-Wise Attention for Knowledge Distillation

Jun 02, 2020