Zongmin Yang

Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer

Apr 29, 2021

Spirit Distillation: Precise Real-time Semantic Segmentation of Road Scenes with Insufficient Data

Apr 17, 2021

Activation Map Adaptation for Effective Knowledge Distillation

Oct 26, 2020