Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer

Apr 29, 2021

View paper on arXiv