PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation

Jun 25, 2021
Figures 1–3 from the paper (images not reproduced here).
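The page itself is only a stub, but the three techniques named in the title are standard compression methods. Below is a minimal, generic sketch of each in NumPy; this is not the paper's actual PQK pipeline, and the function names (`magnitude_prune`, `quantize_uniform`, `kd_soft_loss`) are illustrative, not from the paper:

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Pruning: zero out the smallest-magnitude fraction of weights."""
    k = int(w.size * sparsity)
    if k == 0:
        return w.copy()
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    return np.where(np.abs(w) <= thresh, 0.0, w)

def quantize_uniform(w, bits):
    """Quantization: snap weights to a symmetric uniform grid of 2^bits levels."""
    scale = np.max(np.abs(w)) / (2 ** (bits - 1) - 1)
    if scale == 0:
        return w.copy()
    return np.round(w / scale) * scale

def kd_soft_loss(student_logits, teacher_logits, T=4.0):
    """Knowledge distillation: cross-entropy between temperature-softened
    teacher and student distributions, scaled by T^2 (Hinton et al. style)."""
    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()
    p = softmax(teacher_logits / T)   # soft teacher targets
    q = softmax(student_logits / T)   # soft student predictions
    return -np.sum(p * np.log(q + 1e-12)) * T * T
```

A compression pipeline in this spirit would prune a trained network, quantize the surviving weights, and fine-tune the compact model against a teacher's soft targets; consult the paper for how PQK actually orders and combines these steps.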


View paper on arXiv.
