
Nam Trung Pham

Condensed Sample-Guided Model Inversion for Knowledge Distillation

Aug 25, 2024

Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay

Jan 09, 2022

Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data

Aug 11, 2021