NakYil Kim

Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation

May 19, 2021
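
The title names the two distillation objectives being compared. As a point of reference, here is a minimal sketch of their standard formulations in PyTorch; the temperature `T`, the `T**2` scaling on the KL term, and the choice to apply MSE to raw logits are conventional choices from the knowledge-distillation literature, not details taken from this listing:

```python
import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits, teacher_logits, T=4.0):
    """Standard KD loss: KL divergence between temperature-softened
    teacher and student class distributions, scaled by T**2."""
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)

def kd_mse_loss(student_logits, teacher_logits):
    """Direct logit matching: mean squared error between raw logits."""
    return F.mse_loss(student_logits, teacher_logits)

# Usage: compare the two objectives on dummy logits
student = torch.randn(8, 10)  # batch of 8, 10 classes
teacher = torch.randn(8, 10)
print(kd_kl_loss(student, teacher), kd_mse_loss(student, teacher))
```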