Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large Language Models

Apr 03, 2024

View paper on arXiv
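
As context for the title: knowledge distillation for LLMs typically minimizes a KL divergence between the teacher's and student's token distributions, and the choice between forward KL (mean-seeking) and reverse KL (mode-seeking) is the design question this line of work revisits. Below is a minimal PyTorch sketch of both objectives; the function names, tensor shapes, and temperature handling are illustrative assumptions, not the paper's implementation.

```python
# Generic sketch of forward vs. reverse KL as distillation losses.
# Illustrative only -- not the method proposed in the paper above.
import torch
import torch.nn.functional as F

def forward_kl(student_logits, teacher_logits, T=1.0):
    """KL(teacher || student): the standard, mean-seeking KD loss."""
    t = F.softmax(teacher_logits / T, dim=-1)
    log_s = F.log_softmax(student_logits / T, dim=-1)
    # F.kl_div expects log-probs as input and probs as target.
    return F.kl_div(log_s, t, reduction="batchmean") * T * T

def reverse_kl(student_logits, teacher_logits, T=1.0):
    """KL(student || teacher): mode-seeking, often proposed for LLM KD."""
    log_t = F.log_softmax(teacher_logits / T, dim=-1)
    log_s = F.log_softmax(student_logits / T, dim=-1)
    s = log_s.exp()
    return (s * (log_s - log_t)).sum(dim=-1).mean() * T * T

# Example with assumed sizes: 4 token positions, 32k-token vocabulary.
student = torch.randn(4, 32000, requires_grad=True)
teacher = torch.randn(4, 32000)
loss = forward_kl(student, teacher)
loss.backward()
```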
