Talking Models: Distill Pre-trained Knowledge to Downstream Models via Interactive Communication

Oct 04, 2023

View paper on arXiv
