
Alexander Min

Co-training and Co-distillation for Quality Improvement and Compression of Language Models

Nov 07, 2023

A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models

May 26, 2023