MixCE: Training Autoregressive Language Models by Mixing Forward and Reverse Cross-Entropies

May 26, 2023

View paper on arXiv