Shaojie Jiang

A Simple Contrastive Learning Objective for Alleviating Neural Text Degeneration
May 19, 2022

TLDR: Token Loss Dynamic Reweighting for Reducing Repetitive Utterance Generation
Apr 09, 2020

Improving Neural Response Diversity with Frequency-Aware Cross-Entropy Loss
Feb 25, 2019

Why are Sequence-to-Sequence Models So Dull? Understanding the Low-Diversity Problem of Chatbots
Sep 06, 2018