Ryan McDonald

Sparse Rewards Can Self-Train Dialogue Agents

Sep 06, 2024

Multi-Step Dialogue Workflow Action Prediction

Nov 16, 2023

HeaP: Hierarchical Policies for Web Actions using LLMs

Oct 05, 2023

On the Effectiveness of Offline RL for Dialogue Response Generation

Jul 23, 2023

Long-term Control for Dialogue Generation: Methods and Evaluation

May 15, 2022

Wav2Seq: Pre-training Speech-to-Text Encoder-Decoder Models Using Pseudo Languages

May 02, 2022

Focus Attention: Promoting Faithfulness and Diversity in Summarization

May 25, 2021

Planning with Entity Chains for Abstractive Summarization

Apr 15, 2021

Stepwise Extractive Summarization and Planning with Structured Transformers

Oct 06, 2020

RRF102: Meeting the TREC-COVID Challenge with a 100+ Runs Ensemble

Oct 01, 2020