Glorianna Jagfeld

Understanding who uses Reddit: Profiling individuals with a self-reported bipolar disorder diagnosis

Apr 23, 2021

A computational linguistic study of personal recovery in bipolar disorder

Jun 03, 2019

Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity

Oct 11, 2018

Comparing Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension

Aug 27, 2018

Encoding Word Confusion Networks with Recurrent Neural Networks for Dialog State Tracking

Aug 09, 2017