Xiang Hu

Nova: An Iterative Planning and Search Approach to Enhance Novelty and Diversity of LLM Generated Ideas

Oct 18, 2024

Efficient Long-range Language Modeling with Self-supervised Causal Retrieval

Oct 02, 2024

Unsupervised Morphological Tree Tokenizer

Jun 21, 2024

Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale

Mar 18, 2024

Augmenting transformers with recursively composed multi-grained representations

Sep 28, 2023

A Multi-Grained Self-Interpretable Symbolic-Neural Model For Single/Multi-Labeled Text Classification

Mar 06, 2023

Fast-R2D2: A Pretrained Recursive Neural Network based on Pruned CKY for Grammar Induction and Text Representation

Mar 01, 2022

R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling

Jul 02, 2021

Interactive Question Clarification in Dialogue via Reinforcement Learning

Dec 17, 2020