Nan Hua

Equipping Transformer with Random-Access Reading for Long-Context Understanding

May 21, 2024

Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context

Mar 08, 2024

Attendre: Wait To Attend By Retrieval With Evicted Queries in Memory-Based Transformers for Long Context Processing

Jan 10, 2024

Gemini: A Family of Highly Capable Multimodal Models

Dec 19, 2023

LMDX: Language Model-based Document Information Extraction and Localization

Sep 19, 2023

FormNetV2: Multimodal Graph Contrastive Learning for Form Document Information Extraction

May 04, 2023

Protoformer: Embedding Prototypes for Transformers

Jun 25, 2022

FormNet: Structural Encoding beyond Sequential Modeling in Form Document Information Extraction

Mar 24, 2022

Pruning Redundant Mappings in Transformer Models via Spectral-Normalized Identity Prior

Oct 05, 2020

Universal Sentence Encoder

Apr 12, 2018