
Ikuya Yamada

LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation

Feb 18, 2024

Arukikata Travelogue Dataset with Geographic Entity Mention, Coreference, and Link Annotation

May 23, 2023

MIA 2022 Shared Task: Evaluating Cross-lingual Open-Retrieval Question Answering for 16 Diverse Languages

Jul 02, 2022

EASE: Entity-Aware Contrastive Learning of Sentence Embedding

May 09, 2022

mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models

Oct 15, 2021

A Multilingual Bag-of-Entities Model for Zero-Shot Cross-Lingual Text Classification

Oct 15, 2021

Efficient Passage Retrieval with Hashing for Open-domain Question Answering

Jun 02, 2021

NeurIPS 2020 EfficientQA Competition: Systems, Analyses and Lessons Learned

Jan 01, 2021

LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention

Oct 02, 2020

Neural Attentive Bag-of-Entities Model for Text Classification

Sep 10, 2019