Xiangnan Chen

Negative Sampling with Adaptive Denoising Mixup for Knowledge Graph Embedding
Oct 15, 2023

Global Structure Knowledge-Guided Relation Extraction Method for Visually-Rich Document
May 23, 2023

Meta-Learning Based Knowledge Extrapolation for Knowledge Graphs in the Federated Setting
May 10, 2022

NeuralKG: An Open Source Library for Diverse Representation Learning of Knowledge Graphs
Feb 25, 2022

ZJUKLAB at SemEval-2021 Task 4: Negative Augmentation with Language Model for Reading Comprehension of Abstract Meaning
Feb 25, 2021