He Yan

CNNSum: Exploring Long-Context Summarization with Large Language Models in Chinese Novels
Dec 11, 2024

LIFBench: Evaluating the Instruction Following Performance and Stability of Large Language Models in Long-Context Scenarios
Nov 11, 2024

Enhancing SPARQL Generation by Triplet-order-sensitive Pre-training
Oct 08, 2024

Inherent limitations of LLMs regarding spatial information
Dec 05, 2023

ClothFormer: Taming Video Virtual Try-on in All Module
Apr 26, 2022

Unknown Identity Rejection Loss: Utilizing Unlabeled Data for Face Recognition
Oct 24, 2019

iQIYI-VID: A Large Dataset for Multi-modal Person Identification
Nov 19, 2018

Learning Latent Events from Network Message Logs: A Decomposition Based Approach
Apr 10, 2018