
Guiquan Liu

ChemEval: A Comprehensive Multi-Level Chemical Evaluation for Large Language Models

Sep 21, 2024

Editing Knowledge Representation of Language Model via Rephrased Prefix Prompts

Mar 21, 2024

Locating and Mitigating Gender Bias in Large Language Models

Mar 21, 2024

Hierarchy-Aware T5 with Path-Adaptive Mask Mechanism for Hierarchical Text Classification

Sep 17, 2021

Denoising User-aware Memory Network for Recommendation

Jul 12, 2021

LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding

Dec 14, 2020

Privacy Preserving PCA for Multiparty Modeling

Feb 09, 2020