
Haitao Lin

National Laboratory of Pattern Recognition, Institute of Automation, CAS, Beijing, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China

Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting

Sep 09, 2024

Polaris: Open-ended Interactive Robotic Manipulation via Syn2Real Visual Grounding and Large Language Models

Aug 15, 2024

LAC-Net: Linear-Fusion Attention-Guided Convolutional Network for Accurate Robotic Grasping Under the Occlusion

Aug 06, 2024

Teach Harder, Learn Poorer: Rethinking Hard Sample Distillation for GNN-to-MLP Knowledge Distillation

Jul 20, 2024

CBGBench: Fill in the Blank of Protein-Molecule Complex Binding Graph

Jun 16, 2024

Learning to Predict Mutation Effects of Protein-Protein Interactions by Microenvironment-aware Hierarchical Prompt Learning

May 16, 2024

Deep Lead Optimization: Leveraging Generative AI for Structural Modification

Apr 30, 2024

LongVQ: Long Sequence Modeling with Vector Quantization on Structured Memory

Apr 18, 2024

A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation

Mar 06, 2024

Decoupling Weighing and Selecting for Integrating Multiple Graph Pre-training Tasks

Mar 03, 2024