
Zelin Yao

DA-MoE: Addressing Depth-Sensitivity in Graph-Level Analysis through Mixture of Experts

Nov 05, 2024

Dual-perspective Cross Contrastive Learning in Graph Transformers

Jun 01, 2024

Hi-GMAE: Hierarchical Graph Masked Autoencoders

May 17, 2024

Gradformer: Graph Transformer with Exponential Decay

Apr 24, 2024