Md Shamim Hussain

Triplet Interaction Improves Graph Transformers: Accurate Molecular Graph Learning with Triplet Graph Transformers

Feb 07, 2024

The Information Pathways Hypothesis: Transformers are Dynamic Self-Ensembles

Jun 02, 2023

Edge-augmented Graph Transformers: Global Self-attention is Enough for Graphs

Aug 07, 2021