Edge-augmented Graph Transformers: Global Self-attention is Enough for Graphs

Aug 07, 2021
[Figures 1–4 from "Edge-augmented Graph Transformers: Global Self-attention is Enough for Graphs"]

View paper on arXiv