Hybrid Focal and Full-Range Attention Based Graph Transformers
Minhong Zhu, Zhenhao Zhao, Weiran Cai
8 November 2023 · arXiv:2311.04653
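
The listing itself gives no method details beyond the title, but as a rough illustration of what a hybrid of focal and full-range attention could look like inside a graph transformer layer, here is a minimal PyTorch sketch. It assumes the "focal" branch is attention masked to each node's graph neighborhood and the "full-range" branch is plain attention over all node pairs, mixed by a hypothetical weight `alpha`; the class name, the mixing scheme, and `alpha` are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: "focal" attention is approximated as attention masked
# to the graph neighborhood (adjacency), "full-range" attention as unmasked
# attention over all nodes; `alpha` is a hypothetical mixing hyperparameter.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HybridFocalFullRangeAttention(nn.Module):
    """Single-head attention over node features that mixes a neighborhood-masked
    ("focal") branch with an unmasked ("full-range") branch."""

    def __init__(self, dim: int, alpha: float = 0.5):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.alpha = alpha          # hypothetical weight between the two branches
        self.scale = dim ** -0.5    # standard dot-product attention scaling

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (num_nodes, dim) node features
        # adj: (num_nodes, num_nodes) binary adjacency with 1s on the diagonal
        scores = (self.q(x) @ self.k(x).T) * self.scale

        # Focal branch: restrict attention to graph neighbors.
        focal_scores = scores.masked_fill(adj == 0, float("-inf"))
        focal_out = F.softmax(focal_scores, dim=-1) @ self.v(x)

        # Full-range branch: attention over all node pairs.
        full_out = F.softmax(scores, dim=-1) @ self.v(x)

        return self.alpha * focal_out + (1.0 - self.alpha) * full_out


# Minimal usage on a toy 4-node graph.
if __name__ == "__main__":
    x = torch.randn(4, 16)
    adj = torch.eye(4)
    adj[0, 1] = adj[1, 0] = 1.0
    adj[1, 2] = adj[2, 1] = 1.0
    layer = HybridFocalFullRangeAttention(dim=16)
    print(layer(x, adj).shape)  # torch.Size([4, 16])
```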

Papers citing "Hybrid Focal and Full-Range Attention Based Graph Transformers" (5 papers)

Your Transformer May Not be as Powerful as You Expect
Shengjie Luo, Shanda Li, Shuxin Zheng, Tie-Yan Liu, Liwei Wang, Di He
26 May 2022

GRPE: Relative Positional Encoding for Graph Transformer
Wonpyo Park, Woonggi Chang, Donggeon Lee, Juntae Kim, Seung-won Hwang
30 Jan 2022

Benchmarking Graph Neural Networks
Vijay Prakash Dwivedi, Chaitanya K. Joshi, Anh Tuan Luu, T. Laurent, Yoshua Bengio, Xavier Bresson
02 Mar 2020

Geom-GCN: Geometric Graph Convolutional Networks
Hongbin Pei, Bingzhen Wei, Kevin Chen-Chuan Chang, Yu Lei, Bo Yang
GNN · 13 Feb 2020

Multi-scale Attributed Node Embedding
Benedek Rozemberczki, Carl Allen, Rik Sarkar
GNN · 28 Sep 2019