Universal Graph Transformer Self-Attention Networks

The Web Conference (WWW), 2019
Abstract

The transformer has been widely adopted in research domains such as computer vision, image processing, and natural language processing, but it has seen little use in graph neural networks. To address this gap, we introduce UGformer, a transformer-based GNN model for learning graph representations. Given an input graph, we present two UGformer variants: the first leverages the transformer on a set of sampled neighbors for each node, while the second leverages the transformer directly on the input graph. Experimental results demonstrate that UGformer achieves state-of-the-art accuracies on well-known benchmark datasets for graph classification and inductive text classification. The code is available on GitHub: https://github.com/daiquocnguyen/Graph-Transformer.
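For intuition, below is a minimal PyTorch sketch of the first variant: self-attention over a short sequence formed by each node and a fixed number of its sampled neighbors. The class name `UGformerLayerSketch`, the `neighbor_idx` sampling interface, and all hyperparameters are illustrative assumptions, not the authors' reference implementation (see the GitHub link above for that).

```python
import torch
import torch.nn as nn

class UGformerLayerSketch(nn.Module):
    """Sketch of UGformer Variant 1: a transformer encoder applied to
    [node, sampled neighbors] sequences. Hyperparameters are illustrative."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        encoder_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=1)

    def forward(self, x: torch.Tensor, neighbor_idx: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, dim] node features
        # neighbor_idx: [num_nodes, k] indices of k sampled neighbors per node
        # Build one sequence per node: the node itself followed by its neighbors.
        seq = torch.cat([x.unsqueeze(1), x[neighbor_idx]], dim=1)  # [N, 1+k, dim]
        seq = seq.transpose(0, 1)        # [1+k, N, dim] (sequence-first layout)
        out = self.encoder(seq)          # self-attention within each sequence
        return out[0]                    # updated representation of each node

# Toy usage with random features and neighbor samples (illustrative only):
x = torch.randn(10, 32)                        # 10 nodes, 32-dim features
neighbor_idx = torch.randint(0, 10, (10, 4))   # 4 sampled neighbors per node
layer = UGformerLayerSketch(dim=32)
h = layer(x, neighbor_idx)                     # [10, 32] new node embeddings
```

Stacking several such layers, with neighbors resampled per layer, yields node embeddings that can be pooled for graph classification; the second variant would instead attend over all nodes of the input graph at once.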
