Graph-to-Graph Transformer for Transition-based Dependency Parsing

Findings, 2019
Abstract

We propose the Graph2Graph Transformer architecture for conditioning on and predicting arbitrary graphs, and apply it to the challenging task of transition-based dependency parsing. After proposing a novel Transformer model for transition-based dependency parsing, we show that the proposed mechanisms for graph input and graph output result in significant improvements over this strong baseline, especially with BERT pre-training. Both the novel Transformer and the Graph2Graph Transformer parsers significantly outperform the state of the art in transition-based dependency parsing on both the English Penn Treebank and 11 languages of the Universal Dependencies Treebanks. Graph2Graph Transformer can be integrated with many previous structured prediction methods, making it easy to apply to a wide range of NLP tasks.
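To give a flavor of the graph-input mechanism described above, the sketch below shows one simplified way a Transformer's self-attention can be conditioned on a partial dependency graph: embeddings of the relation labels between token pairs are added as a bias to the attention logits. This is an illustrative assumption, not the paper's exact formulation (the paper uses learned vector relation embeddings inside multi-head attention; here the bias is a single learned scalar per label, and all names such as `graph_conditioned_attention` are hypothetical).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_conditioned_attention(X, Wq, Wk, Wv, rel_ids, rel_bias):
    """Single-head self-attention whose logits are biased by the input
    graph's dependency relations (illustrative simplification).

    X        : (n, d) token representations
    rel_ids  : (n, n) int matrix; rel_ids[i, j] is the label id of the
               dependency relation between tokens i and j (0 = no arc)
    rel_bias : (num_labels,) learned scalar bias per relation label
    """
    n, d = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    logits = Q @ K.T / np.sqrt(d)
    logits = logits + rel_bias[rel_ids]   # graph input enters as an attention bias
    return softmax(logits, axis=-1) @ V

# Toy example: 4 tokens, one labeled arc between tokens 0 and 1.
rng = np.random.default_rng(0)
n, d = 4, 8
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
rel_ids = np.zeros((n, n), dtype=int)
rel_ids[0, 1] = rel_ids[1, 0] = 1     # hypothetical arc label (e.g. a subject relation)
rel_bias = np.array([0.0, 2.0])       # label 1 boosts attention along its arc
out = graph_conditioned_attention(X, Wq, Wk, Wv, rel_ids, rel_bias)
print(out.shape)
```

In a full model this bias would be computed per head from vector-valued relation embeddings, and the graph-output side would symmetrically predict new arcs and labels from the attention states; the sketch only illustrates the conditioning direction.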
