ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.


Tree-Transformer: A Transformer-Based Method for Correction of Tree-Structured Data

1 August 2019
Jacob A. Harer
Christopher P. Reale
Peter Chin
Abstract

Many common sequential data sources, such as source code and natural language, have a natural tree-structured representation. These trees can be generated by fitting a sequence to a grammar, yielding a hierarchical ordering of the tokens in the sequence. This structure encodes a high degree of syntactic information, making it ideal for problems such as grammar correction. However, little work has been done to develop neural networks that can operate on and exploit tree-structured data. In this paper we present the Tree-Transformer, a novel neural network architecture designed to translate between arbitrary input and output trees. We applied this architecture to correction tasks in both the source code and natural language domains. On source code, our model achieved an improvement of 25% F0.5 over the best sequential method. On natural language, we achieved results comparable to the most complex state-of-the-art systems, obtaining a 10% improvement in recall on the CoNLL 2014 benchmark and the highest F0.5 score to date on the AESW benchmark, 50.43.
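As a minimal illustration of the tree-structured representation the abstract describes, the sketch below uses Python's standard `ast` module to fit a source-code snippet to Python's grammar, yielding the hierarchical ordering of its tokens. This is only an example of such a tree, not the parsing pipeline used in the paper.

```python
import ast

# Parse a short source snippet against Python's grammar; the resulting
# AST is a tree-structured representation of the token sequence.
source = "x = a + b * c"
tree = ast.parse(source)

def describe(node, depth=0):
    """Return an indented listing of node types, one line per tree node."""
    lines = ["  " * depth + type(node).__name__]
    for child in ast.iter_child_nodes(node):
        lines.extend(describe(child, depth + 1))
    return lines

listing = "\n".join(describe(tree))
print(listing)
```

The nested `BinOp` nodes in the printed listing show how operator precedence (`*` binding tighter than `+`) is encoded directly in the tree's hierarchy, which is the kind of syntactic information a tree-aware model can exploit.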
