ResearchTrend.AI


T-T: Table Transformer for Tagging-based Aspect Sentiment Triplet Extraction

8 May 2025
Kun Peng
Chaodong Tong
Cong Cao
Hao Peng
Qian Li
Guanlin Wu
Lei Jiang
Yanbing Liu
Philip S. Yu
Abstract

Aspect sentiment triplet extraction (ASTE) aims to extract triplets of aspect terms, opinion terms, and sentiment polarities from given sentences. A popular approach is the table tagging method, which encodes a sentence into a 2-dimensional table so that the relation between any two words can be tagged. Previous efforts have focused on designing various downstream relation learning modules to better capture interactions between tokens in the table, showing that a stronger capability to capture relations yields greater model improvements. Motivated by this, we attempt to use transformer layers directly as downstream relation learning modules. Given the powerful semantic modeling capability of transformers, this substitution promises substantial gains. However, because the table size grows quadratically with the input sentence length, using transformers directly faces two challenges: overly long table sequences and unfair local attention interaction. To address these challenges, we propose a novel Table-Transformer (T-T) for tagging-based ASTE. Specifically, we introduce a stripe attention mechanism with a loop-shift strategy: the former restricts the global attention mechanism to a 2-dimensional local attention window, while the latter facilitates interaction between different attention windows. Extensive and comprehensive experiments demonstrate that T-T, as a downstream relation learning module, achieves state-of-the-art performance with lower computational costs.
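As a minimal illustration of the two ideas named in the abstract (not the authors' implementation), the sketch below restricts attention in an L×L table of token-pair representations to non-overlapping w×w windows ("stripe attention"), and cyclically rolls the table between layers so that cells on window boundaries meet in a shared window next layer ("loop-shift"). The function names, the window size, and the omission of learned query/key/value projections are all simplifying assumptions of this sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def stripe_attention(table, window=4):
    """Local self-attention over a 2-D table: each cell attends only to the
    cells inside its own (window x window) block, so the cost per block is
    (w*w)^2 instead of (L*L)^2 over the whole table.
    table: array of shape (L, L, d); learned projections omitted for brevity."""
    L, _, d = table.shape
    w = window
    assert L % w == 0, "toy sketch: table side must be divisible by window"
    # Partition the table into non-overlapping (w x w) blocks.
    blocks = table.reshape(L // w, w, L // w, w, d).transpose(0, 2, 1, 3, 4)
    blocks = blocks.reshape(-1, w * w, d)            # (num_blocks, w*w, d)
    # Scaled dot-product attention within each block.
    scores = blocks @ blocks.transpose(0, 2, 1) / np.sqrt(d)
    out = softmax(scores) @ blocks                   # (num_blocks, w*w, d)
    # Reassemble the (L, L, d) table from the blocks.
    out = out.reshape(L // w, L // w, w, w, d).transpose(0, 2, 1, 3, 4)
    return out.reshape(L, L, d)

def loop_shift(table, window=4):
    """Cyclically roll the table by half a window along both axes, so that
    cells on block boundaries share a window in the next attention layer."""
    s = window // 2
    return np.roll(table, shift=(s, s), axis=(0, 1))
```

A stack of such layers would alternate `stripe_attention` and `loop_shift`, giving every cell an indirect path to distant windows while keeping each attention call local.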

@article{peng2025_2505.05271,
  title={T-T: Table Transformer for Tagging-based Aspect Sentiment Triplet Extraction},
  author={Kun Peng and Chaodong Tong and Cong Cao and Hao Peng and Qian Li and Guanlin Wu and Lei Jiang and Yanbing Liu and Philip S. Yu},
  journal={arXiv preprint arXiv:2505.05271},
  year={2025}
}