Structural Deep Encoding for Table Question Answering

3 March 2025
Raphaël Mouravieff
Benjamin Piwowarski
Sylvain Lamprier
Abstract

Although Transformer-based architectures excel at processing textual information, their naive adaptation to tabular data often involves flattening the table structure. This simplification can lead to the loss of essential inter-dependencies between rows, columns, and cells, while also posing scalability challenges for large tables. To address these issues, prior works have explored special tokens, structured embeddings, and sparse attention patterns. In this paper, we conduct a comprehensive analysis of tabular encoding techniques, which highlights the crucial role of attention sparsity in preserving the structural information of tables. We also introduce a set of novel sparse attention mask designs for tabular data that not only enhance computational efficiency but also preserve structural integrity, leading to better overall performance.
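The abstract does not spell out the paper's specific mask designs, but the general idea of structure-aware sparse attention over a flattened table can be sketched. Below is a minimal, hypothetical PyTorch example, not the authors' method: the function name, token layout (question tokens followed by row-major cell tokens), and the row/column attention rule are all assumptions chosen for illustration. Each table token attends only to tokens sharing its row or column, while question tokens act as global tokens.

```python
import torch

def build_table_attention_mask(row_ids, col_ids, num_question_tokens):
    """Boolean attention mask over a sequence of question tokens
    followed by flattened table-cell tokens.

    row_ids, col_ids: 1D tensors giving each table token's row and
    column index (hypothetical layout, not taken from the paper).
    Returns a (n, n) mask where True means "may attend".
    """
    n_table = row_ids.shape[0]
    n = num_question_tokens + n_table
    mask = torch.zeros(n, n, dtype=torch.bool)

    # Question tokens are global: they attend to everything,
    # and every token attends back to them.
    mask[:num_question_tokens, :] = True
    mask[:, :num_question_tokens] = True

    # A table token attends only to tokens in the same row or column,
    # instead of to the entire flattened sequence.
    same_row = row_ids.unsqueeze(0) == row_ids.unsqueeze(1)
    same_col = col_ids.unsqueeze(0) == col_ids.unsqueeze(1)
    mask[num_question_tokens:, num_question_tokens:] = same_row | same_col
    return mask

# Example: a 2x3 table flattened row-major, preceded by 4 question tokens.
rows = torch.tensor([0, 0, 0, 1, 1, 1])
cols = torch.tensor([0, 1, 2, 0, 1, 2])
mask = build_table_attention_mask(rows, cols, num_question_tokens=4)
print(mask.int())
```

Under this kind of mask, the number of attended positions per cell token grows with the row and column lengths rather than with the full table size, which is what makes such patterns attractive for large tables.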

@article{mouravieff2025_2503.01457,
  title={Structural Deep Encoding for Table Question Answering},
  author={Raphaël Mouravieff and Benjamin Piwowarski and Sylvain Lamprier},
  journal={arXiv preprint arXiv:2503.01457},
  year={2025}
}