On the Ability and Limitations of Transformers to Recognize Formal Languages

Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
23 September 2020
S. Bhattamishra
Kabir Ahuja
Navin Goyal
ArXiv (abs) · PDF · HTML

Papers citing "On the Ability and Limitations of Transformers to Recognize Formal Languages"

On the Capacity of Self-Attention
Micah Adler
26 Sep 2025
Stability Analysis of Various Symbolic Rule Extraction Methods from Recurrent Neural Network
Neisarg Dave
Daniel Kifer
C. Lee Giles
A. Mali
04 Feb 2024
On the Expressive Power of Self-Attention Matrices
Valerii Likhosherstov
K. Choromanski
Adrian Weller
07 Jun 2021
Self-Attention Networks Can Process Bounded Hierarchical Languages
Annual Meeting of the Association for Computational Linguistics (ACL), 2021
Shunyu Yao
Binghui Peng
Christos H. Papadimitriou
Karthik Narasimhan
24 May 2021
Formal Language Theory Meets Modern NLP
William Merrill
19 Feb 2021