Pushdown Layers: Encoding Recursive Structure in Transformer Language Models
arXiv:2310.19089 · 29 October 2023
Shikhar Murty, Pratyusha Sharma, Jacob Andreas, Christopher D. Manning

Papers citing "Pushdown Layers: Encoding Recursive Structure in Transformer Language Models" (4 of 4 shown):
  1. Banyan: Improved Representation Learning with Explicit Structure
     Mattia Opper, N. Siddharth
     25 Jul 2024

  2. Inducing Systematicity in Transformers by Attending to Structurally Quantized Embeddings
     Yichen Jiang, Xiang Zhou, Mohit Bansal
     09 Feb 2024

  3. Neural Networks and the Chomsky Hierarchy
     Grégoire Delétang, Anian Ruoss, Jordi Grau-Moya, Tim Genewein, L. Wenliang, ..., Chris Cundy, Marcus Hutter, Shane Legg, Joel Veness, Pedro A. Ortega
     05 Jul 2022

  4. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
     Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
     20 Apr 2018