On Efficiently Representing Regular Languages as RNNs

24 February 2024
Anej Svete
R. Chan
Ryan Cotterell
arXiv: 2402.15814 (PDF / HTML available)

Papers citing "On Efficiently Representing Regular Languages as RNNs"

3 of 3 citing papers shown.

RecurrentGPT: Interactive Generation of (Arbitrarily) Long Text
Wangchunshu Zhou, Yuchen Eleanor Jiang, Peng Cui, Tiannan Wang, Zhenxin Xiao, Yifan Hou, Ryan Cotterell, Mrinmaya Sachan
Topics: RALM, LLMAG
22 May 2023

The Surprising Computational Power of Nondeterministic Stack RNNs
Brian DuSell, David Chiang
Topics: LRM
04 Oct 2022

Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
Topics: LM&MA, VLM
18 Mar 2020