Were RNNs All We Needed?

2 October 2024
Leo Feng, Frederick Tung, Mohamed Osama Ahmed, Yoshua Bengio, Hossein Hajimirsadegh
AI4TS

Papers citing "Were RNNs All We Needed?"

7 / 7 papers shown

Bidirectional Linear Recurrent Models for Sequence-Level Multisource Fusion
Qisai Liu, Zhanhong Jiang, Joshua R. Waite, Chao Liu, Aditya Balu, S. Sarkar
AI4TS · 11 Apr 2025

Exploring Performance-Complexity Trade-Offs in Sound Event Detection
T. Morocutti, Florian Schmid, Jonathan Greif, Francesco Foscarin, Gerhard Widmer
14 Mar 2025

Transformers without Normalization
Jiachen Zhu, Xinlei Chen, Kaiming He, Yann LeCun, Zhuang Liu
ViT, OffRL · 13 Mar 2025

MinGRU-Based Encoder for Turbo Autoencoder Frameworks
Rick Fritschek, Rafael F. Schaefer
11 Mar 2025

From Small to Large Language Models: Revisiting the Federalist Papers
So Won Jeong, Veronika Rockova
25 Feb 2025

GraphMinNet: Learning Dependencies in Graphs with Light Complexity Minimal Architecture
Md. Atik Ahamed, Andrew Cheng, Q. Ye, Q. Cheng
GNN · 01 Feb 2025

Towards Scalable and Stable Parallelization of Nonlinear RNNs
Xavier Gonzalez, Andrew Warrington, Jimmy T.H. Smith, Scott W. Linderman
17 Jan 2025