Unlocking State-Tracking in Linear RNNs Through Negative Eigenvalues
arXiv:2411.12537 · 19 November 2024
Riccardo Grazzi, Julien N. Siems, Jörg K.H. Franke, Arber Zela, Frank Hutter, Massimiliano Pontil

Papers citing "Unlocking State-Tracking in Linear RNNs Through Negative Eigenvalues" (6 papers)

It's All Connected: A Journey Through Test-Time Memorization, Attentional Bias, Retention, and Online Optimization
Ali Behrouz, Meisam Razaviyayn, Peilin Zhong, Vahab Mirrokni (17 Apr 2025)

ParallelFlow: Parallelizing Linear Transformers via Flow Discretization
Nicola Muca Cirone, C. Salvi (01 Apr 2025)

Fixed-Point RNNs: From Diagonal to Dense in a Few Iterations
Sajad Movahedi, Felix Sarnthein, Nicola Muca Cirone, Antonio Orvieto (13 Mar 2025)

BlackGoose Rimer: Harnessing RWKV-7 as a Simple yet Superior Replacement for Transformers in Large-Scale Time Series Modeling
Li weile, Liu Xiao (08 Mar 2025)

Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers
Alireza Amiri, Xinting Huang, Mark Rofin, Michael Hahn (04 Feb 2025)

ARWKV: Pretrain is not what we need, an RNN-Attention-Based Language Model Born from Transformer
Lin Yueyu, Li Zhiyuan, Peter Yue, Liu Xiao (28 Jan 2025)