
Reservoir Computing meets Recurrent Kernels and Structured Transforms
arXiv: 2006.07310 (v2, latest)

Neural Information Processing Systems (NeurIPS), 2020
12 June 2020
Jonathan Dong
Ruben Ohana
M. Rafayelyan
Florent Krzakala
    TPM

Papers citing "Reservoir Computing meets Recurrent Kernels and Structured Transforms"

9 / 9 papers shown
Revisiting Deep Information Propagation: Fractal Frontier and Finite-size Effects
Giuseppe Alessio D'Inverno
Zhiyuan Hu
Leo Davy
M. Unser
G. Rozza
Jonathan Dong
GNN
109
1
0
05 Aug 2025
Thermodynamic limit in learning period three
Yuichiro Terasaki
Kohei Nakajima
546
4
0
12 May 2024
Neural signature kernels as infinite-width-depth-limits of controlled ResNets
International Conference on Machine Learning (ICML), 2023
Nicola Muca Cirone
M. Lemercier
C. Salvi
343
33
0
30 Mar 2023
Asymptotic Stability in Reservoir Computing
IEEE International Joint Conference on Neural Networks (IJCNN), 2022
Jonathan Dong
Erik Börve
M. Rafayelyan
M. Unser
146
8
0
07 Jun 2022
Beyond accuracy: generalization properties of bio-plausible temporal credit assignment rules
Neural Information Processing Systems (NeurIPS), 2022
Yuhan Helena Liu
Arna Ghosh
Blake A. Richards
E. Shea-Brown
Guillaume Lajoie
521
10
0
02 Jun 2022
Is the Number of Trainable Parameters All That Actually Matters?
A. Chatelain
Amine Djeghri
Daniel Hesslow
Julien Launay
Iacopo Poli
167
7
0
24 Sep 2021
Unsupervised Reservoir Computing for Solving Ordinary Differential Equations
M. Mattheakis
H. Joy
P. Protopapas
199
13
0
25 Aug 2021
A Framework for Machine Learning of Model Error in Dynamical Systems
Communications of the American Mathematical Society (Comm. Amer. Math. Soc.), 2021
Matthew E. Levine
Andrew M. Stuart
326
76
0
14 Jul 2021
SpaRCe: Improved Learning of Reservoir Computing Systems through Sparse Representations
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2019
Luca Manneschi
Andrew C. Lin
Eleni Vasilaki
348
26
0
04 Dec 2019