ResearchTrend.AI

Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians

Rainer Engelken
28 December 2023 (arXiv:2312.17306)

Papers citing "Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians" (5 papers):
Neural ODE Transformers: Analyzing Internal Dynamics and Adaptive Fine-tuning
Anh Tong, Thanh Nguyen-Tang, Dongeun Lee, Duc Nguyen, Toan M. Tran, David Hall, Cheongwoong Kang, Jaesik Choi
03 Mar 2025
A scalable generative model for dynamical system reconstruction from neuroimaging data
Eric Volkmann, Alena Brändle, Daniel Durstewitz, G. Koppe
05 Nov 2024
Resurrecting Recurrent Neural Networks for Long Sequences
Antonio Orvieto, Samuel L. Smith, Albert Gu, Anushan Fernando, Çağlar Gülçehre, Razvan Pascanu, Soham De
11 Mar 2023
Input correlations impede suppression of chaos and learning in balanced rate networks
Rainer Engelken, Alessandro Ingrosso, Ramin Khajeh, Sven Goedeke, L. F. Abbott
24 Jan 2022
On the difficulty of learning chaotic dynamics with RNNs
Jonas M. Mikhaeil, Zahra Monfared, Daniel Durstewitz
14 Oct 2021