
Gradient-Free Training of Recurrent Neural Networks using Random Perturbations
Jesus Garcia Fernandez, Sander Keemink, Marcel van Gerven
14 May 2024
AAML
arXiv: 2405.08967

Papers citing "Gradient-Free Training of Recurrent Neural Networks using Random Perturbations"

Showing 3 of 3 citing papers.
Efficient Deep Learning with Decorrelated Backpropagation
Sander Dalm, Joshua Offergeld, Nasir Ahmad, Marcel van Gerven
03 May 2024
Resurrecting Recurrent Neural Networks for Long Sequences
Antonio Orvieto, Samuel L. Smith, Albert Gu, Anushan Fernando, Çağlar Gülçehre, Razvan Pascanu, Soham De
11 Mar 2023
Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
Haoyi Zhou, Shanghang Zhang, J. Peng, Shuai Zhang, Jianxin Li, Hui Xiong, Wan Zhang
AI4TS
14 Dec 2020