Gated Orthogonal Recurrent Units: On Learning to Forget


8 June 2017
Li Jing, Çağlar Gülçehre, J. Peurifoy, Yichen Shen, Max Tegmark, Marin Soljacic, Yoshua Bengio

Papers citing "Gated Orthogonal Recurrent Units: On Learning to Forget"

5 papers shown

Compact Recurrent Transformer with Persistent Memory
  Edison Mucllari, Z. Daniels, David C. Zhang, Qiang Ye (02 May 2025)

Recurrent Quantum Neural Networks
  Johannes Bausch (25 Jun 2020)

Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies
  A. Chandar, Chinnadhurai Sankar, Eugene Vorontsov, Samira Ebrahimi Kahou, Yoshua Bengio (22 Jan 2019)

Complex Gated Recurrent Neural Networks
  Moritz Wolter, Angela Yao (21 Jun 2018)

Learning Unitary Operators with Help From u(n)
  Stephanie L. Hyland, Gunnar Rätsch (17 Jul 2016)