arXiv:1706.02761
Gated Orthogonal Recurrent Units: On Learning to Forget
8 June 2017
Li Jing, Çağlar Gülçehre, J. Peurifoy, Yichen Shen, Max Tegmark, Marin Soljacic, Yoshua Bengio
Papers citing "Gated Orthogonal Recurrent Units: On Learning to Forget" (7 of 7 shown)

Compact Recurrent Transformer with Persistent Memory (02 May 2025)
    Edison Mucllari, Z. Daniels, David C. Zhang, Qiang Ye [CLL, VLM]

Recurrent Quantum Neural Networks (25 Jun 2020)
    Johannes Bausch

Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies (22 Jan 2019)
    A. Chandar, Chinnadhurai Sankar, Eugene Vorontsov, Samira Ebrahimi Kahou, Yoshua Bengio

Complex Gated Recurrent Neural Networks (21 Jun 2018)
    Moritz Wolter, Angela Yao [AI4CE]

Orthogonal Recurrent Neural Networks with Scaled Cayley Transform (29 Jul 2017)
    Kyle E. Helfrich, Devin Willmott, Q. Ye

On orthogonality and learning recurrent networks with long term dependencies (31 Jan 2017)
    Eugene Vorontsov, C. Trabelsi, Samuel Kadoury, C. Pal [ODL]

Learning Unitary Operators with Help From u(n) (17 Jul 2016)
    Stephanie L. Hyland, Gunnar Rätsch