Full-Capacity Unitary Recurrent Neural Networks

31 October 2016
Scott Wisdom, Thomas Powers, J. Hershey, Jonathan Le Roux, L. Atlas
arXiv:1611.00035 (abs / PDF / HTML)

Papers citing "Full-Capacity Unitary Recurrent Neural Networks"

5 / 105 papers shown

Tunable Efficient Unitary Neural Networks (EUNN) and their application to RNNs
Li Jing, Yichen Shen, T. Dubček, J. Peurifoy, S. Skirlo, Yann LeCun, Max Tegmark, Marin Soljacic
15 Dec 2016

DizzyRNN: Reparameterizing Recurrent Neural Networks for Norm-Preserving Backpropagation
Victor D. Dorobantu, Per Andre Stromhaug, Jess Renteria
13 Dec 2016

Efficient Orthogonal Parametrisation of Recurrent Neural Networks Using Householder Reflections
Zakaria Mhammedi, Andrew D. Hellicar, Ashfaqur Rahman, James Bailey
01 Dec 2016

Improving Variational Auto-Encoders using Householder Flow
Jakub M. Tomczak, Max Welling
29 Nov 2016

Interpretable Recurrent Neural Networks Using Sequential Sparse Recovery
Scott Wisdom, Thomas Powers, J. Pitton, L. Atlas
22 Nov 2016