A Technical Note on the Architectural Effects on Maximum Dependency
Lengths of Recurrent Neural Networks
Abstract
This work proposes a methodology for determining the maximum dependency length of a recurrent neural network (RNN). It then studies how architectural changes, including the number of layers and the neuron count per layer, affect the maximum dependency lengths of traditional RNN, gated recurrent unit (GRU), and long short-term memory (LSTM) models.
