Gated Recurrent Neural Networks with Weighted Time-Delay Feedback

International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Main: 9 pages · Appendix: 14 pages · Bibliography: 4 pages · 14 figures · 14 tables
Abstract

In this paper, we present a novel approach to modeling long-term dependencies in sequential data by introducing a gated recurrent unit (GRU) with a weighted time-delay feedback mechanism. Our proposed model, named τ-GRU, is a discretized version of a continuous-time formulation of a recurrent unit, where the dynamics are governed by delay differential equations (DDEs). We prove the existence and uniqueness of solutions for the continuous-time model and show that the proposed feedback mechanism can significantly improve the modeling of long-term dependencies. Our empirical results indicate that τ-GRU outperforms state-of-the-art recurrent units and gated recurrent architectures on a range of tasks, achieving faster convergence and better generalization.
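To make the idea concrete, here is a minimal sketch of what a GRU-style cell with a weighted time-delay feedback term might look like. This is an illustration only, not the paper's actual τ-GRU equations: the names (`tau_gru_step`, `run`), the weight matrix `V`, and the scalar feedback weight `alpha` are assumptions, and the paper's DDE-derived discretization will differ in detail. The sketch simply adds a weighted contribution from the hidden state τ steps in the past to a standard GRU candidate update.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tau_gru_step(x_t, h_prev, h_delayed, params, alpha=0.1):
    """One step of a GRU-style cell with a weighted time-delay feedback
    term. Hypothetical sketch: alpha and V weight the delayed state
    h_{t-tau} inside the candidate update; the paper's actual update
    rule is derived from a delay differential equation."""
    Wz, Uz, Wr, Ur, Wh, Uh, V = params
    z = sigmoid(Wz @ x_t + Uz @ h_prev)   # update gate
    r = sigmoid(Wr @ x_t + Ur @ h_prev)   # reset gate
    # candidate state with a weighted delayed-feedback term
    h_tilde = np.tanh(Wh @ x_t + Uh @ (r * h_prev) + alpha * (V @ h_delayed))
    return (1 - z) * h_prev + z * h_tilde

def run(xs, hidden_dim, tau, params):
    """Run the cell over a sequence, feeding back the hidden state
    from tau steps earlier (zero state before enough history exists)."""
    history = [np.zeros(hidden_dim)]
    for x_t in xs:
        h_delayed = history[-tau] if len(history) >= tau else history[0]
        history.append(tau_gru_step(x_t, history[-1], h_delayed, params))
    return history[-1]
```

In a full implementation the parameters would be learned by backpropagation through time; the delayed-state path gives gradients a shortcut across τ steps, which is the intuition behind the improved long-term dependency modeling claimed in the abstract.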
