SigGate: Enhancing Recurrent Neural Networks with Signature-Based Gating Mechanisms

13 February 2025
Rémi Genet
Hugo Inzirillo
Abstract

In this paper, we propose a novel approach that enhances recurrent neural networks (RNNs) by incorporating path signatures into their gating mechanisms. Our method modifies both Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) architectures by replacing their forget and reset gates, respectively, with learnable path signatures. These signatures, which capture the geometric features of the entire path history, provide a richer context for controlling information flow through the network's memory. This modification allows the networks to make memory decisions based on the full historical context rather than just the current input and state. Through experimental studies, we demonstrate that our Signature-LSTM (SigLSTM) and Signature-GRU (SigGRU) models outperform their traditional counterparts across various sequential learning tasks. By leveraging path signatures in recurrent architectures, this method offers new opportunities to enhance performance in time series analysis and forecasting applications.

@article{genet2025_2502.09318,
  title={SigGate: Enhancing Recurrent Neural Networks with Signature-Based Gating Mechanisms},
  author={Rémi Genet and Hugo Inzirillo},
  journal={arXiv preprint arXiv:2502.09318},
  year={2025}
}