A Statistical Framework for Model Selection in LSTM Networks

7 June 2025
Fahad Mostafa
arXiv:2506.06840 (abs / PDF / HTML)
Main text: 18 pages, 6 figures, 6 tables; bibliography: 3 pages
Abstract

Long Short-Term Memory (LSTM) neural network models have become a cornerstone of sequential data modeling in numerous applications, ranging from natural language processing to time series forecasting. Despite their success, the problem of model selection, including hyperparameter tuning, architecture specification, and the choice of regularization, remains largely heuristic and computationally expensive. In this paper, we propose a unified statistical framework for systematic model selection in LSTM networks. Our framework extends classical model selection ideas, such as information criteria and shrinkage estimation, to sequential neural networks. We define penalized likelihoods adapted to temporal structure, propose a generalized thresholding approach for hidden state dynamics, and provide efficient estimation strategies using variational Bayes and approximate marginal likelihood methods. Several biomedical data-centric examples demonstrate the flexibility and improved performance of the proposed framework.
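
To make the information-criterion flavor of penalized-likelihood selection concrete, the sketch below (not from the paper, and not the authors' implementation) scores candidate LSTM hidden sizes with a BIC-style penalty on a synthetic series. The Gaussian observation model, the synthetic data, and all names in the code are assumptions made purely for illustration.

# Illustrative sketch only: BIC-style model selection over LSTM hidden sizes.
# Assumptions: Gaussian observation model with MLE variance, synthetic sine data,
# and a log(N)-per-parameter penalty. Candidates differ only in hidden size.

import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic univariate sequence data: N windows of length T, one-step-ahead targets.
N, T = 200, 30
t = torch.linspace(0, 8 * math.pi, N + T)
series = torch.sin(t) + 0.1 * torch.randn(N + T)
X = torch.stack([series[i:i + T] for i in range(N)]).unsqueeze(-1)  # (N, T, 1)
y = series[T:T + N].unsqueeze(-1)                                   # (N, 1)

class LSTMRegressor(nn.Module):
    def __init__(self, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        out, _ = self.lstm(x)          # (N, T, H)
        return self.head(out[:, -1])   # one-step-ahead prediction

def gaussian_nll(residuals):
    """Negative log-likelihood under a Gaussian with MLE variance."""
    n = residuals.numel()
    sigma2 = residuals.pow(2).mean().clamp_min(1e-8)
    return 0.5 * n * (math.log(2 * math.pi) + torch.log(sigma2) + 1.0)

def fit_and_score(hidden_size, epochs=300, lr=1e-2):
    model = LSTMRegressor(hidden_size)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()
    with torch.no_grad():
        nll = gaussian_nll(model(X) - y)
    k = sum(p.numel() for p in model.parameters())
    return 2 * nll.item() + k * math.log(N)  # BIC-style penalized likelihood

candidates = [2, 4, 8, 16, 32]
scores = {h: fit_and_score(h) for h in candidates}
best = min(scores, key=scores.get)
print("BIC-like scores:", {h: round(s, 1) for h, s in scores.items()})
print("Selected hidden size:", best)

The paper's framework goes well beyond this (temporal penalized likelihoods, thresholded hidden-state dynamics, variational Bayes estimation); the sketch only shows the basic trade-off an information criterion encodes: fit improves with hidden size while the parameter penalty grows.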

@article{mostafa2025_2506.06840,
  title={A Statistical Framework for Model Selection in LSTM Networks},
  author={Fahad Mostafa},
  journal={arXiv preprint arXiv:2506.06840},
  year={2025}
}