arXiv: 2502.00112
SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method


Journal of Research of the National Institute of Standards and Technology (J. Res. Natl. Inst. Stand. Technol.), 2015
Posted to arXiv: 31 January 2025
Javier Bernal, Jose Torres-Jimenez
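The title pairs simulated annealing (a stochastic global search over the weights) with the conjugate gradient method (fast local refinement). The sketch below is a hedged illustration of that two-phase idea in Python; it is not SAGRAD itself (which is a Fortran program), and the network size, cooling schedule, and line-search constants are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: fit y = sin(x) with a one-hidden-layer tanh network.
X = np.linspace(-np.pi, np.pi, 40)
Y = np.sin(X)

H = 5  # hidden units; parameters pack as [w1(H), b1(H), w2(H), b2]

def unpack(p):
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def loss(p):
    w1, b1, w2, b2 = unpack(p)
    hidden = np.tanh(np.outer(X, w1) + b1)   # shape (40, H)
    pred = hidden @ w2 + b2
    return np.mean((pred - Y) ** 2)

def num_grad(p, eps=1e-6):
    # Central-difference gradient; fine for a 16-parameter toy problem.
    g = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p)
        d[i] = eps
        g[i] = (loss(p + d) - loss(p - d)) / (2 * eps)
    return g

# Phase 1: simulated annealing for a coarse global search.
p = rng.normal(size=3 * H + 1)
best, best_f = p.copy(), loss(p)
T = 1.0
for _ in range(2000):
    cand = p + rng.normal(scale=0.1, size=p.size)
    df = loss(cand) - loss(p)
    if df < 0 or rng.random() < np.exp(-df / T):  # Metropolis acceptance
        p = cand
        f = loss(p)
        if f < best_f:
            best, best_f = p.copy(), f
    T *= 0.998  # geometric cooling schedule

# Phase 2: Fletcher-Reeves conjugate gradient to polish the SA result.
p = best.copy()
g = num_grad(p)
d = -g
for _ in range(200):
    if g @ d >= 0:          # restart if d is not a descent direction
        d = -g
    alpha, f0 = 1.0, loss(p)
    # Backtracking (Armijo) line search along d.
    while loss(p + alpha * d) > f0 + 1e-4 * alpha * (g @ d) and alpha > 1e-10:
        alpha *= 0.5
    p = p + alpha * d
    g_new = num_grad(p)
    beta = (g_new @ g_new) / max(g @ g, 1e-30)  # Fletcher-Reeves update
    d = -g_new + beta * d
    g = g_new

final_f = loss(p)
print(f"SA loss: {best_f:.4f}  ->  after CG: {final_f:.4f}")
```

The design rationale for the hybrid: annealing is slow but can escape poor basins of attraction, while conjugate gradient converges quickly only once it starts near a good minimum, so running them in sequence combines their strengths.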

Papers citing "SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method"

2 citing papers:
RoseNNa: A performant, portable library for neural network inference with application to computational fluid dynamics
Computer Physics Communications (CPC), 2023
Ajay Bati, S. Bryngelson
30 Jul 2023
A Fortran-Keras Deep Learning Bridge for Scientific Computing
Scientific Programming (SP), 2020
J. Ott, M. Pritchard, Natalie Best, Erik J. Linstead, M. Curcic, Pierre Baldi
14 Apr 2020