Bayesian interpretation of SGD as Ito process

20 November 2019
Soma Yokoi, Issei Sato
arXiv: 1911.09011 (abs / PDF / HTML)

Papers citing "Bayesian interpretation of SGD as Ito process"

4 / 4 papers shown

  1. Do Parameters Reveal More than Loss for Membership Inference?
     Anshuman Suri, Xiao Zhang, David Evans
     Topics: MIACV, MIALM, AAML
     17 Jun 2024

  2. Variational Stochastic Gradient Descent for Deep Neural Networks
     Haotian Chen, Anna Kuzina, Babak Esmaeili, Jakub M. Tomczak
     09 Apr 2024

  3. Hamiltonian Monte Carlo Particle Swarm Optimizer
     Omatharv Bharat Vaidya, Rithvik Terence DSouza, Snehanshu Saha, S. Dhavala, Swagatam Das
     08 May 2022

  4. AdaSwarm: Augmenting Gradient-Based Optimizers in Deep Learning with Swarm Intelligence
     Rohan Mohapatra, Snehanshu Saha, C. Coello, Anwesh Bhattacharya, S. Dhavala, S. Saha
     Topics: ODL
     19 May 2020