Thermodynamic Bayesian Inference

2 October 2024
Maxwell Aifer
Samuel Duffield
Kaelan Donatella
Denis Melanson
Phoebe Klett
Zach Belateche
Gavin Crooks
Antonio J. Martinez
Patrick J. Coles
arXiv:2410.01793
Abstract

A fully Bayesian treatment of complicated predictive models (such as deep neural networks) would enable rigorous uncertainty quantification and the automation of higher-level tasks including model selection. However, the intractability of sampling Bayesian posteriors over many parameters inhibits the use of Bayesian methods where they are most needed. Thermodynamic computing has emerged as a paradigm for accelerating operations used in machine learning, such as matrix inversion, and is based on the mapping of Langevin equations to the dynamics of noisy physical systems. Hence, it is natural to consider the implementation of Langevin sampling algorithms on thermodynamic devices. In this work we propose electronic analog devices that sample from Bayesian posteriors by realizing Langevin dynamics physically. Circuit designs are given for sampling the posterior of a Gaussian-Gaussian model and for Bayesian logistic regression, and are validated by simulations. It is shown, under reasonable assumptions, that the Bayesian posteriors for these models can be sampled in time scaling with $\ln(d)$, where $d$ is dimension. For the Gaussian-Gaussian model, the energy cost is shown to scale with $d \ln(d)$. These results highlight the potential for fast, energy-efficient Bayesian inference using thermodynamic computing.
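As a rough software analogue of the Langevin sampling described in the abstract, the sketch below runs discrete-time (Euler-Maruyama) Langevin dynamics on the posterior of a small Gaussian-Gaussian model and checks the empirical mean against the closed-form posterior. This is a minimal illustration only, not the paper's analog circuit design: the proposed devices realize the dynamics physically in continuous time rather than through a digital update loop, and every dimension, step size, and variable name below is an illustrative assumption.

```python
import numpy as np

# Illustrative software sketch (not the paper's analog hardware): unadjusted
# Langevin dynamics sampling the posterior of a Gaussian-Gaussian model,
#   prior:      theta ~ N(mu0, Sigma0)
#   likelihood: y | theta ~ N(theta, Sigma_y)
# All dimensions, step sizes, and variable names are illustrative choices.

rng = np.random.default_rng(0)
d = 5                                  # parameter dimension
mu0 = np.zeros(d)                      # prior mean
Sigma0 = np.eye(d)                     # prior covariance
Sigma_y = 0.5 * np.eye(d)              # observation-noise covariance
y = rng.normal(size=d)                 # one observed data vector

# Closed-form Gaussian posterior, used only to check the sampler.
P0, Py = np.linalg.inv(Sigma0), np.linalg.inv(Sigma_y)
Sigma_post = np.linalg.inv(P0 + Py)
mu_post = Sigma_post @ (P0 @ mu0 + Py @ y)

# Overdamped Langevin dynamics: d theta = -grad U(theta) dt + sqrt(2) dW,
# with U(theta) = -log posterior (up to a constant), discretized by Euler-Maruyama.
def grad_U(theta):
    return P0 @ (theta - mu0) + Py @ (theta - y)

dt, n_steps, burn_in = 1e-2, 50_000, 5_000
theta = np.zeros(d)
samples = []
for step in range(n_steps):
    theta = theta - dt * grad_U(theta) + np.sqrt(2 * dt) * rng.normal(size=d)
    if step >= burn_in:
        samples.append(theta)
samples = np.array(samples)

print("posterior mean (analytic):", np.round(mu_post, 3))
print("posterior mean (Langevin):", np.round(samples.mean(axis=0), 3))
```

In this digital simulation the cost grows with the number of update steps and with matrix-vector products per step; the abstract's point is that a noisy physical circuit evolves under the same dynamics natively, which is what underlies the claimed $\ln(d)$ time and $d \ln(d)$ energy scalings for the Gaussian-Gaussian model.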
