A correspondence between thermodynamics and inference

5 June 2017
Colin H. LaMont
Paul A. Wiggins
arXiv: 1706.01428v5 (abs / PDF / HTML)
Abstract

We systematically explore a natural analogy between Bayesian statistics and thermal physics in which sample size corresponds to inverse temperature. We discover that some canonical thermodynamic quantities already correspond to well-established statistical quantities. Motivated by physical insight into thermal physics, we define two novel statistical quantities: a learning capacity and a Gibbs entropy. The definition of the learning capacity leads to a critical insight: the well-known mechanism of failure of the equipartition theorem in statistical mechanics is the mechanism for anomalously predictive or sloppy models in statistics. This correspondence between the learning and heat capacities provides new insight into the mechanism of machine learning. The correspondence also suggests a solution to a long-standing difficulty in Bayesian statistics: the definition of an objective prior. We propose that the Gibbs entropy provides a natural generalization of the principle of indifference that defines objectivity. This approach unifies the disparate Bayesian, frequentist and information-based paradigms of statistics by achieving coherent inference between these competing formulations.
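
To make the dictionary concrete, the following is a minimal sketch of the correspondence for N i.i.d. observations x_1, ..., x_N under a model q(x|θ) with prior π(θ). The notation (Ĥ, Z, U, C) is a standard statistical-mechanics gloss assumed here for illustration; it may differ in detail from the paper's own conventions.

    % Energy: the empirical cross entropy per observation
    \[ \hat{H}(\theta) \equiv -\tfrac{1}{N}\textstyle\sum_{i=1}^{N} \log q(x_i \mid \theta) \]
    % Partition function: the Bayesian evidence, with beta <-> N
    \[ Z(N) = \int \mathrm{d}\theta \, \pi(\theta) \, e^{-N \hat{H}(\theta)} \]
    % Average energy: minus the N-derivative of the log evidence
    \[ U(N) = -\partial_N \log Z(N) \]
    % Learning capacity: the analog of heat capacity, with T <-> 1/N
    \[ C(N) \equiv \frac{\partial U}{\partial N^{-1}} = -N^{2} \, \partial_N U \]

Under a Laplace approximation for a regular K-parameter model, log Z(N) ≈ -N Ĥ(θ*) + (K/2) log(2π/N) + const, which gives U ≈ Ĥ(θ*) + K/(2N) and hence C → K/2: the equipartition-style count of one half per parameter. Sloppy or singular models break this count, which is the failure mode the abstract refers to.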
