Randomised Postiterations for Calibrated BayesCG

5 April 2025
Niall Vyas, Disha Hegde, Jon Cockayne
Abstract

The Bayesian conjugate gradient method offers probabilistic solutions to linear systems but suffers from poor calibration, limiting its utility in uncertainty quantification tasks. Recent approaches leveraging postiterations to construct priors have improved computational properties but failed to correct calibration issues. In this work, we propose a novel randomised postiteration strategy that enhances the calibration of the BayesCG posterior while preserving its favourable convergence characteristics. We present theoretical guarantees for the improved calibration, supported by results on the distribution of posterior errors. Numerical experiments demonstrate the efficacy of the method in both synthetic and inverse problem settings, showing enhanced uncertainty quantification and better propagation of uncertainties through computational pipelines.
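
As context for the method being calibrated, the following is a minimal Python/NumPy sketch of the baseline BayesCG recursion of Cockayne et al. (2019): from a Gaussian prior N(x0, Sigma0) on the solution of Ax = b, m iterations produce a Gaussian posterior N(x_m, Sigma_m). This is a sketch of the baseline only, not of the randomised postiteration scheme proposed in this paper; the function name and test problem are illustrative.

import numpy as np

def bayescg(A, b, x0, Sigma0, m):
    # Baseline BayesCG: return the posterior mean x_m and covariance
    # Sigma_m over the solution of A x = b after m iterations, starting
    # from the prior N(x0, Sigma0). Reduces to the classical CG iterates
    # when Sigma0 = inv(A).
    x = np.asarray(x0, dtype=float).copy()
    Sigma = np.asarray(Sigma0, dtype=float).copy()
    r = b - A @ x                            # initial residual
    s = r.copy()                             # initial search direction
    for _ in range(m):
        d = Sigma0 @ (A.T @ s)               # Sigma0 A^T s_j
        E2 = s @ (A @ d)                     # s_j^T A Sigma0 A^T s_j
        alpha = (r @ r) / E2
        x = x + alpha * d                    # posterior mean update
        Sigma = Sigma - np.outer(d, d) / E2  # rank-one covariance downdate
        r_new = r - alpha * (A @ d)          # updated residual
        beta = (r_new @ r_new) / (r @ r)     # CG-style conjugation coefficient
        s = r_new + beta * s                 # next search direction
        r = r_new
    return x, Sigma

# Illustrative usage on a small synthetic SPD system:
rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                  # symmetric positive definite
x_true = rng.standard_normal(n)
b = A @ x_true
x_m, Sigma_m = bayescg(A, b, np.zeros(n), np.eye(n), m=10)

Calibration asks whether Sigma_m actually describes the distribution of the error x_true - x_m; for common prior choices it does not, which is the failure mode the randomised postiterations are designed to correct while preserving the CG-like convergence of the posterior mean.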

@article{vyas2025_2504.04247,
  title={Randomised Postiterations for Calibrated BayesCG},
  author={Niall Vyas and Disha Hegde and Jon Cockayne},
  journal={arXiv preprint arXiv:2504.04247},
  year={2025}
}