Well-posed Bayesian Inverse Problems with Infinitely-Divisible and Heavy-Tailed Prior Measures

23 September 2016
Bamdad Hosseini
arXiv:1609.07532
Abstract

We present a new class of prior measures, based on the generalized Gamma distribution, in connection with $\ell_p$ regularization techniques for $p \in (0,1)$. We show that the resulting prior measure is heavy-tailed, non-convex and infinitely divisible. Motivated by this observation, we discuss the class of infinitely divisible prior measures and draw a connection between their tail behavior and the tail behavior of their Lévy measures. Next, we use the laws of pure-jump Lévy processes to define new classes of prior measures that are concentrated on the space of functions of bounded variation. These priors serve as an alternative to the classic total variation prior and result in well-defined inverse problems. We then study the well-posedness of Bayesian inverse problems in a setting general enough to encompass the above classes of prior measures. We establish that well-posedness relies on a balance between the growth of the log-likelihood function and the tail behavior of the prior, and we apply our results to special cases such as additive noise models and linear problems. Finally, we discuss some practical aspects of Bayesian inverse problems, such as their consistent approximation, and present three concrete examples of well-posed Bayesian inverse problems with heavy-tailed or stochastic process prior measures.
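The abstract's opening claim can be made concrete with a minimal sketch, assuming one common parametrization of a symmetrized generalized Gamma density; the parameter names $\rho$, $\sigma$ and the exact normalization here are illustrative assumptions and may differ from the paper's.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A symmetrized generalized Gamma density on the real line
% (illustrative parametrization; the paper's may differ):
\[
  \pi(x) \propto |x|^{\rho-1}
  \exp\!\left(-\left|\frac{x}{\sigma}\right|^{p}\right),
  \qquad p \in (0,1),\; \rho > 0,\; \sigma > 0.
\]
% Up to an additive constant, its negative log-density is
\[
  -\log \pi(x) = \left|\frac{x}{\sigma}\right|^{p}
  + (1-\rho)\log|x| + C,
\]
% so MAP estimation under this prior applies an $\ell_p$-type
% penalty with $p \in (0,1)$: non-convex, and with tails heavier
% than any exponential, since $e^{-|x|^{p}}$ decays
% subexponentially for $p < 1$.
\end{document}
```

In this parametrization the non-convexity and heavy tails are visible directly from the formula; infinite divisibility is a distributional property established in the paper, not something the density alone makes apparent.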
