ResearchTrend.AI
Laplace Power-expected-posterior priors for generalized linear models with applications to logistic regression

5 December 2021
Anupreet Porwal
Abel Rodríguez
Abstract

Power-expected-posterior (PEP) methodology, which borrows ideas from the literature on power priors, expected-posterior priors and unit information priors, provides a systematic way to construct objective priors. The basic idea is to use imaginary training samples to update a noninformative prior into a minimally informative prior. In this work, we develop a novel definition of PEP priors for generalized linear models that relies on a Laplace expansion of the likelihood of the imaginary training sample. This approach has various computational, practical and theoretical advantages over previous proposals for non-informative priors for generalized linear models. We place a special emphasis on logistic regression models, where sample separation presents particular challenges to alternative methodologies. We investigate both asymptotic and finite-sample properties of the procedures, showing that the proposed prior is both asymptotically and intrinsically consistent, and that its performance is at least competitive with, and in some settings superior to, that of alternative approaches in the literature.
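The Laplace expansion the abstract refers to replaces an intractable likelihood with a Gaussian centered at its mode, whose precision is the negative Hessian there. The sketch below illustrates that building block for logistic regression only; it is a generic Laplace approximation with hypothetical names and simulated data, not the paper's PEP construction itself.

```python
# Hypothetical sketch: Laplace (Gaussian) approximation to a logistic
# regression likelihood. All function names and the simulated data are
# illustrative assumptions, not taken from the paper.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_approx(X, y, n_iter=50):
    """Newton iterations to the mode beta_hat; the observed information
    (negative Hessian of the log-likelihood) at the mode supplies the
    precision matrix of the approximating Gaussian."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = sigmoid(X @ beta)
        W = mu * (1.0 - mu)                      # IRLS weights
        H = X.T @ (W[:, None] * X)               # observed information
        grad = X.T @ (y - mu)                    # score vector
        beta = beta + np.linalg.solve(H, grad)   # Newton step
    mu = sigmoid(X @ beta)
    cov = np.linalg.inv(X.T @ ((mu * (1.0 - mu))[:, None] * X))
    return beta, cov

# Simulated example data (assumed, for illustration only)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([-0.5, 1.0])
y = (rng.uniform(size=200) < sigmoid(X @ true_beta)).astype(float)

beta_hat, cov = laplace_approx(X, y)
```

Note that if the simulated data were separable, the mode would not exist and the Newton iterations would diverge; that failure mode is precisely the challenge for logistic regression that the abstract says motivates the PEP construction.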
