Bayesian Inference for $k$-Monotone Densities with Applications to Multiple Testing

8 June 2023
Kang-Kang Wang
S. Ghosal
Abstract

Shape restriction, like monotonicity or convexity, imposed on a function of interest, such as a regression or density function, allows for its estimation without smoothness assumptions. The concept of $k$-monotonicity encompasses a family of shape restrictions, including decreasing and convex decreasing as special cases corresponding to $k=1$ and $k=2$. We consider Bayesian approaches to estimate a $k$-monotone density. By utilizing a kernel mixture representation and putting a Dirichlet process or a finite mixture prior on the mixing distribution, we show that the posterior contraction rate in the Hellinger distance is $(n/\log n)^{-k/(2k+1)}$ for a $k$-monotone density, which is minimax optimal up to a polylogarithmic factor. When the true $k$-monotone density is a finite $J_0$-component mixture of the kernel, the contraction rate improves to the nearly parametric rate $\sqrt{(J_0 \log n)/n}$. Moreover, by putting a prior on $k$, we show that the same rates hold even when the best value of $k$ is unknown. A specific application in modeling the density of $p$-values in a large-scale multiple testing problem is considered. Simulation studies are conducted to evaluate the performance of the proposed method.
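The kernel mixture representation mentioned in the abstract can be sketched numerically. A standard representation in the $k$-monotone literature (assumed here; the paper's exact kernel parametrization may differ) writes a $k$-monotone density on $(0,\infty)$ as a mixture $f(x) = \int \psi_k(x;\theta)\,dG(\theta)$ of scaled kernels $\psi_k(x;\theta) = (k/\theta)(1-x/\theta)_+^{k-1}$, so that $k=1$ gives uniform kernels (decreasing mixtures) and $k=2$ triangular kernels (convex decreasing mixtures). The function names below are illustrative, not from the paper:

```python
import numpy as np

def k_monotone_kernel(x, theta, k):
    """Kernel psi_k(x; theta) = (k/theta) * (1 - x/theta)_+^{k-1} on (0, theta).
    k = 1 is Uniform(0, theta); k = 2 is a triangular (convex decreasing) density."""
    x = np.asarray(x, dtype=float)
    u = np.clip(1.0 - x / theta, 0.0, None)
    # The explicit support indicator also handles k = 1, where u**0 = 1 everywhere.
    return np.where((x >= 0) & (x <= theta), (k / theta) * u ** (k - 1), 0.0)

def finite_mixture_density(x, weights, thetas, k):
    """Finite mixture sum_j w_j * psi_k(x; theta_j) -- the form that a
    finite-mixture prior on the mixing distribution G concentrates on."""
    return sum(w * k_monotone_kernel(x, t, k) for w, t in zip(weights, thetas))

# Example: a 2-component convex decreasing (k = 2) mixture.
weights = [0.6, 0.4]
thetas = [0.5, 2.0]
grid = np.linspace(0.0, 2.0, 200001)
f = finite_mixture_density(grid, weights, thetas, k=2)

# Trapezoid rule: total mass should be close to 1,
# and f should be nonincreasing, as a 2-monotone density must be.
dx = grid[1] - grid[0]
mass = float(np.sum((f[:-1] + f[1:]) * 0.5 * dx))
```

In a Bayesian fit along the lines the abstract describes, the weights and support points above would be given a Dirichlet process or finite mixture prior rather than fixed by hand.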
