Neural-g: A Deep Learning Framework for Mixing Density Estimation

Abstract

Mixing (or prior) density estimation is an important problem in machine learning and statistics, especially in empirical Bayes g-modeling, where accurately estimating the prior is necessary for making good posterior inferences. In this paper, we propose neural-g, a new neural network-based estimator for g-modeling. Neural-g uses a softmax output layer to ensure that the estimated prior is a valid probability density. Under default hyperparameters, we show that neural-g is very flexible and capable of capturing many unknown densities, including those with flat regions, heavy tails, and/or discontinuities. In contrast, existing methods struggle to capture all of these prior shapes. We provide justification for neural-g by establishing a new universal approximation theorem regarding the capability of neural networks to learn arbitrary probability mass functions. To accelerate convergence of our numerical implementation, we utilize a weighted average gradient descent approach to update the network parameters. Finally, we extend neural-g to multivariate prior density estimation. We illustrate the efficacy of our approach through simulations and analyses of real datasets. A software package to implement neural-g is publicly available at https://github.com/shijiew97/neuralG.
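To make the abstract's core construction concrete, the following is a minimal, hypothetical PyTorch sketch of the idea it describes: a network whose softmax output layer represents the prior g as a valid probability mass function over a fixed grid of support points, trained by maximizing the marginal likelihood of the observations. This is not the authors' implementation (see the linked neuralG repository); the class name NeuralGSketch, the Gaussian likelihood, the grid, and all hyperparameters are illustrative assumptions, and plain Adam stands in here for the paper's weighted average gradient descent.

```python
import torch
import torch.nn as nn

class NeuralGSketch(nn.Module):
    """Hypothetical sketch: an MLP whose softmax output is a pmf over a grid."""

    def __init__(self, grid_size: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, grid_size),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # Softmax guarantees a valid pmf: nonnegative weights summing to 1.
        return torch.softmax(self.net(z), dim=-1)

# Toy usage: estimate a prior over a grid from Gaussian observations
# y_i | theta_i ~ N(theta_i, 1), theta_i ~ g (a bimodal toy prior here).
grid = torch.linspace(-5.0, 5.0, 100)                      # support points for g
y = torch.randn(200) + torch.sign(torch.randn(200)) * 2.0  # toy data
model = NeuralGSketch(grid_size=grid.numel())
opt = torch.optim.Adam(model.parameters(), lr=1e-3)        # stand-in optimizer
z = torch.zeros(1, 1)                                      # fixed dummy input

# n x grid_size matrix of likelihood values p(y_i | theta_j).
lik = torch.distributions.Normal(loc=grid, scale=1.0).log_prob(y.unsqueeze(1)).exp()

for _ in range(500):
    g = model(z).squeeze(0)                     # current pmf estimate of the prior
    marginal = lik @ g                          # f(y_i) = sum_j p(y_i | theta_j) g_j
    loss = -torch.log(marginal + 1e-12).mean()  # negative log marginal likelihood
    opt.zero_grad()
    loss.backward()
    opt.step()
```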
