Neural-g: A Deep Learning Framework for Mixing Density Estimation

Mixing (or prior) density estimation is an important problem in machine learning and statistics, especially in empirical Bayes g-modeling, where accurately estimating the prior is necessary for making good posterior inferences. In this paper, we propose neural-g, a new neural network-based estimator for g-modeling. Neural-g uses a softmax output layer to ensure that the estimated prior is a valid probability density. Under default hyperparameters, we show that neural-g is very flexible and capable of capturing many unknown densities, including those with flat regions, heavy tails, and/or discontinuities. In contrast, existing methods struggle to capture all of these prior shapes. We provide justification for neural-g by establishing a new universal approximation theorem regarding the capability of neural networks to learn arbitrary probability mass functions. To accelerate convergence of our numerical implementation, we utilize a weighted average gradient descent approach to update the network parameters. Finally, we extend neural-g to multivariate prior density estimation. We illustrate the efficacy of our approach through simulations and analyses of real datasets. A software package to implement neural-g is publicly available at https://github.com/shijiew97/neuralG.
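The core idea of the softmax output layer can be sketched as follows. This is a minimal illustration, not the package's implementation: it assumes the prior is supported on a fixed grid of K points, and the network outputs (here a random vector standing in for learned logits) are passed through a softmax so the estimated prior is automatically non-negative and sums to one. All names here are illustrative.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
K = 50                        # number of grid points supporting the prior (assumed)
logits = rng.normal(size=K)   # stand-in for the network's final-layer outputs
g_hat = softmax(logits)       # estimated prior mass function on the grid

# The softmax guarantees a valid probability mass function:
assert np.isclose(g_hat.sum(), 1.0)
assert (g_hat >= 0).all()
```

Because the softmax enforces the simplex constraint by construction, the network weights can be optimized without any explicit non-negativity or normalization penalties.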