Fisher Efficient Inference of Intractable Models

Abstract

The Maximum Likelihood Estimator (MLE) has many desirable properties. For example, the asymptotic variance of its solution attains the asymptotic Cramér-Rao lower bound (efficiency bound), which is the minimum possible variance for an unbiased estimator. However, obtaining an MLE solution requires evaluating the likelihood function, which may be intractable due to the normalization term of the density model. In this paper, we derive a Discriminative Likelihood Estimator (DLE) from a Kullback-Leibler divergence minimization criterion, implemented via a density ratio estimation procedure and a Stein operator. We study the problem of model inference using DLE and, in particular, prove that the asymptotic variance of its solution can also attain the efficiency bound under mild regularity conditions. Numerical studies validate our asymptotic theorems and show that DLE indeed performs well in various settings.
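The efficiency claim above can be illustrated with a textbook case (a sketch not taken from the paper): for a Gaussian with known variance, the MLE of the mean is the sample average, and its variance attains the Cramér-Rao lower bound σ²/n.

```python
import numpy as np

# Illustration of MLE efficiency on a tractable model (Gaussian mean,
# known variance); the paper's DLE targets models where the likelihood
# itself is intractable.
rng = np.random.default_rng(0)
sigma, n, trials = 2.0, 100, 20000

# Draw `trials` independent datasets of size n.
samples = rng.normal(loc=1.0, scale=sigma, size=(trials, n))
mle = samples.mean(axis=1)          # MLE of the mean for each dataset
empirical_var = mle.var()           # variance of the estimator across trials
crlb = sigma**2 / n                 # Cramér-Rao lower bound

print(empirical_var, crlb)          # the two values nearly coincide
```

The empirical variance of the MLE matches the bound up to Monte Carlo error, which is the finite-sample face of the asymptotic efficiency that the abstract claims DLE also achieves.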
