On approximating the f-divergence between two Ising models
The f-divergence is a fundamental notion that measures the difference between two distributions. In this paper, we study the problem of approximating the f-divergence between two Ising models, which generalizes recent work on approximating the TV-distance. Given two Ising models p and q, each specified by its interaction matrix and external field, the problem is to approximate the f-divergence within an arbitrary relative error ε. For the α-divergence with a constant integer α, we establish both algorithmic and hardness results; the algorithm works in a parameter regime that matches the hardness result. Our algorithm extends to other f-divergences, including the χ²-divergence, the Kullback-Leibler divergence, the Rényi divergence, the Jensen-Shannon divergence, and the squared Hellinger distance.
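To make the quantity being approximated concrete, the following sketch computes an f-divergence between two small Ising models by exact enumeration, using the standard definitions D_f(p‖q) = Σ_x q(x)·f(p(x)/q(x)) and p(σ) ∝ exp(Σ_{i<j} J_ij σ_i σ_j + Σ_i h_i σ_i). This is not the paper's algorithm: brute-force enumeration takes time 2^n, and the paper's contribution is precisely about when efficient approximation is possible for large n. All variable names and the example matrices below are illustrative.

```python
import math
from itertools import product

def ising_distribution(J, h):
    """Exact probabilities of an Ising model over {-1,+1}^n by enumeration.

    p(s) is proportional to exp(sum_{i<j} J[i][j]*s_i*s_j + sum_i h[i]*s_i).
    Feasible only for small n (2^n configurations).
    """
    n = len(h)
    weights = []
    for s in product((-1, 1), repeat=n):
        energy = sum(J[i][j] * s[i] * s[j]
                     for i in range(n) for j in range(i + 1, n))
        energy += sum(h[i] * s[i] for i in range(n))
        weights.append(math.exp(energy))
    Z = sum(weights)  # partition function
    return [w / Z for w in weights]

def f_divergence(p, q, f):
    """D_f(p || q) = sum_x q(x) * f(p(x)/q(x)); Ising models have full support."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q))

# f(t) = t log t recovers the Kullback-Leibler divergence.
kl = lambda t: t * math.log(t)

# Two illustrative 3-spin models (interaction matrices and external fields).
J1 = [[0.0, 0.3, 0.0], [0.3, 0.0, 0.2], [0.0, 0.2, 0.0]]
J2 = [[0.0, 0.1, 0.0], [0.1, 0.0, 0.4], [0.0, 0.4, 0.0]]
h1 = [0.1, 0.0, -0.1]
h2 = [0.0, 0.2, 0.0]

p = ising_distribution(J1, h1)
q = ising_distribution(J2, h2)
print(f_divergence(p, p, kl))  # 0 for identical models
print(f_divergence(p, q, kl))  # positive when the models differ
```

Other divergences listed in the abstract arise from the same routine by swapping the convex function f, e.g. f(t) = (t - 1)² for the χ²-divergence or f(t) = (√t - 1)² for the squared Hellinger distance.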