Information Thresholds for Non-Parametric Structure Learning on Tree Graphical Models

Abstract

We provide high-probability finite-sample complexity guarantees for non-parametric structure learning of tree-shaped graphical models whose nodes are discrete random variables with either finite or countable alphabets, in both the noiseless and noisy regimes. We study a fundamental quantity called the (noisy) information threshold, which arises naturally from the error analysis of the Chow-Liu algorithm and, as we discuss, provides explicit necessary and sufficient conditions on sample complexity by effectively summarizing the difficulty of the tree-structure learning problem. Specifically, we show that the finite sample complexity of the Chow-Liu algorithm for ensuring exact structure recovery is inversely proportional to the information threshold (provided it is positive), and scales almost logarithmically in the ratio of the number of nodes to a prescribed probability of failure, matching relevant asymptotic results in the literature. Conversely, in the noiseless case, we show that, for arbitrarily small information thresholds, exact structure recovery from any finite number of samples becomes impossible for any algorithm whatsoever. Consequently, strict positivity of the information threshold characterizes the feasibility of tree-structure learning in general terms. Lastly, as a consequence of our analysis, we resolve the problem of tree-structure learning in the presence of non-identically distributed observation noise, providing conditions for the convergence of the Chow-Liu algorithm in this setting as well.
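For concreteness, the following is a minimal sketch of the (noiseless) Chow-Liu procedure analyzed above, assuming i.i.d. discrete samples arranged as an integer array of shape (number of samples, number of nodes). The plug-in mutual-information estimator and the networkx maximum-weight spanning tree used here are standard illustrative choices, not the paper's exact implementation.

```python
# Minimal Chow-Liu sketch (illustrative, not the authors' implementation).
# Assumes: samples is an integer ndarray of shape (n_samples, n_nodes),
# each column a discrete variable with values in {0, 1, ..., k}.
import itertools
import numpy as np
import networkx as nx

def plug_in_mi(x, y):
    """Empirical (plug-in) mutual information, in nats, between two discrete samples."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of x
    py = joint.sum(axis=0, keepdims=True)   # marginal of y
    mask = joint > 0
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())

def chow_liu_tree(samples):
    """Edge set of the maximum-weight spanning tree over pairwise empirical MIs."""
    n_nodes = samples.shape[1]
    g = nx.Graph()
    for i, j in itertools.combinations(range(n_nodes), 2):
        g.add_edge(i, j, weight=plug_in_mi(samples[:, i], samples[:, j]))
    return set(nx.maximum_spanning_tree(g).edges())
```

Read against the abstract, the sample-complexity claim says, informally, that this procedure recovers the true tree with probability at least 1 - delta once the number of samples exceeds a quantity inversely proportional to the information threshold and almost logarithmic in (number of nodes)/delta.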
