An Incidence Geometry approach to Dictionary Learning

Abstract

We study Dictionary Learning (also known as sparse coding). By geometrically interpreting an exact formulation of Dictionary Learning, we identify related problems and draw formal relationships among them. Dictionary Learning is viewed as the problem of finding a minimum generating set of a subspace arrangement. This formulation leads to a new family of dictionary learning algorithms. When the data are sufficiently general and the dictionary size is sufficiently large, we completely characterize the combinatorics of the associated subspace arrangements (i.e., their underlying hypergraphs). This characterization is obtained using algebraic and combinatorial geometry. Specifically, we prove a combinatorial rigidity-type theorem that characterizes the hypergraphs of subspace arrangements that generically yield (a) at least one dictionary, or (b) a locally unique dictionary (i.e., at most a finite number of isolated dictionaries) of the specified size. We are unaware of prior applications of combinatorial rigidity techniques in the setting of Dictionary Learning, or even in machine learning. We list directions for further research that this approach opens up.
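For context, here is a minimal sketch of the kind of exact, s-sparse formulation the abstract refers to; the notation ($y_i$, $D$, $s$) is ours and not necessarily the paper's.

\begin{align*}
% Sketch of exact s-sparse dictionary learning (our notation, not necessarily the paper's).
% Given data points y_1, ..., y_m in R^d and a sparsity level s, find a smallest
% dictionary D such that every data point is an s-sparse combination of its vectors:
\min_{D \subset \mathbb{R}^d} \; |D|
\quad \text{subject to} \quad
\forall i:\; y_i \in \operatorname{span}(D_i)
\ \text{for some } D_i \subseteq D,\ |D_i| \le s .
\end{align*}

Geometrically, each $y_i$ must lie on one of the (at most) $s$-dimensional subspaces spanned by $s$-subsets of $D$, i.e., on a subspace arrangement generated by $D$; the problem is then to find a minimum generating set for such an arrangement, which is the viewpoint the abstract describes.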
