Simple Alternating Minimization Provably Solves Complete Dictionary Learning

Abstract

This paper focuses on the noiseless complete dictionary learning problem, where the goal is to represent a set of given signals as linear combinations of a small number of atoms from a learned dictionary. Theoretical and practical studies of dictionary learning face two main challenges: the lack of theoretical guarantees for heuristic algorithms used in practice, and their poor scalability on huge-scale datasets. Towards addressing these issues, we propose a simple and efficient algorithm that provably recovers the ground truth when applied to the nonconvex and discrete formulation of the problem in the noiseless setting. We also extend our proposed method to mini-batch and online settings, where the data is huge-scale or arrives continuously over time. At the core of our method lies an efficient preconditioning technique that transforms the unknown dictionary into a near-orthonormal one, for which we prove that a simple alternating minimization scheme converges linearly to the ground truth under minimal conditions. Our numerical experiments on synthetic and real datasets showcase the superiority of our method compared with existing techniques.
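To illustrate the overall pipeline described above, the following is a minimal, generic sketch of precondition-then-alternate for complete dictionary learning, not the authors' exact algorithm: with noiseless data Y = A X (A square and invertible, X sparse), whitening Y makes the effective dictionary near-orthonormal, after which one can alternate between a hard-thresholding sparse-coding step and an orthogonal Procrustes dictionary update. The threshold value and iteration count are illustrative choices.

```python
import numpy as np

def precondition(Y):
    """Whiten Y = A X so the effective dictionary becomes near-orthonormal.

    With E[x x^T] proportional to I, the empirical covariance of Y is
    roughly A A^T, so multiplying by its inverse square root makes the
    transformed dictionary close to orthonormal.
    """
    m = Y.shape[1]
    C = Y @ Y.T / m
    # Inverse matrix square root via eigendecomposition of the covariance.
    w, V = np.linalg.eigh(C)
    P = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    return P @ Y

def alt_min(Y, iters=50, thresh=0.3):
    """Alternating minimization over (D, X) with D kept orthonormal.

    Sparse-coding step: hard-threshold D^T Y.
    Dictionary step: orthogonal Procrustes, i.e. the orthogonal D
    minimizing ||Y - D X||_F, obtained from an SVD of Y X^T.
    """
    n = Y.shape[0]
    rng = np.random.default_rng(0)
    D, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal init
    X = np.zeros_like(Y)
    for _ in range(iters):
        X = D.T @ Y
        X[np.abs(X) < thresh] = 0.0          # hard thresholding
        U, _, Vt = np.linalg.svd(Y @ X.T)    # Procrustes update
        D = U @ Vt
    return D, X
```

By construction the Procrustes step returns an exactly orthonormal D at every iteration, which is what makes the preceding whitening step useful: once the true dictionary is near-orthonormal, restricting the iterates to orthonormal matrices loses little.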

@article{liang2025_2210.12816,
  title={Simple Alternating Minimization Provably Solves Complete Dictionary Learning},
  author={Geyu Liang and Gavin Zhang and Salar Fattahi and Richard Y. Zhang},
  journal={arXiv preprint arXiv:2210.12816},
  year={2025}
}