EM Converges for a Mixture of Many Linear Regressions

28 May 2019
Jeongyeol Kwon
Constantine Caramanis
Abstract

We study the convergence of the Expectation-Maximization (EM) algorithm for mixtures of linear regressions with an arbitrary number $k$ of components. We show that as long as the signal-to-noise ratio (SNR) is $\tilde{\Omega}(k)$, well-initialized EM converges to the true regression parameters. Previous results for $k \geq 3$ had only established local convergence in the noiseless setting, i.e., where the SNR is infinitely large. Our results extend the analysis to the noisy setting and, notably, establish a statistical error rate that is independent of the norm (or pairwise distance) of the regression parameters. In particular, our results imply exact recovery as $\sigma \rightarrow 0$, in contrast to most previous local convergence results for EM, where the statistical error scaled with the norm of the parameters. Standard moment-method approaches may be applied to guarantee that we start in the region where our local convergence guarantees apply.
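To make the setting concrete, the following is a minimal sketch of EM for a mixture of $k$ linear regressions with a known noise level $\sigma$: the E-step computes posterior responsibilities of each component for each sample, and the M-step solves a weighted least-squares problem per component. The function name, the random data in the usage example, and the `betas_init` argument are illustrative assumptions; the paper's guarantees presume a good (e.g. moment-method) initialization, which is emulated here by initializing near the truth.

```python
import numpy as np

def em_mixed_linear_regression(X, y, k, sigma, betas_init=None, n_iters=50, rng=None):
    """EM sketch for a mixture of k linear regressions with known noise sigma.

    X: (n, d) covariates; y: (n,) responses. Returns estimated regression
    parameters (k, d) and mixing weights (k,). Illustrative, not the
    paper's exact procedure.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Warm start if provided; otherwise a random init (no guarantees then).
    betas = np.array(betas_init, dtype=float) if betas_init is not None \
        else rng.standard_normal((k, d))
    weights = np.full(k, 1.0 / k)  # uniform mixing weights
    for _ in range(n_iters):
        # E-step: responsibility r[i, j] ∝ weights[j] * N(y_i | x_i^T beta_j, sigma^2)
        resid = y[:, None] - X @ betas.T                 # (n, k) residuals
        log_r = np.log(weights) - resid**2 / (2 * sigma**2)
        log_r -= log_r.max(axis=1, keepdims=True)        # numerical stabilization
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted least squares for each component
        for j in range(k):
            w = r[:, j]
            A = X.T @ (w[:, None] * X) + 1e-8 * np.eye(d)  # tiny ridge for safety
            b = X.T @ (w * y)
            betas[j] = np.linalg.solve(A, b)
        weights = r.mean(axis=0)
    return betas, weights

# Usage: two well-separated components, high SNR, init near the truth.
rng = np.random.default_rng(0)
n, d, k = 2000, 2, 2
true_betas = np.array([[3.0, 0.0], [0.0, 3.0]])
X = rng.standard_normal((n, d))
z = rng.integers(0, k, size=n)
y = np.einsum("ij,ij->i", X, true_betas[z]) + 0.1 * rng.standard_normal(n)
init = true_betas + 0.3 * rng.standard_normal((k, d))
betas_hat, weights_hat = em_mixed_linear_regression(X, y, k, sigma=0.1, betas_init=init)
```

Consistent with the abstract's claim, at high SNR and with a warm start the estimation error here is governed by $\sigma$ and the sample size, not by the norm of the true parameters.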
