Convergence radius and sample complexity of ITKM algorithms for dictionary learning

24 March 2015
Karin Schnass
Abstract

In this work we show that iterative thresholding and K-means (ITKM) algorithms can recover a generating dictionary with $K$ atoms from noisy $S$-sparse signals up to an error $\tilde\varepsilon$ as long as the initialisation is within a convergence radius, that is up to a $\log K$ factor inversely proportional to the dynamic range of the signals, and the sample size is proportional to $K \log K \, \tilde\varepsilon^{-2}$. The results are valid for arbitrary target errors if the sparsity level is of the order of the square root of the signal dimension $d$, and for target errors down to $K^{-\ell}$ if $S$ scales as $S \leq d/(\ell \log K)$.
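
To make the algorithm family concrete, below is a minimal sketch of one iteration in the spirit of the signed-means variant of ITKM: each signal is thresholded against the current dictionary to pick the $S$ most correlated atoms, the selected atoms accumulate signed copies of the signal, and the updated atoms are renormalised. This is an illustrative simplification, not the paper's exact update rule; the function name, inputs, and the choice of the signed-means update (rather than a residual-means update) are assumptions.

```python
import numpy as np

def itkm_step(Y, D, S):
    """One ITKM-style iteration (signed-means sketch).

    Y : (d, N) array of noisy S-sparse signals.
    D : (d, K) current dictionary estimate with unit-norm columns.
    S : sparsity level used in the thresholding step.
    """
    corr = D.T @ Y                      # (K, N) inner products <psi_k, y_n>
    D_new = np.zeros_like(D)
    for n in range(Y.shape[1]):
        # Thresholding: keep the S atoms with largest absolute correlation.
        support = np.argsort(-np.abs(corr[:, n]))[:S]
        # Signed means: add sign(<psi_k, y_n>) * y_n to each selected atom.
        D_new[:, support] += np.sign(corr[support, n])[None, :] * Y[:, [n]]
    # Renormalise the updated atoms (guard against empty atoms).
    norms = np.linalg.norm(D_new, axis=0)
    norms[norms == 0] = 1.0
    return D_new / norms
```

Iterating this step from an initialisation within the convergence radius described above is what the paper's recovery and sample-complexity guarantees address.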
