A Tight Bound of Hard Thresholding

5 May 2016
Jie Shen
Ping Li
arXiv: 1605.01656
Abstract

This paper is concerned with the hard thresholding technique, which sets all but the k largest absolute elements of a vector to zero. We establish a tight bound that quantitatively characterizes the deviation of the thresholded solution from a given signal. Our theoretical result is universal in the sense that it holds for all choices of parameters, and the underlying analysis depends only on fundamental arguments in mathematical optimization. We discuss the implications for two areas of the literature.

Compressed Sensing. Building on this crucial estimate, we establish the connection between the restricted isometry property (RIP) and the sparsity parameter k for a broad class of hard-thresholding-based algorithms, which yields an improved RIP condition, especially when the true sparsity is unknown. This suggests that, in essence, many more kinds of sensing matrices, or fewer measurements, are admissible for the data acquisition procedure.

Machine Learning. In large-scale machine learning, a significant yet challenging problem is producing sparse solutions in the online setting. In stark contrast to prior works that resorted to the ℓ₁ relaxation to promote sparsity, we present a novel algorithm that performs hard thresholding in each iteration to ensure such parsimonious solutions. Equipped with the developed bound for hard thresholding, we prove global linear convergence for a number of prevalent statistical models under mild assumptions, even though the problem turns out to be non-convex.
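To make the operator concrete, below is a minimal NumPy sketch (not the authors' code) of the hard thresholding operator H_k together with a plain iterative-hard-thresholding loop for sparse least squares, the kind of algorithm the RIP discussion above applies to. The function names hard_threshold and iht, the step-size choice, and the iteration count are illustrative assumptions.

```python
import numpy as np

def hard_threshold(x, k):
    """H_k(x): keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    if k <= 0:
        return out
    keep = np.argpartition(np.abs(x), -k)[-k:]  # indices of the k largest |x_i|
    out[keep] = x[keep]
    return out

def iht(A, y, k, step=None, n_iter=200):
    """Sketch of iterative hard thresholding for min 0.5*||Ax - y||^2 s.t. ||x||_0 <= k."""
    n = A.shape[1]
    if step is None:
        # Conservative step size from the spectral norm (an assumption, not from the paper).
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                 # gradient of the least-squares loss
        x = hard_threshold(x - step * grad, k)   # project back onto k-sparse vectors
    return x

# Toy usage: recover a 3-sparse signal from 80 Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
x_true[[3, 50, 120]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat = iht(A, y, k=3)
```

The gradient step followed by hard thresholding is exactly the pattern whose per-iteration deviation the paper's bound controls; a tighter bound on H_k translates into a weaker RIP requirement for loops of this shape.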
