
arXiv:1907.09064v3

Progressive Stochastic Greedy Sparse Reconstruction and Support Selection

22 July 2019
Abolfazl Hashemi
H. Vikalo
G. Veciana
Abstract

Sparse reconstruction and sparse support selection, i.e., the tasks of inferring an arbitrary $m$-dimensional sparse vector $\mathbf{x}$ having $k$ nonzero entries from $n$ measurements of linear combinations of its components, are often encountered in machine learning, computer vision, and signal processing. Existing greedy algorithms achieve the optimal $n = \mathcal{O}(k\log\frac{m}{k})$ sampling complexity with computational complexity that is linear in the data size $m$ and the cardinality constraint $k$. However, the resulting $\mathcal{O}(mk)$ computational complexity is still prohibitive for large-scale datasets. In this paper, we present the first sparse support selection algorithm for arbitrary sparse vectors that achieves exact identification of the optimal subset from $n = \mathcal{O}(k\log\frac{m}{k})$ measurements with $\tilde{\mathcal{O}}(m)$ computational complexity. The proposed scheme randomly restricts the search space of the greedy method in a progressive manner, reducing the computational cost while maintaining the same order of sampling complexity as existing greedy schemes. Simulation results, including an application of the algorithm to the task of column subset selection, demonstrate the efficacy of the proposed algorithm.
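The core idea described in the abstract — a greedy selection step that scores only a random subset of the $m$ candidate coordinates instead of all of them — can be sketched as a stochastic variant of orthogonal matching pursuit. This is a minimal illustrative sketch, not the authors' exact algorithm: the function name, the fixed per-iteration sample size $\lceil (m/k)\log(1/\epsilon) \rceil$ (a common choice in stochastic-greedy methods), and the parameter `eps` are assumptions, and the paper's "progressive" restriction schedule may differ.

```python
import numpy as np

def stochastic_greedy_omp(A, y, k, eps=0.1, rng=None):
    """Illustrative sketch: OMP-style support selection where each
    iteration scores only a random subset of the columns of A.

    A   : (n, m) measurement matrix
    y   : (n,) observations, y = A @ x for a k-sparse x
    k   : cardinality constraint (number of nonzeros to select)
    eps : controls the per-iteration sample size (assumed heuristic)
    """
    rng = np.random.default_rng(rng)
    m = A.shape[1]
    support = []
    residual = y.copy()
    for _ in range(k):
        remaining = np.setdiff1d(np.arange(m), support)
        # Restricted search space: sample ~ (m/k) * log(1/eps) candidates
        # instead of scanning all m columns (the O(mk) bottleneck).
        s = min(len(remaining), int(np.ceil(m / k * np.log(1.0 / eps))))
        cand = rng.choice(remaining, size=s, replace=False)
        # Greedy step over the sampled candidates only: pick the column
        # most correlated with the current residual.
        scores = np.abs(A[:, cand].T @ residual)
        support.append(int(cand[np.argmax(scores)]))
        # Least-squares refit on the current support, then update residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(m)
    x_hat[support] = coef
    return x_hat, sorted(support)
```

Each iteration costs roughly the sample size times $n$ rather than $mn$, which is how randomized restriction trades a small failure probability for a near-linear overall cost in $m$.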
