Enhancing Unsupervised Feature Selection via Double Sparsity Constrained Optimization

3 January 2025
Xianchao Xiu, Anning Yang, Chenyi Huang, Xinrong Li, Wanquan Liu
arXiv:2501.00726
Abstract

Unsupervised feature selection (UFS) is widely applied in machine learning and pattern recognition. However, most existing methods consider only a single type of sparsity, which makes it difficult to select valuable and discriminative feature subsets from the original high-dimensional feature set. In this paper, we propose a new UFS method, DSCOFS, that embeds double sparsity constrained optimization into the classical principal component analysis (PCA) framework. Double sparsity refers to constraining variables simultaneously with the $\ell_{2,0}$-norm and the $\ell_0$-norm; combining these different types of sparsity improves the accuracy of identifying discriminative features. The core idea is that the $\ell_{2,0}$-norm removes irrelevant and redundant features, while the $\ell_0$-norm filters out irregular noisy features, thereby complementing the $\ell_{2,0}$-norm and improving discrimination. An effective proximal alternating minimization method is proposed to solve the resulting nonconvex, nonsmooth model, and we rigorously prove that the sequence it generates converges globally to a stationary point. Numerical experiments on three synthetic datasets and eight real-world datasets demonstrate the effectiveness, stability, and convergence of the proposed method. In particular, the average clustering accuracy (ACC) and normalized mutual information (NMI) improve by at least 3.34% and 3.02%, respectively, over state-of-the-art methods. Moreover, two common statistical tests and a new feature similarity metric verify the advantages of double sparsity. All results suggest that DSCOFS offers a new perspective on feature selection.
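As a concrete illustration of the ingredients the abstract names, the sketch below puts a double-sparsity constraint into a PCA-style model solved by proximal alternating minimization. The actual DSCOFS objective is not stated in the abstract, so everything here is an assumption made for illustration: a hypothetical reconstruction objective $\|X - XWH\|_F^2$, a row-sparse projection matrix $W$ whose nonzero rows index the selected features (the $\ell_{2,0}$ constraint), and an entry-sparse coefficient matrix $H$ (the $\ell_0$ constraint). The function names, variable split, and step sizes are illustrative, not the paper's formulation.

```python
# Minimal sketch of double-sparsity-constrained feature selection in a
# PCA-style model, solved by proximal alternating minimization.
# The objective f(W, H) = ||X - X W H||_F^2 and all names below are
# illustrative assumptions, not the DSCOFS formulation from the paper.
import numpy as np


def proj_l20(W, s):
    """Project W onto {W : ||W||_{2,0} <= s}: keep the s rows with the
    largest l2 norms and zero out the rest."""
    out = np.zeros_like(W)
    keep = np.argsort(np.linalg.norm(W, axis=1))[-s:]  # s largest rows
    out[keep] = W[keep]
    return out


def proj_l0(H, s):
    """Project H onto {H : ||H||_0 <= s}: keep the s largest-magnitude
    entries and zero out the rest."""
    flat = np.abs(H).ravel()
    if s >= flat.size:
        return H.copy()
    thresh = np.partition(flat, -s)[-s]  # magnitude of the s-th largest entry
    return np.where(np.abs(H) >= thresh, H, 0.0)


def dscofs_sketch(X, k, s_rows, s_entries, n_iter=200, seed=0):
    """Alternate proximal gradient steps on W (l_{2,0}-constrained) and
    H (l_0-constrained) for the assumed objective ||X - X W H||_F^2.
    The nonzero rows of the returned W index the selected features."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = proj_l20(rng.standard_normal((d, k)), s_rows)
    H = proj_l0(rng.standard_normal((k, d)), s_entries)
    G = X.T @ X  # cached Gram matrix, reused for step-size estimates
    for _ in range(n_iter):
        # W-step: gradient step, then projection onto the l_{2,0} ball.
        R = X - X @ W @ H
        grad_W = -2.0 * X.T @ R @ H.T
        L_W = 2.0 * np.linalg.norm(G, 2) * np.linalg.norm(H @ H.T, 2) + 1e-12
        W = proj_l20(W - grad_W / L_W, s_rows)
        # H-step: gradient step, then projection onto the l_0 ball.
        R = X - X @ W @ H
        grad_H = -2.0 * W.T @ X.T @ R
        L_H = 2.0 * np.linalg.norm(W.T @ G @ W, 2) + 1e-12
        H = proj_l0(H - grad_H / L_H, s_entries)
    return W


# Toy usage: select 5 of 20 features from random data.
X = np.random.default_rng(1).standard_normal((100, 20))
W = dscofs_sketch(X, k=4, s_rows=5, s_entries=30)
print("selected features:", np.flatnonzero(np.linalg.norm(W, axis=1)))
```

Each half-step is a proximal gradient update: because the constraints are hard sparsity balls, their proximal operators reduce to the hard-thresholding projections `proj_l20` and `proj_l0`, which is what makes the model nonconvex and nonsmooth yet cheap to iterate.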
