On Fast Convergence of Proximal Algorithms for SQRT-Lasso Optimization: Don't Worry About Its Nonsmooth Loss Function

25 May 2016 · arXiv:1605.07950
Xingguo Li, Haoming Jiang, Jarvis Haupt, R. Arora, Han Liu, Mingyi Hong, T. Zhao

Papers citing "On Fast Convergence of Proximal Algorithms for SQRT-Lasso Optimization: Don't Worry About Its Nonsmooth Loss Function"

3 papers shown
Support recovery and sup-norm convergence rates for sparse pivotal estimation
Mathurin Massias, Quentin Bertrand, Alexandre Gramfort, Joseph Salmon
15 Jan 2020
On Quadratic Convergence of DC Proximal Newton Algorithm for Nonconvex Sparse Learning in High Dimensions
Xingguo Li, Lin F. Yang, J. Ge, Jarvis Haupt, Tong Zhang, T. Zhao
19 Jun 2017
Efficient Smoothed Concomitant Lasso Estimation for High Dimensional Regression
Eugène Ndiaye, Olivier Fercoq, Alexandre Gramfort, V. Leclère, Joseph Salmon
08 Jun 2016