On Fast Convergence of Proximal Algorithms for SQRT-Lasso Optimization: Don't Worry About Its Nonsmooth Loss Function
25 May 2016
Xingguo Li, Haoming Jiang, Jarvis Haupt, R. Arora, Han Liu, Mingyi Hong, T. Zhao
arXiv: 1605.07950 (abs · PDF · HTML)
Papers citing "On Fast Convergence of Proximal Algorithms for SQRT-Lasso Optimization: Don't Worry About Its Nonsmooth Loss Function" (3 papers)
Support recovery and sup-norm convergence rates for sparse pivotal estimation
Mathurin Massias, Quentin Bertrand, Alexandre Gramfort, Joseph Salmon
15 Jan 2020
On Quadratic Convergence of DC Proximal Newton Algorithm for Nonconvex Sparse Learning in High Dimensions
Xingguo Li, Lin F. Yang, J. Ge, Jarvis Haupt, Tong Zhang, T. Zhao
19 Jun 2017
Efficient Smoothed Concomitant Lasso Estimation for High Dimensional Regression
Eugène Ndiaye, Olivier Fercoq, Alexandre Gramfort, V. Leclère, Joseph Salmon
08 Jun 2016