Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

12 July 2017
Damek Davis, Benjamin Grimmer

Papers citing "Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems"

27 papers shown.

1. Stochastic Momentum Methods for Non-smooth Non-Convex Finite-Sum Coupled Compositional Optimization. Xingyu Chen, Bokun Wang, Ming-Hsuan Yang, Quanqi Hu, Qihang Lin, Tianbao Yang. 03 Jun 2025.
2. Stochastic Primal-Dual Double Block-Coordinate for Two-way Partial AUC Maximization. Linli Zhou, Bokun Wang, My T. Thai, Tianbao Yang. 28 May 2025.
3. Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization. Benjamin Grimmer, Danlin Li. 31 Dec 2024.
4. Towards Certified Unlearning for Deep Neural Networks. Binchi Zhang, Yushun Dong, Tianhao Wang, Wenlin Yao. 01 Aug 2024.
5. Coordinate Descent Methods for Fractional Minimization. Ganzhao Yuan. 30 Jan 2022.
6. Learning Proximal Operators to Discover Multiple Optima. Lingxiao Li, Noam Aigerman, Vladimir G. Kim, Jiajin Li, Kristjan Greenewald, Mikhail Yurochkin, Justin Solomon. 28 Jan 2022.
7. Coordinate Descent Methods for DC Minimization: Optimality Conditions and Global Convergence. Ganzhao Yuan. 09 Sep 2021.
8. Stability and Convergence of Stochastic Gradient Clipping: Beyond Lipschitz Continuity and Smoothness. Vien V. Mai, M. Johansson. 12 Feb 2021.
9. Practical Precoding via Asynchronous Stochastic Successive Convex Approximation. Basil M. Idrees, J. Akhtar, K. Rajawat. 03 Oct 2020.
10. Approximation Benefits of Policy Gradient Methods with Aggregated States. Daniel Russo. 22 Jul 2020.
11. The Landscape of the Proximal Point Method for Nonconvex-Nonconcave Minimax Optimization. Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni. 15 Jun 2020.
12. Convergence of adaptive algorithms for weakly convex constrained optimization. Ahmet Alacaoglu, Yura Malitsky, Volkan Cevher. 11 Jun 2020.
13. Adaptive First-and Zeroth-order Methods for Weakly Convex Stochastic Optimization Problems. Parvin Nazari, Davoud Ataee Tarzanagh, George Michailidis. 19 May 2020.
14. Revisiting SGD with Increasingly Weighted Averaging: Optimization and Generalization Perspectives. Zhishuai Guo, Yan Yan, Tianbao Yang. 09 Mar 2020.
15. Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization. Quoc Tran-Dinh, Nhan H. Pham, Lam M. Nguyen. 17 Feb 2020.
16. Efficiency of Coordinate Descent Methods For Structured Nonconvex Optimization. Qi Deng, Chen Lan. 03 Sep 2019.
17. Stochastic Optimization for Non-convex Inf-Projection Problems. Yan Yan, Yi Tian Xu, Lijun Zhang, Xiaoyu Wang, Tianbao Yang. 26 Aug 2019.
18. Stochastic First-order Methods for Convex and Nonconvex Functional Constrained Optimization. Digvijay Boob, Qi Deng, Guanghui Lan. 07 Aug 2019.
19. Stochastic algorithms with geometric step decay converge linearly on sharp functions. Damek Davis, Dmitriy Drusvyatskiy, Vasileios Charisopoulos. 22 Jul 2019.
20. A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization. Quoc Tran-Dinh, Nhan H. Pham, T. Dzung, Lam M. Nguyen. 08 Jul 2019.
21. Global Optimality Guarantees For Policy Gradient Methods. Jalaj Bhandari, Daniel Russo. 05 Jun 2019.
22. On the Computation and Communication Complexity of Parallel SGD with Dynamic Batch Sizes for Stochastic Non-Convex Optimization. Hao Yu, Rong Jin. 10 May 2019.
23. Stochastic Optimization for DC Functions and Non-smooth Non-convex Regularizers with Non-asymptotic Convergence. Yi Tian Xu, Qi Qi, Qihang Lin, Rong Jin, Tianbao Yang. 28 Nov 2018.
24. Weakly-Convex Concave Min-Max Optimization: Provable Algorithms and Applications in Machine Learning. Hassan Rafique, Mingrui Liu, Qihang Lin, Tianbao Yang. 04 Oct 2018.
25. Stochastic subgradient method converges at the rate $O(k^{-1/4})$ on weakly convex functions. Damek Davis, Dmitriy Drusvyatskiy. 08 Feb 2018.
26. Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity. Benjamin Grimmer. 12 Dec 2017.
27. Stochastic Methods for Composite and Weakly Convex Optimization Problems. John C. Duchi, Feng Ruan. 24 Mar 2017.