Faster Subgradient Methods for Functions with Hölderian Growth

1 April 2017
Patrick R. Johnstone, P. Moulin

Papers citing "Faster Subgradient Methods for Functions with Hölderian Growth"

6 papers shown

Beyond $\mathcal{O}(\sqrt{T})$ Regret: Decoupling Learning and Decision-making in Online Linear Programming
Wenzhi Gao, Dongdong Ge, Chenyu Xue, Chunlin Sun, Yinyu Ye
06 Jan 2025

Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization
Benjamin Grimmer, Danlin Li
31 Dec 2024

Federated Learning on Adaptively Weighted Nodes by Bilevel Optimization
Yan Huang, Qihang Lin, N. Street, Stephen Seung-Yeob Baek
21 Jul 2022

Stochastic algorithms with geometric step decay converge linearly on sharp functions
Damek Davis, Dmitriy Drusvyatskiy, Vasileios Charisopoulos
22 Jul 2019

An Optimal-Storage Approach to Semidefinite Programming using Approximate Complementarity
Lijun Ding, A. Yurtsever, Volkan Cevher, J. Tropp, Madeleine Udell
09 Feb 2019

AsySPA: An Exact Asynchronous Algorithm for Convex Optimization Over Digraphs
Jiaqi Zhang, Keyou You
13 Aug 2018