Smooth minimization of nonsmooth functions with parallel coordinate descent methods

Olivier Fercoq, Peter Richtárik
arXiv:1309.5885, 23 September 2013
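
For context on the method named in the title: parallel coordinate descent minimizes a function by updating a randomly sampled block of tau coordinates per iteration rather than one at a time. The sketch below is an illustrative assumption, not code from the paper: it applies a tau-nice-sampling variant to a plain smooth least-squares objective (the paper itself treats smoothed nonsmooth functions), with an ESO-style damping factor beta in the spirit of this line of work; the function name and all parameters are hypothetical.

```python
import numpy as np

def parallel_coordinate_descent(A, b, tau=4, iters=3000, seed=0):
    """Hypothetical sketch: minimize f(x) = 0.5 * ||A @ x - b||**2 by
    updating a random block of tau coordinates per iteration."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    L = (A ** 2).sum(axis=0)                  # coordinate-wise Lipschitz constants
    omega = int((A != 0).sum(axis=1).max())   # max nonzeros per row of A
    # ESO-style damping for simultaneous updates; beta = tau for dense data.
    beta = 1.0 + (tau - 1) * (omega - 1) / max(n - 1, 1)
    residual = A @ x - b                      # maintained incrementally
    for _ in range(iters):
        S = rng.choice(n, size=tau, replace=False)  # tau-nice sampling
        grad_S = A[:, S].T @ residual               # partial gradient
        step = grad_S / (beta * L[S])               # damped coordinate steps
        x[S] -= step
        residual -= A[:, S] @ step                  # keep residual consistent
    return x

# Usage on a synthetic consistent system: the residual should shrink to ~0.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 10))
b = A @ np.ones(10)
x = parallel_coordinate_descent(A, b)
print(float(np.linalg.norm(A @ x - b)))
```

With tau = 1 this reduces to serial randomized coordinate descent, and with tau = n to a (damped) full gradient step; the damping factor is what keeps simultaneous updates from overshooting.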

Papers citing "Smooth minimization of nonsmooth functions with parallel coordinate descent methods" (7 papers)

  • Towards a Better Theoretical Understanding of Independent Subnetwork Training
    Egor Shulgin, Peter Richtárik (AI4CE) · 28 Jun 2023
  • Stochastic Coordinate Minimization with Progressive Precision for Stochastic Convex Optimization
    Sudeep Salgia, Qing Zhao, Sattar Vakili · 11 Mar 2020
  • Online and Batch Supervised Background Estimation via L1 Regression
    Aritra Dutta, Peter Richtárik · 23 Nov 2017
  • Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization
    Ahmet Alacaoglu, Quoc Tran-Dinh, Olivier Fercoq, Volkan Cevher · 09 Nov 2017
  • Stochastic, Distributed and Federated Optimization for Machine Learning
    Jakub Konecný (FedML) · 04 Jul 2017
  • An optimal randomized incremental gradient method
    Guanghui Lan, Yi Zhou · 08 Jul 2015
  • Semi-Stochastic Gradient Descent Methods
    Jakub Konecný, Peter Richtárik (ODL) · 05 Dec 2013