Coordinate Descent with Arbitrary Sampling II: Expected Separable Overapproximation

27 December 2014
Zheng Qu
Peter Richtárik
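
For context, the expected separable overapproximation (ESO) assumption that the paper develops can be sketched as follows. This is a recollection of the standard form from the Qu-Richtárik arbitrary-sampling series, not text from this page: a smooth function f admits an ESO with respect to a random sampling Ŝ, with parameters v_1, ..., v_n > 0, if for all x and h

    % Sketch of the ESO inequality (recalled from the paper series, not quoted from this page)
    \mathbb{E}\big[ f(x + h_{[\hat{S}]}) \big] \le f(x)
      + \frac{\mathbb{E}|\hat{S}|}{n}
        \Big( \langle \nabla f(x), h \rangle + \tfrac{1}{2} \|h\|_v^2 \Big),
    \qquad \|h\|_v^2 = \sum_{i=1}^n v_i \|h^{(i)}\|^2,

where h_{[\hat{S}]} denotes the vector that keeps the blocks of h indexed by \hat{S} and zeroes out the rest.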

Papers citing "Coordinate Descent with Arbitrary Sampling II: Expected Separable Overapproximation"

13 / 13 papers shown

Laplacian-based Semi-Supervised Learning in Multilayer Hypergraphs by Coordinate Descent
Sara Venturini
Andrea Cristofari
Francesco Rinaldi
Francesco Tudisco
28 Jan 2023

GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity
Artavazd Maranjyan
M. Safaryan
Peter Richtárik
28 Oct 2022

Optimization for Supervised Machine Learning: Randomized Algorithms for Data and Parameters
Filip Hanzely
26 Aug 2020

Precise expressions for random projections: Low-rank approximation and randomized Newton
Michal Derezinski
Feynman T. Liang
Zhenyu A. Liao
Michael W. Mahoney
18 Jun 2020

Convergence Analysis of Block Coordinate Algorithms with Determinantal Sampling
Mojmír Mutný
Michal Derezinski
Andreas Krause
25 Oct 2019

99% of Distributed Optimization is a Waste of Time: The Issue and How to Fix it
Konstantin Mishchenko
Filip Hanzely
Peter Richtárik
27 Jan 2019

SAGA with Arbitrary Sampling
Xun Qian
Zheng Qu
Peter Richtárik
24 Jan 2019

SEGA: Variance Reduction via Gradient Sketching
Filip Hanzely
Konstantin Mishchenko
Peter Richtárik
09 Sep 2018

Momentum and Stochastic Momentum for Stochastic Gradient, Newton, Proximal Point and Subspace Descent Methods
Nicolas Loizou
Peter Richtárik
27 Dec 2017

A Primer on Coordinate Descent Algorithms
Hao-Jun Michael Shi
Shenyinying Tu
Yangyang Xu
W. Yin
30 Sep 2016

Importance Sampling for Minibatches
Dominik Csiba
Peter Richtárik
06 Feb 2016

Stochastic Dual Coordinate Ascent with Adaptive Probabilities
Dominik Csiba
Zheng Qu
Peter Richtárik
27 Feb 2015

Parallel and Distributed Block-Coordinate Frank-Wolfe Algorithms
Yu Wang
Veeranjaneyulu Sadhanala
Wei-Ming Dai
Willie Neiswanger
S. Sra
Eric Xing
22 Sep 2014