ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.
Improved Optimization of Finite Sums with Minibatch Stochastic Variance Reduced Proximal Iterations

21 June 2017
Jialei Wang
Tong Zhang

Papers citing "Improved Optimization of Finite Sums with Minibatch Stochastic Variance Reduced Proximal Iterations"

5 of 5 citing papers shown.
1. Distributed Bootstrap for Simultaneous Inference Under High Dimensionality
   Yang Yu, Shih-Kang Chao, Guang Cheng · FedML · 40 / 10 / 0 · 19 Feb 2021

2. First-order Newton-type Estimator for Distributed Estimation and Inference
   Xi Chen, Weidong Liu, Yichen Zhang · 37 / 48 / 0 · 28 Nov 2018

3. Gradient Sparsification for Communication-Efficient Distributed Optimization
   Jianqiao Wangni, Jialei Wang, Ji Liu, Tong Zhang · 15 / 522 / 0 · 26 Oct 2017

4. A Proximal Stochastic Gradient Method with Progressive Variance Reduction
   Lin Xiao, Tong Zhang · ODL · 93 / 737 / 0 · 19 Mar 2014

5. Optimal Distributed Online Prediction using Mini-Batches
   O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao · 186 / 683 / 0 · 07 Dec 2010