ResearchTrend.AI

arXiv: 1611.01957

Linear Convergence of SVRG in Statistical Estimation

7 November 2016
Chao Qu
Yan Li
Abstract

SVRG and its variants are among the state-of-the-art optimization algorithms for large-scale machine learning problems. It is well known that SVRG converges linearly when the objective function is strongly convex. However, this setting excludes several important formulations, such as Lasso, group Lasso, and logistic regression. In this paper, we prove that for a class of statistical M-estimators where strong convexity does not hold, SVRG solves the formulation with a linear convergence rate. Our analysis makes use of restricted strong convexity, under which we show that SVRG converges linearly up to the fundamental statistical precision of the model, i.e., the difference between the true unknown parameter θ* and the optimal solution θ̂ of the model. This improves on previous convergence analyses for the non-strongly convex setting, which achieve only a sub-linear convergence rate.
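To make the setting concrete, a minimal sketch of proximal SVRG applied to Lasso (one of the non-strongly convex formulations the abstract mentions) is shown below. This is an illustrative implementation under assumed defaults, not the paper's exact algorithm or parameters: the step size, epoch count, and inner-loop length are hypothetical choices, and the proximal step is soft-thresholding for the ℓ1 penalty.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_svrg_lasso(A, b, lam, eta=0.01, epochs=20, seed=0):
    """Proximal SVRG for min_theta (1/2n)||A theta - b||^2 + lam * ||theta||_1.

    Illustrative sketch: eta, epochs, and the 2n inner-loop length are
    assumed defaults, not tuned or taken from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    theta = np.zeros(d)
    for _ in range(epochs):
        # Snapshot point: full gradient of the smooth part.
        theta_tilde = theta.copy()
        full_grad = A.T @ (A @ theta_tilde - b) / n
        x = theta_tilde.copy()
        for _ in range(2 * n):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient:
            # grad_i(x) - grad_i(theta_tilde) + full gradient.
            g = (A[i] * (A[i] @ x - b[i])
                 - A[i] * (A[i] @ theta_tilde - b[i])
                 + full_grad)
            # Proximal (soft-thresholding) step handles the l1 term.
            x = soft_threshold(x - eta * g, eta * lam)
        theta = x
    return theta
```

The variance-reduced gradient is unbiased and its variance shrinks as the iterates approach the snapshot, which is what enables the linear rate (up to statistical precision) that the paper establishes under restricted strong convexity.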
