ResearchTrend.AI
Stochastic Nonsmooth Convex Optimization with Heavy-Tailed Noises: High-Probability Bound, In-Expectation Rate and Initial Distance Adaptation

22 March 2023
Zijian Liu, Zhengyuan Zhou

Papers citing "Stochastic Nonsmooth Convex Optimization with Heavy-Tailed Noises: High-Probability Bound, In-Expectation Rate and Initial Distance Adaptation"

3 papers shown
Limit Theorems for Stochastic Gradient Descent with Infinite Variance
Jose H. Blanchet, Aleksandar Mijatović, Wenhao Yang
21 Oct 2024
From Gradient Clipping to Normalization for Heavy Tailed SGD
Florian Hübler, Ilyas Fatkhullin, Niao He
17 Oct 2024
Differential Private Stochastic Optimization with Heavy-tailed Data: Towards Optimal Rates
Puning Zhao, Jiafei Wu, Zhe Liu, Chong Wang, Rongfei Fan, Qingming Li
19 Aug 2024