Zeroth-Order Negative Curvature Finding: Escaping Saddle Points without Gradients

4 October 2022
Hualin Zhang
Huan Xiong
Bin Gu
arXiv: 2210.01496

Papers citing "Zeroth-Order Negative Curvature Finding: Escaping Saddle Points without Gradients"

4 papers
A Historical Trajectory Assisted Optimization Method for Zeroth-Order Federated Learning
Chenlin Wu
Xiaoyu He
Zike Li
Zibin Zheng
FedML
24 Sep 2024
How to Boost Any Loss Function
Richard Nock
Yishay Mansour
02 Jul 2024
Comparisons Are All You Need for Optimizing Smooth Functions
Chenyi Zhang
Tongyang Li
AAML
19 May 2024
Restarted Nonconvex Accelerated Gradient Descent: No More Polylogarithmic Factor in the $O(\epsilon^{-7/4})$ Complexity
Huan Li
Zhouchen Lin
27 Jan 2022