Escape saddle points by a simple gradient-descent based algorithm
Chenyi Zhang, Tongyang Li
28 November 2021 · arXiv 2111.14069 · ODL

Papers citing "Escape saddle points by a simple gradient-descent based algorithm"

4 papers shown:
Comparisons Are All You Need for Optimizing Smooth Functions
Chenyi Zhang, Tongyang Li · AAML · 19 May 2024
Neural Langevin Dynamics: towards interpretable Neural Stochastic Differential Equations
Simon Koop, M. Peletier, J. Portegies, Vlado Menkovski · DiffM · 17 Nov 2022
Zeroth-Order Negative Curvature Finding: Escaping Saddle Points without Gradients
Hualin Zhang, Huan Xiong, Bin Gu · 04 Oct 2022
On Quantum Speedups for Nonconvex Optimization via Quantum Tunneling Walks
Yizhou Liu, Weijie J. Su, Tongyang Li · 29 Sep 2022