ResearchTrend.AI
Testing Stationarity Concepts for ReLU Networks: Hardness, Regularity, and Robust Algorithms

23 February 2023
Lai Tian, Anthony Man-Cho So

Papers citing "Testing Stationarity Concepts for ReLU Networks: Hardness, Regularity, and Robust Algorithms"

3 papers shown
On the Complexity of Finding Small Subgradients in Nonsmooth Optimization
Guy Kornowski, Ohad Shamir
21 Sep 2022
Gradient-Free Methods for Deterministic and Stochastic Nonsmooth Nonconvex Optimization
Tianyi Lin, Zeyu Zheng, Michael I. Jordan
12 Sep 2022
On the Effective Number of Linear Regions in Shallow Univariate ReLU Networks: Convergence Guarantees and Implicit Bias
Itay Safran, Gal Vardi, Jason D. Lee
18 May 2022