ResearchTrend.AI
Sub-Optimal Local Minima Exist for Neural Networks with Almost All Non-Linear Activations
arXiv:1911.01413, 4 November 2019
Tian Ding, Dawei Li
Papers citing "Sub-Optimal Local Minima Exist for Neural Networks with Almost All Non-Linear Activations" (8 papers)
When Expressivity Meets Trainability: Fewer than $n$ Neurons Can Work
Neural Information Processing Systems (NeurIPS), 2022
Jiawei Zhang, Yushun Zhang, Mingyi Hong, Tian Ding, Jianfeng Yao
21 Oct 2022
On the Omnipresence of Spurious Local Minima in Certain Neural Network Training Problems
Constructive Approximation (Constr. Approx.), 2022
C. Christof, Julia Kowalczyk
23 Feb 2022
Exponentially Many Local Minima in Quantum Neural Networks
Xuchen You, Xiaodi Wu
06 Oct 2021
How to Inject Backdoors with Better Consistency: Logit Anchoring on Clean Data
Zhiyuan Zhang, Lingjuan Lyu, Weiqiang Wang, Lichao Sun, Xu Sun
03 Sep 2021
Towards a Better Global Loss Landscape of GANs
Tian Ding, Tiantian Fang, Alex Schwing
10 Nov 2020
The Global Landscape of Neural Networks: An Overview
Tian Ding, Dawei Li, Shiyu Liang, R. Srikant
02 Jul 2020
The critical locus of overparameterized neural networks
Y. Cooper
08 May 2020
Critical Point-Finding Methods Reveal Gradient-Flat Regions of Deep Network Losses
Neural Computation (Neural Comput.), 2020
Charles G. Frye, James B. Simon, Neha S. Wadia, A. Ligeralde, M. DeWeese, K. Bouchard
23 Mar 2020