ResearchTrend.AI

Towards Understanding the Importance of Noise in Training Neural Networks

7 September 2019
Mo Zhou, Tianyi Liu, Yan Li, Dachao Lin, Enlu Zhou, T. Zhao
MLT

Papers citing "Towards Understanding the Importance of Noise in Training Neural Networks"

10 / 10 papers shown

Beyond Single-Model Views for Deep Learning: Optimization versus Generalizability of Stochastic Optimization Algorithms
Toki Tahmid Inan, Mingrui Liu, Amarda Shehu
01 Mar 2024

Learning Enriched Illuminants for Cross and Single Sensor Color Constancy
Xiaodong Cun, Zhendong Wang, Chi-Man Pun, Jian-zhuo Liu, Wen-gang Zhou, Xu Jia, Houqiang Li
21 Mar 2022

Noise Regularizes Over-parameterized Rank One Matrix Recovery, Provably
Tianyi Liu, Yan Li, Enlu Zhou, Tuo Zhao
07 Feb 2022

PAGE-PG: A Simple and Loopless Variance-Reduced Policy Gradient Method with Probabilistic Gradient Estimation
Matilde Gargiani, Andrea Zanelli, Andrea Martinelli, Tyler H. Summers, John Lygeros
01 Feb 2022

Noisy Gradient Descent Converges to Flat Minima for Nonconvex Matrix Factorization
Tianyi Liu, Yan Li, S. Wei, Enlu Zhou, T. Zhao
24 Feb 2021

Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting
Zeke Xie, Fengxiang He, Shaopeng Fu, Issei Sato, Dacheng Tao, Masashi Sugiyama
12 Nov 2020

Review: Deep Learning in Electron Microscopy
Jeffrey M. Ede
17 Sep 2020

RIFLE: Backpropagation in Depth for Deep Transfer Learning through Re-Initializing the Fully-connected LayEr
Xingjian Li, Haoyi Xiong, Haozhe An, Chengzhong Xu, Dejing Dou
ODL
07 Jul 2020

Towards Understanding the Importance of Shortcut Connections in Residual Networks
Tianyi Liu, Minshuo Chen, Mo Zhou, S. Du, Enlu Zhou, T. Zhao
10 Sep 2019

Neural Proximal/Trust Region Policy Optimization Attains Globally Optimal Policy
Boyi Liu, Qi Cai, Zhuoran Yang, Zhaoran Wang
25 Jun 2019