Asymptotic convergence rate of Dropout on shallow linear neural networks
Albert Senen-Cerda, J. Sanders
Measurement and Modeling of Computer Systems (SIGMETRICS), 2020
1 December 2020
arXiv:2012.01978

Papers citing "Asymptotic convergence rate of Dropout on shallow linear neural networks"

Singular-limit analysis of gradient descent with noise injection
Anna Shalova, André Schlichting, M. Peletier
18 Apr 2024
On the Convergence of Shallow Neural Network Training with Randomly Masked Neurons
Fangshuo Liao, Anastasios Kyrillidis
05 Dec 2021
Masked Training of Neural Networks with Partial Gradients
Amirkeivan Mohtashami, Martin Jaggi, Sebastian U. Stich
16 Jun 2021
Almost Sure Convergence of Dropout Algorithms for Neural Networks
Albert Senen-Cerda, J. Sanders
06 Feb 2020