If dropout limits trainable depth, does critical initialisation still matter? A large-scale statistical analysis on ReLU networks

Pattern Recognition Letters (PR), 2019
13 October 2019
Arnu Pretorius, Elan Van Biljon, Benjamin van Niekerk, Ryan Eloff, Matthew Reynard, Steven D. James, Benjamin Rosman, Herman Kamper, Steve Kroon

Papers citing "If dropout limits trainable depth, does critical initialisation still matter? A large-scale statistical analysis on ReLU networks"

2 of 2 citing papers shown
Procedural Content Generation using Neuroevolution and Novelty Search for Diverse Video Game Levels
Annual Conference on Genetic and Evolutionary Computation (GECCO), 2022
Michael Beukman, C. Cleghorn, Steven D. James
14 Apr 2022
On the expected behaviour of noise regularised deep neural networks as Gaussian processes
Pattern Recognition Letters (PR), 2019
Arnu Pretorius, Herman Kamper, Steve Kroon
12 Oct 2019