
AutoInit: Analytic Signal-Preserving Weight Initialization for Neural Networks
arXiv:2109.08958
18 September 2021
G. Bingham, Risto Miikkulainen
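The listing above gives no method details, but the general idea behind signal-preserving initialization can be sketched: scale each layer's weight variance by 1/fan_in so that activations keep roughly unit variance as depth grows, rather than exploding or vanishing. The snippet below is a minimal generic illustration of that principle in NumPy, not the paper's AutoInit algorithm (which derives the scaling analytically per layer type).

```python
import numpy as np

def signal_preserving_init(fan_in, fan_out, rng):
    # Scale weight variance by 1/fan_in so that, for roughly unit-variance
    # inputs, the pre-activations also have roughly unit variance.
    return rng.normal(0.0, np.sqrt(1.0 / fan_in), size=(fan_in, fan_out))

rng = np.random.default_rng(0)
x = rng.normal(size=(1024, 256))   # unit-variance input batch
for _ in range(10):                # ten linear layers deep
    W = signal_preserving_init(x.shape[1], 256, rng)
    x = x @ W
print(f"variance after 10 layers: {x.var():.2f}")  # stays near 1 instead of drifting
```

With naive unit-variance weights instead, the activation variance would be multiplied by fan_in (here 256) at every layer and overflow within a few layers.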

Papers citing "AutoInit: Analytic Signal-Preserving Weight Initialization for Neural Networks" (5 of 5 papers shown):
  • Initialization of Large Language Models via Reparameterization to Mitigate Loss Spikes
    Kosuke Nishida, Kyosuke Nishida, Kuniko Saito (07 Oct 2024)
  • Efficient Activation Function Optimization through Surrogate Modeling
    G. Bingham, Risto Miikkulainen (13 Jan 2023)
  • ResNet strikes back: An improved training procedure in timm
    Ross Wightman, Hugo Touvron, Hervé Jégou (01 Oct 2021)
  • Improving Autoencoder Training Performance for Hyperspectral Unmixing with Network Reinitialisation
    Kamil Książek, P. Głomb, M. Romaszewski, M. Cholewa, B. Grabowski, Krisztián Búza (28 Sep 2021)
  • Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
    Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington (14 Jun 2018)