ResearchTrend.AI
Gaussian Pre-Activations in Neural Networks: Myth or Reality?
arXiv: 2205.12379
24 May 2022
Pierre Wolinski
Julyan Arbel
AI4CE

Papers citing "Gaussian Pre-Activations in Neural Networks: Myth or Reality?"

6 / 6 papers shown
VeLU: Variance-enhanced Learning Unit for Deep Neural Networks
  Ashkan Shakarami, Yousef Yeganeh, Azade Farshad, Lorenzo Nicolè, Stefano Ghidoni, Nassir Navab
  21 Apr 2025
Streamlining Prediction in Bayesian Deep Learning
  Rui Li, Marcus Klasson, Arno Solin, Martin Trapp
  UQCV, BDL
  27 Nov 2024
Commutative Width and Depth Scaling in Deep Neural Networks
  Soufiane Hayou
  02 Oct 2023
A Primer on Bayesian Neural Networks: Review and Debates
  Federico Danieli, Konstantinos Pitas, M. Vladimirova, Vincent Fortuin
  BDL, AAML
  28 Sep 2023
Width and Depth Limits Commute in Residual Networks
  Soufiane Hayou, Greg Yang
  01 Feb 2023
Bayesian neural network unit priors and generalized Weibull-tail property
  M. Vladimirova, Julyan Arbel, Stéphane Girard
  BDL
  06 Oct 2021