Stochastic Neural Networks with Infinite Width are Deterministic

30 January 2022
Liu Ziyin
Hanlin Zhang
Xiangming Meng
Yuting Lu
Eric P. Xing
Masahito Ueda
arXiv: 2201.12724

Papers citing "Stochastic Neural Networks with Infinite Width are Deterministic"

6 / 6 papers shown
1. On the Convergence Analysis of Over-Parameterized Variational Autoencoders: A Neural Tangent Kernel Perspective
   Li Wang, Wei Huang · DRL · 09 Sep 2024
2. Posterior Collapse of a Linear Latent Variable Model
   Zihao W. Wang, Liu Ziyin · BDL · 09 May 2022
3. Exact Solutions of a Deep Linear Network
   Liu Ziyin, Botao Li, Xiangming Meng · ODL · 10 Feb 2022
4. Dropout: Explicit Forms and Capacity Control
   R. Arora, Peter L. Bartlett, Poorya Mianjy, Nathan Srebro · 06 Mar 2020
5. Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam
   Mohammad Emtiyaz Khan, Didrik Nielsen, Voot Tangkaratt, Wu Lin, Y. Gal, Akash Srivastava · ODL · 13 Jun 2018
6. Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
   Y. Gal, Zoubin Ghahramani · UQCV, BDL · 06 Jun 2015