ResearchTrend.AI
Wide stable neural networks: Sample regularity, functional convergence and Bayesian inverse problems
Tomás Soto
arXiv:2407.03909, 4 July 2024

Papers citing "Wide stable neural networks: Sample regularity, functional convergence and Bayesian inverse problems"

3 papers shown

1. Deep neural networks with dependent weights: Gaussian process mixture limit, heavy tails, sparsity and compressibility
   Hoileong Lee, Fadhel Ayed, Paul Jung, Juho Lee, Hongseok Yang, François Caron
   17 May 2022

2. Random tree Besov priors -- Towards fractal imaging
   Hanne Kekkonen, Matti Lassas, E. Saksman, S. Siltanen
   28 Feb 2021

3. Non-asymptotic approximations of neural networks by Gaussian processes
   Ronen Eldan, Dan Mikulincer, T. Schramm
   17 Feb 2021