Sampling weights of deep neural networks (arXiv:2306.16830)
Neural Information Processing Systems (NeurIPS), 2023
29 June 2023
Iryna Burak, Erik Lien Bolager, Chinmay Datar, Q. Sun, Felix Dietrich
Communities: BDL, UQCV

Papers citing "Sampling weights of deep neural networks"

10 papers shown:

1. Random Feature Spiking Neural Networks
   Maximilian Gollwitzer, Felix Dietrich. 01 Oct 2025.
2. Hyper Diffusion Avatars: Dynamic Human Avatar Generation using Network Weight Space Diffusion
   Dongliang Cao, Guoxing Sun, Marc Habermann, Florian Bernard. 04 Sep 2025.
3. Rapid training of Hamiltonian graph networks using random features
   Atamert Rahma, Chinmay Datar, Ana Cukarska, Felix Dietrich. 06 Jun 2025. (AI4CE)
4. Training Hamiltonian neural networks without backpropagation
   Atamert Rahma, Chinmay Datar, Felix Dietrich. 26 Nov 2024.
5. Generative Feature Training of Thin 2-Layer Networks
   J. Hertrich, Sebastian Neumayer. 11 Nov 2024. (GAN)
6. Fast training of accurate physics-informed neural networks without gradient descent
   Chinmay Datar, Taniya Kapoor, Abhishek Chandra, Q. Sun, Erik Lien Bolager, Iryna Burak, Anna Veselovska, Massimo Fornasier, Felix Dietrich. 31 May 2024.
7. Multifidelity linear regression for scientific machine learning from scarce data
   Foundations of Data Science (FDS), 2024. Elizabeth Qian, Dayoung Kang, Vignesh Sella, Anirban Chaudhuri. 13 Mar 2024. (AI4CE)
8. Leveraging Continuously Differentiable Activation Functions for Learning in Quantized Noisy Environments
   Vivswan Shah, Nathan Youngblood. 04 Feb 2024.
9. Deep Learning in Deterministic Computational Mechanics
   L. Herrmann, Stefan Kollmannsberger. 27 Sep 2023. (AI4CE, PINN)
10. Xception: Deep Learning with Depthwise Separable Convolutions
    Computer Vision and Pattern Recognition (CVPR), 2016. François Chollet. 07 Oct 2016. (MDE, BDL, PINN)