ResearchTrend.AI
A Generative Model for Sampling High-Performance and Diverse Weights for Neural Networks

arXiv:1905.02898 · 7 May 2019
Lior Deutsch, Erik Nijkamp, Yu Yang

Papers citing "A Generative Model for Sampling High-Performance and Diverse Weights for Neural Networks" (8 of 8 shown):

1. Individualised Treatment Effects Estimation with Composite Treatments and Composite Outcomes
   V. Chauhan, Lei A. Clifton, Gaurav Nigam, David Clifton · 12 Feb 2025 · CML

2. Principled Weight Initialization for Hypernetworks
   Oscar Chang, Lampros Flokas, Hod Lipson · 13 Dec 2023

3. Dynamic Inter-treatment Information Sharing for Individualized Treatment Effects Estimation
   V. Chauhan, Jiandong Zhou, Ghadeer O. Ghosheh, Soheila Molaei, David Clifton · 25 May 2023

4. Permutation Equivariant Neural Functionals
   Allan Zhou, Kaien Yang, Kaylee Burns, Adriano Cardace, Yiding Jiang, Samuel Sokota, J. Zico Kolter, Chelsea Finn · 27 Feb 2023

5. Learning to Learn with Generative Models of Neural Network Checkpoints
   William S. Peebles, Ilija Radosavovic, Tim Brooks, Alexei A. Efros, Jitendra Malik · 26 Sep 2022 · UQCV

6. Neural Architecture Search with Reinforcement Learning
   Barret Zoph, Quoc V. Le · 05 Nov 2016

7. Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
   Y. Gal, Zoubin Ghahramani · 06 Jun 2015 · UQCV, BDL

8. The Loss Surfaces of Multilayer Networks
   A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun · 30 Nov 2014 · ODL