ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Sparsified Model Zoo Twins: Investigating Populations of Sparsified Neural Network Models

arXiv:2304.13718 · 26 April 2023

D. Honegger, Konstantin Schürholt, Damian Borth

Papers citing "Sparsified Model Zoo Twins: Investigating Populations of Sparsified Neural Network Models"

10 / 10 papers shown
 1. A Model Zoo of Vision Transformers
    Damian Falk, Léo Meynent, Florence Pfammatter, Konstantin Schürholt, Damian Borth
    14 Apr 2025 · 29 / 0 / 0

 2. Model Zoos: A Dataset of Diverse Populations of Neural Network Models
    Konstantin Schürholt, Diyar Taskiran, Boris Knyazev, Xavier Giró-i-Nieto, Damian Borth
    29 Sep 2022 · 41 / 29 / 0

 3. Hyper-Representations as Generative Models: Sampling Unseen Neural Network Weights
    Konstantin Schürholt, Boris Knyazev, Xavier Giró-i-Nieto, Damian Borth
    29 Sep 2022 · 48 / 38 / 0

 4. Learning to Learn with Generative Models of Neural Network Checkpoints
    William S. Peebles, Ilija Radosavovic, Tim Brooks, Alexei A. Efros, Jitendra Malik
    UQCV · 26 Sep 2022 · 64 / 64 / 0

 5. Git Re-Basin: Merging Models modulo Permutation Symmetries
    Samuel K. Ainsworth, J. Hayase, S. Srinivasa
    MoMe · 11 Sep 2022 · 239 / 313 / 0

 6. Compact and Optimal Deep Learning with Recurrent Parameter Generators
    Jiayun Wang, Yubei Chen, Stella X. Yu, Brian Cheung, Yann LeCun
    BDL · 15 Jul 2021 · 24 / 4 / 0

 7. Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
    Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
    MQ · 31 Jan 2021 · 128 / 679 / 0

 8. What is the State of Neural Network Pruning?
    Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
    06 Mar 2020 · 172 / 1,018 / 0

 9. Scaling Laws for Neural Language Models
    Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
    23 Jan 2020 · 220 / 3,054 / 0

10. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
    Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
    3DH · 17 Apr 2017 · 948 / 20,214 / 0