The Impact of Model Zoo Size and Composition on Weight Space Learning

14 April 2025
Damian Falk
Konstantin Schürholt
Damian Borth
Abstract

Re-using trained neural network models is a common strategy to reduce training cost and transfer knowledge. Weight space learning, which uses the weights of trained models as a data modality, is a promising new field for re-using populations of pre-trained models for future tasks. Approaches in this field have demonstrated high performance on both model analysis and weight generation tasks. Until now, however, their learning setup has required homogeneous model zoos in which all models share the exact same architecture, limiting their ability to generalize beyond the population of models seen during training. In this work, we remove this constraint and propose a modification to a common weight space learning method that accommodates training on heterogeneous populations of models. We further investigate the impact of model diversity on generating unseen neural network weights for zero-shot knowledge transfer. Our extensive experimental evaluation shows that including models trained on varying underlying image datasets has a strong impact on performance and generalization, in both in-distribution and out-of-distribution settings. Code is available at this http URL.
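The abstract does not describe the method in detail, so the sketch below is only a rough, hypothetical illustration of what treating "weights as a data modality" over a heterogeneous model zoo can look like: each model's parameters are flattened into fixed-size per-chunk tokens, and a set-style encoder consumes a variable number of tokens per model, so architectures of different sizes can share one encoder. All names and design choices here (WeightEncoder, TOKEN_DIM, the chunking scheme, the transformer encoder) are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of weight space learning over a heterogeneous model zoo.
# Not the paper's method; token size, chunking, and the encoder are assumptions.
import torch
import torch.nn as nn

TOKEN_DIM = 64  # each weight chunk is zero-padded to this length


def tokenize_weights(model: nn.Module) -> torch.Tensor:
    """Flatten every parameter tensor into fixed-size 'weight tokens'."""
    tokens = []
    for p in model.parameters():
        flat = p.detach().flatten()
        for i in range(0, flat.numel(), TOKEN_DIM):
            chunk = flat[i:i + TOKEN_DIM]
            if chunk.numel() < TOKEN_DIM:  # pad the last chunk
                chunk = torch.cat([chunk, chunk.new_zeros(TOKEN_DIM - chunk.numel())])
            tokens.append(chunk)
    return torch.stack(tokens)  # (num_tokens, TOKEN_DIM), num_tokens varies per model


class WeightEncoder(nn.Module):
    """Encoder over a variable-length sequence of weight tokens."""

    def __init__(self, dim: int = TOKEN_DIM, depth: int = 2, heads: int = 4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, 16)  # e.g. an embedding for downstream analysis

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, num_tokens, TOKEN_DIM); the variable token count is
        # what lets models with different architectures share this encoder.
        h = self.encoder(tokens)
        return self.head(h.mean(dim=1))  # pool over tokens


if __name__ == "__main__":
    # Two "zoo members" with different hidden widths (a heterogeneous zoo).
    zoo = [nn.Sequential(nn.Linear(8, w), nn.ReLU(), nn.Linear(w, 2)) for w in (16, 32)]
    enc = WeightEncoder()
    for m in zoo:
        emb = enc(tokenize_weights(m).unsqueeze(0))
        print(emb.shape)  # torch.Size([1, 16])
```

Under these assumptions, the point of the per-layer/per-chunk tokenization is that the encoder no longer requires a single fixed-length flattened weight vector, which is what restricts homogeneous-zoo approaches to a single architecture.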

@article{falk2025_2504.10141,
  title={The Impact of Model Zoo Size and Composition on Weight Space Learning},
  author={Damian Falk and Konstantin Schürholt and Damian Borth},
  journal={arXiv preprint arXiv:2504.10141},
  year={2025}
}