Learning Hyperparameters via a Data-Emphasized Variational Objective

3 February 2025
Ethan Harvey
Mikhail Petrov
Michael C. Hughes
Abstract

When training large flexible models, practitioners often rely on grid search to select hyperparameters that control over-fitting. This grid search has several disadvantages: the search is computationally expensive, requires carving out a validation set that reduces the available data for training, and requires users to specify candidate values. In this paper, we propose an alternative: directly learning regularization hyperparameters on the full training set via the evidence lower bound ("ELBo") objective from variational methods. For deep neural networks with millions of parameters, we recommend a modified ELBo that upweights the influence of the data likelihood relative to the prior. Our proposed technique overcomes all three disadvantages of grid search. In a case study on transfer learning of image classifiers, we show how our method reduces the 88+ hour grid search of past work to under 3 hours while delivering comparable accuracy. We further demonstrate how our approach enables efficient yet accurate approximations of Gaussian processes with learnable length-scale kernels.
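The abstract describes but does not state the objective itself. As a rough sketch (notation ours, not taken from the paper): for model weights w, data D, a prior p(w | λ) governed by regularization hyperparameters λ, and a variational posterior q(w), the standard evidence lower bound and the data-emphasized variant that upweights the likelihood by a constant κ > 1 would look like

  L_ELBo(q, λ) = E_{q(w)}[ log p(D | w) ] − KL( q(w) ‖ p(w | λ) )

  L_DE(q, λ)   = κ · E_{q(w)}[ log p(D | w) ] − KL( q(w) ‖ p(w | λ) ),   κ > 1

with both q and λ optimized jointly by gradient ascent on the full training set. The paper's specific choice of κ and variational family are not reproduced here.

Below is a minimal PyTorch sketch of this general recipe on a toy regression problem: a mean-field Gaussian q(w) and a learnable prior precision alpha (the regularization hyperparameter) are trained together by maximizing the (optionally data-emphasized) ELBo. The model, kappa value, and noise scale are illustrative assumptions, not the authors' setup.

import math
import torch

# Toy regression data (illustrative only)
torch.manual_seed(0)
N, D = 200, 5
X = torch.randn(N, D)
y = X @ torch.randn(D) + 0.1 * torch.randn(N)
noise_var = 0.01  # assumed known likelihood noise for this toy

# Mean-field Gaussian q(w) = N(mu, diag(sigma^2)), plus a learnable
# prior precision alpha for p(w | alpha) = N(0, alpha^{-1} I)
mu = torch.zeros(D, requires_grad=True)
log_sigma = torch.full((D,), -2.0, requires_grad=True)
log_alpha = torch.zeros((), requires_grad=True)

kappa = 1.0  # set > 1 for the data-emphasized variant
opt = torch.optim.Adam([mu, log_sigma, log_alpha], lr=0.05)

for step in range(2000):
    opt.zero_grad()
    sigma, alpha = log_sigma.exp(), log_alpha.exp()
    # Reparameterized sample w ~ q(w) so gradients flow through mu, sigma
    w = mu + sigma * torch.randn(D)
    resid = y - X @ w
    log_lik = -0.5 * (resid.pow(2) / noise_var).sum() \
              - 0.5 * N * math.log(2 * math.pi * noise_var)
    # Closed-form KL( N(mu, sigma^2) || N(0, 1/alpha) ), summed over dims
    kl = 0.5 * (alpha * (sigma.pow(2) + mu.pow(2)) - 1.0
                - (alpha * sigma.pow(2)).log()).sum()
    loss = -(kappa * log_lik - kl)  # maximize the (DE-)ELBo
    loss.backward()
    opt.step()

print("learned prior precision alpha =", log_alpha.exp().item())

Because alpha is learned on the full training set by gradient ascent alongside q, no validation split and no grid of candidate values are needed, which is the core advantage the abstract claims over grid search.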

@article{harvey2025_2502.01861,
  title={Learning Hyperparameters via a Data-Emphasized Variational Objective},
  author={Ethan Harvey and Mikhail Petrov and Michael C. Hughes},
  journal={arXiv preprint arXiv:2502.01861},
  year={2025}
}