ResearchTrend.AI

Bayesian Optimization Meets Self-Distillation
arXiv:2304.12666
25 April 2023
HyunJae Lee
Heon Song
Hyeonsoo Lee
Gi-hyeon Lee
Suyeong Park
Donggeun Yoo
Communities: UQCV, BDL

Papers citing "Bayesian Optimization Meets Self-Distillation"

4 papers shown

1. Improving Multi-fidelity Optimization with a Recurring Learning Rate for Hyperparameter Tuning
   HyunJae Lee, Gihyeon Lee, Junh-Nam Kim, Sungjun Cho, Dohyun Kim, Donggeun Yoo
   26 Sep 2022 · 3 citations

2. SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization
   Marius Lindauer, Katharina Eggensperger, Matthias Feurer, André Biedenkapp, Difan Deng, C. Benjamins, Tim Ruhopf, René Sass, Frank Hutter
   20 Sep 2021 · 326 citations

3. There Are Many Consistent Explanations of Unlabeled Data: Why You Should Average
   Ben Athiwaratkun, Marc Finzi, Pavel Izmailov, A. Wilson
   14 Jun 2018 · 243 citations

4. ImageNet Large Scale Visual Recognition Challenge
   Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
   Communities: VLM, ObjD
   01 Sep 2014 · 39,194 citations