ResearchTrend.AI
PABBO: Preferential Amortized Black-Box Optimization

2 March 2025
Xinyu Zhang
Daolang Huang
Samuel Kaski
Julien Martinelli
Abstract

Preferential Bayesian Optimization (PBO) is a sample-efficient method for learning latent user utilities from preferential feedback over pairs of designs. It relies on a statistical surrogate model for the latent function, usually a Gaussian process, and on an acquisition strategy to select the next candidate pair on which to elicit user feedback. Because the associated likelihood is non-conjugate, every PBO step requires substantial computation with various approximate inference techniques. This computational overhead is incompatible with the pace at which humans interact with computers, hindering the use of PBO in real-world settings. Building on recent advances in amortized BO, we propose to circumvent this issue by fully amortizing PBO, meta-learning both the surrogate and the acquisition function. Our method comprises a novel transformer neural process architecture, trained using reinforcement learning and tailored auxiliary losses. On a benchmark composed of synthetic and real-world datasets, our method is several orders of magnitude faster than the usual Gaussian process-based strategies and often outperforms them in accuracy.
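To make the preferential-feedback loop from the abstract concrete, here is a minimal toy sketch: the optimizer never observes the latent utility directly, only which design in a proposed pair the user prefers. The utility function, candidate pool, and random pair selection below are all hypothetical stand-ins; the paper's actual method replaces the random acquisition with a meta-learned transformer neural process.

```python
import numpy as np

rng = np.random.default_rng(0)

def latent_utility(x):
    # Hidden function the user implicitly optimizes (assumed, for illustration).
    return -(x - 0.3) ** 2

def preference(x1, x2):
    # Pairwise user feedback: returns whichever design has higher latent utility.
    return x1 if latent_utility(x1) >= latent_utility(x2) else x2

# Candidate designs on [0, 1]; the true optimum sits at x = 0.3.
candidates = rng.uniform(0.0, 1.0, size=200)

# Preferential optimization loop: keep the winner of each comparison.
incumbent = candidates[0]
for _ in range(50):
    challenger = rng.choice(candidates)  # stand-in for a surrogate-driven acquisition rule
    incumbent = preference(incumbent, challenger)

print(incumbent)  # incumbent ends up near the optimum at x = 0.3
```

In a real PBO system the `preference` oracle is a human and noisy (e.g. a Bradley-Terry likelihood), which is precisely why a surrogate over the latent utility and a principled acquisition rule, rather than random challengers, are needed.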

@article{zhang2025_2503.00924,
  title={PABBO: Preferential Amortized Black-Box Optimization},
  author={Xinyu Zhang and Daolang Huang and Samuel Kaski and Julien Martinelli},
  journal={arXiv preprint arXiv:2503.00924},
  year={2025}
}