Large neural networks excel at prediction tasks, but their application to design problems, such as protein engineering or materials discovery, requires solving offline model-based optimization (MBO) problems. Because accurate predictive models alone do not directly translate into effective designs, recent MBO algorithms incorporate reinforcement learning and generative modeling techniques. Meanwhile, theoretical work suggests that exploiting the structure of the target function can improve MBO performance. We present Cliqueformer, a transformer-based architecture that learns the structure of the black-box function in the form of a functional graphical model (FGM), addressing distribution shift without relying on explicit conservative methods. Across a range of domains, including chemical and genetic design tasks, Cliqueformer outperforms existing methods.
@article{kuba2025_2410.13106,
  title   = {Cliqueformer: Model-Based Optimization with Structured Transformers},
  author  = {Jakub Grudzien Kuba and Pieter Abbeel and Sergey Levine},
  journal = {arXiv preprint arXiv:2410.13106},
  year    = {2025}
}