Guide your favorite protein sequence generative model

7 May 2025
Junhao Xiong
Hunter Nisonoff
Ishan Gaur
Jennifer Listgarten
Abstract

Generative machine learning models have begun to transform protein engineering, yet no principled framework exists for conditioning on auxiliary information in a plug-and-play manner. One may want to iteratively incorporate experimental feedback, or to use an existing classifier, such as one predicting enzyme commission number, to guide the sampling of a generative model toward sequences with desired properties. Herein, we present ProteinGuide, a rigorous and general framework to achieve just that: by unifying a broad class of protein generative models, including masked language, (order-agnostic) autoregressive, diffusion, and flow-matching models, we provide an approach to statistically condition pre-trained protein generative models. We demonstrate the applicability of our approach by guiding each of two commonly used protein generative models, ProteinMPNN and ESM3, to generate amino acid and structure token sequences conditioned on several user-specified properties, namely enhanced stability and CATH-labeled fold generation.
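
To give a feel for the general idea of guiding a pre-trained discrete generative model with a classifier, here is a minimal sketch in Python. It is not the paper's algorithm or API: the functions toy_model_logits and toy_classifier_log_prob, the guidance_weight parameter, and the unmasking scheme are all hypothetical stand-ins, chosen only to illustrate how per-position proposals from a masked, order-agnostic model might be reweighted by a property predictor during sampling.

# Minimal sketch (assumptions noted in comments): classifier-guided sampling
# from a masked, order-agnostic sequence model. toy_model_logits and
# toy_classifier_log_prob are hypothetical stand-ins, not the paper's code.
import numpy as np

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")
V = len(AMINO_ACIDS)
rng = np.random.default_rng(0)

def toy_model_logits(seq, pos):
    """Stand-in for a pretrained generative model's logits at position pos."""
    return rng.normal(size=V)

def toy_classifier_log_prob(seq):
    """Stand-in for log p(property | sequence), e.g. a stability predictor."""
    return 0.5 * seq.count("A")  # toy preference: reward alanine content

def guided_sample(length=10, guidance_weight=1.0):
    seq = ["_"] * length                 # "_" marks a masked position
    order = rng.permutation(length)      # order-agnostic unmasking order
    for pos in order:
        logits = toy_model_logits(seq, pos)
        guided = np.empty(V)
        for k, aa in enumerate(AMINO_ACIDS):
            candidate = seq.copy()
            candidate[pos] = aa
            # Reweight the model's proposal by the classifier's log-probability.
            guided[k] = logits[k] + guidance_weight * toy_classifier_log_prob(candidate)
        probs = np.exp(guided - guided.max())
        probs /= probs.sum()
        seq[pos] = AMINO_ACIDS[rng.choice(V, p=probs)]
    return "".join(seq)

print(guided_sample())

In this toy setting, raising guidance_weight shifts samples toward sequences the stand-in classifier scores highly while still drawing proposals from the base model's distribution; the paper develops the statistically principled version of this kind of conditioning for the model classes listed above.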

View on arXiv: https://arxiv.org/abs/2505.04823
@article{xiong2025_2505.04823,
  title={Guide your favorite protein sequence generative model},
  author={Junhao Xiong and Hunter Nisonoff and Ishan Gaur and Jennifer Listgarten},
  journal={arXiv preprint arXiv:2505.04823},
  year={2025}
}