
Pseudodata-guided Invariant Representation Learning Boosts the Out-of-Distribution Generalization in Enzymatic Kinetic Parameter Prediction

Haomin Wu
Zhiwei Nie
Hongyu Zhang
Zhixiang Ren
Main: 13 pages, 1 figure, 6 tables; Bibliography: 3 pages; Appendix: 4 pages
Abstract

Accurate prediction of enzyme kinetic parameters is essential for understanding catalytic mechanisms and guiding enzyme engineering. However, existing deep learning-based enzyme-substrate interaction (ESI) predictors often exhibit performance degradation on sequence-divergent, out-of-distribution (OOD) cases, limiting robustness under biologically relevant conditions. We propose O^2DENet, a lightweight, plug-and-play module that enhances OOD generalization via biologically and chemically informed perturbation augmentation and invariant representation learning. O^2DENet introduces enzyme-substrate perturbations and enforces consistency between original and augmented enzyme-substrate-pair representations to encourage invariance to distributional shifts. When integrated with representative ESI models, O^2DENet consistently improves predictive performance for both k_cat and K_m across stringent sequence-identity-based OOD benchmarks, achieving state-of-the-art results among the evaluated methods in terms of accuracy and robustness. Overall, O^2DENet provides a general and effective strategy to enhance the stability and deployability of data-driven enzyme kinetics predictors for real-world enzyme engineering applications.
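The abstract describes the approach only at a high level. As an illustration of the general idea of consistency-based invariant representation learning (not the authors' actual O^2DENet implementation), here is a minimal PyTorch-style sketch; the `encode`/`head` model interface, the `perturb_enzyme`/`perturb_substrate` augmentation functions, and the weighting hyperparameter are assumptions introduced for this example.

```python
import torch
import torch.nn.functional as F

def consistency_regularized_loss(model, enzyme, substrate, target,
                                 perturb_enzyme, perturb_substrate,
                                 lambda_consistency=0.1):
    """Task loss plus a consistency penalty tying the representation of an
    original enzyme-substrate pair to that of its perturbed counterpart.

    Assumes (hypothetically) that `model.encode(enzyme, substrate)` returns a
    pair embedding and `model.head(embedding)` returns the kinetic-parameter
    prediction; the perturbation functions are user-supplied augmentations.
    """
    # Encode the original pair and predict the kinetic parameter.
    z_orig = model.encode(enzyme, substrate)
    pred = model.head(z_orig)
    loss_task = F.mse_loss(pred, target)

    # Encode the perturbed (augmented) enzyme-substrate pair.
    z_aug = model.encode(perturb_enzyme(enzyme), perturb_substrate(substrate))

    # Consistency term: penalize divergence between the two embeddings,
    # encouraging representations that are invariant to the perturbation.
    loss_consistency = F.mse_loss(z_aug, z_orig.detach())

    return loss_task + lambda_consistency * loss_consistency
```

In this sketch the consistency term only backpropagates through the augmented branch (the original embedding is detached), one common design choice for stabilizing such regularizers; the paper's module may weight or structure the objective differently.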
