Sub-Sequential Physics-Informed Learning with State Space Model

Abstract

Physics-Informed Neural Networks (PINNs) are a class of deep-learning-based numerical solvers for partial differential equations (PDEs). Existing PINNs often suffer from failure modes in which they are unable to propagate patterns of the initial condition. We find that these failure modes are caused by the simplicity bias of neural networks and the mismatch between the PDE's continuity and the PINN's discrete sampling. We show that the State Space Model (SSM) can serve as a continuous-discrete articulation that allows initial-condition propagation, and that the simplicity bias can be eliminated by aligning a sequence of moderate granularity. Accordingly, we propose PINNMamba, a novel framework that introduces sub-sequence modeling with SSM. Experimental results show that PINNMamba reduces errors by up to 86.3% compared with state-of-the-art architectures. Our code is available at this https URL.
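To make the sub-sequence idea concrete, below is a minimal, hypothetical sketch (not the authors' PINNMamba implementation) of a physics-informed residual loss for the 1D convection equation u_t + c*u_x = 0, in which collocation points are split into short time sub-sequences before the residual is evaluated. All names (SubSeqPINN, c, n_sub) are illustrative assumptions.

```python
# Hypothetical sketch: sub-sequence-wise PDE residual loss for a PINN.
# This illustrates the general idea only; it does not reproduce PINNMamba's SSM backbone.
import torch
import torch.nn as nn

class SubSeqPINN(nn.Module):
    """A plain MLP surrogate u(x, t); the SSM encoder of the paper is not modeled here."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

def pde_residual_loss(model, x, t, c=1.0, n_sub=4):
    """Mean squared residual of u_t + c*u_x, averaged over n_sub time sub-sequences."""
    losses = []
    for x_s, t_s in zip(x.chunk(n_sub), t.chunk(n_sub)):  # split collocation points into sub-sequences
        x_s = x_s.clone().requires_grad_(True)
        t_s = t_s.clone().requires_grad_(True)
        u = model(x_s, t_s)
        u_x, u_t = torch.autograd.grad(u.sum(), (x_s, t_s), create_graph=True)
        losses.append(((u_t + c * u_x) ** 2).mean())
    return torch.stack(losses).mean()

if __name__ == "__main__":
    model = SubSeqPINN()
    x = torch.rand(256, 1)  # spatial collocation points in [0, 1]
    t = torch.rand(256, 1)  # temporal collocation points in [0, 1]
    print(pde_residual_loss(model, x, t).item())
```

In a full training loop, this residual term would be combined with initial- and boundary-condition losses; the paper's contribution is to replace the pointwise MLP with sub-sequence SSM modeling so that initial-condition information can propagate along the sequence.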

@article{xu2025_2502.00318,
  title={Sub-Sequential Physics-Informed Learning with State Space Model},
  author={Chenhui Xu and Dancheng Liu and Yuting Hu and Jiajie Li and Ruiyang Qin and Qingxiao Zheng and Jinjun Xiong},
  journal={arXiv preprint arXiv:2502.00318},
  year={2025}
}