Hidden Physics Models: Machine Learning of Nonlinear Partial Differential Equations

Abstract

We introduce the concept of hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time-dependent and nonlinear partial differential equations, to extract patterns from high-dimensional data generated from experiments. The proposed technology is applied to the problem of learning, system identification, or data-driven discovery of partial differential equations. The framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, to strike a balance between model complexity and data fit. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, and fractional equations. The methodology provides a promising new direction for capitalizing on the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to learn in complex domains without requiring large quantities of data.
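
As a rough illustration of the kind of construction the abstract describes (not the authors' code), the sketch below encodes a single backward-Euler step of the one-dimensional heat equation u_t = alpha * u_xx in a Gaussian process prior with a squared-exponential kernel, and recovers the unknown coefficient alpha from two noisy snapshots by maximizing the joint marginal likelihood. The choice of equation, the time step, the data-generation details, and all variable names are assumptions made purely for this example.

    import numpy as np
    from scipy.optimize import minimize

    np.random.seed(0)

    # Synthetic data: two snapshots of u_t = alpha * u_xx on [0, 1].
    alpha_true, dt, noise_std = 1.0, 2e-3, 1e-3
    x_n = np.random.rand(30)                     # locations for snapshot u^n (t = dt)
    x_p = np.random.rand(30)                     # locations for snapshot u^{n-1} (t = 0)
    u_exact = lambda t, x: np.exp(-alpha_true * np.pi**2 * t) * np.sin(np.pi * x)
    u_n = u_exact(dt, x_n) + noise_std * np.random.randn(x_n.size)
    u_p = u_exact(0.0, x_p) + noise_std * np.random.randn(x_p.size)

    # Squared-exponential kernel and the derivatives the operator needs.
    def se_derivatives(x, xp, s2, ell):
        r = x[:, None] - xp[None, :]
        e = s2 * np.exp(-r**2 / (2.0 * ell**2))
        d2 = e * (r**2 - ell**2) / ell**4                      # d^2 k / dx'^2
        d4 = e * (r**4 - 6*r**2*ell**2 + 3*ell**4) / ell**8    # d^4 k / dx^2 dx'^2
        return e, d2, d4

    # Negative log marginal likelihood of the joint GP over [u^n, u^{n-1}],
    # where backward Euler gives u^{n-1} = u^n - dt * alpha * u^n_xx.
    def nll(theta):
        log_s2, log_ell, alpha, log_noise = theta
        s2, ell, nv = np.exp(log_s2), np.exp(log_ell), np.exp(log_noise)
        a = dt * alpha
        k11, _, _ = se_derivatives(x_n, x_n, s2, ell)
        k10, d2_10, _ = se_derivatives(x_n, x_p, s2, ell)
        k00, d2_00, d4_00 = se_derivatives(x_p, x_p, s2, ell)
        K = np.block([[k11,                   k10 - a * d2_10],
                      [(k10 - a * d2_10).T,   k00 - 2*a*d2_00 + a**2 * d4_00]])
        K += (nv + 1e-6) * np.eye(K.shape[0])
        y = np.concatenate([u_n, u_p])
        try:
            L = np.linalg.cholesky(K)
        except np.linalg.LinAlgError:
            return 1e10
        beta = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return 0.5 * y @ beta + np.log(np.diag(L)).sum()

    # Learn alpha jointly with the kernel hyperparameters.
    theta0 = np.array([np.log(0.25), np.log(0.3), 0.0, np.log(1e-4)])
    res = minimize(nll, theta0, method="L-BFGS-B")
    print("estimated alpha:", res.x[2])          # expected near alpha_true = 1.0

The same pattern extends to the richer operators treated in the paper (Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, fractional equations): the action of the relevant differential operator on the kernel replaces the u_xx derivatives used in this toy example.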
