
Generating synthetic data for neural operators

SMAI Journal of Computational Mathematics (SMAI-JCM), 2024
Main: 12 pages · 11 figures · 4 tables · Bibliography: 3 pages · Appendix: 2 pages
Abstract

Recent advances in the literature show promising potential of deep learning methods, particularly neural operators, for obtaining numerical solutions to partial differential equations (PDEs) beyond the reach of current numerical solvers. However, existing data-driven approaches often rely on training data produced by numerical PDE solvers (e.g., finite difference or finite element methods). We introduce a "backward" data generation method that avoids solving the PDE numerically: by randomly sampling candidate solutions $u_j$ from the appropriate solution space (e.g., $H_0^1(\Omega)$), we compute the corresponding right-hand side $f_j$ directly from the equation by differentiation. This produces training pairs $(f_j, u_j)$ by computing derivatives rather than solving a PDE numerically for each data point, enabling fast, large-scale generation of data consisting of exact solutions. Experiments indicate that models trained on this synthetic data generalize well when tested on data produced by standard solvers. While the idea is simple, we hope this method will expand the potential of neural PDE solvers that do not rely on classical numerical solvers to generate their data.
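A minimal sketch of the backward generation idea for the Poisson problem $-\Delta u = f$ on $\Omega = (0,1)^2$ with zero Dirichlet boundary conditions. The truncated sine-series sampler, mode count, and function names below are illustrative assumptions, not the paper's actual sampling distribution: each sampled $u$ lies in $H_0^1(\Omega)$ by construction, and $f = -\Delta u$ follows analytically, so no PDE is solved.

```python
import numpy as np

def sample_pair(n_grid=64, n_modes=4, rng=None):
    """Generate one (f, u) training pair for -Δu = f on Ω = (0,1)^2.

    'Backward' generation: sample the solution u first as a random
    truncated sine series (which vanishes on ∂Ω, hence lies in H_0^1),
    then obtain f = -Δu exactly by differentiation of the series.
    """
    rng = rng or np.random.default_rng()
    x = np.linspace(0.0, 1.0, n_grid)
    X, Y = np.meshgrid(x, x, indexing="ij")
    # Random coefficients for the truncated sine series.
    c = rng.standard_normal((n_modes, n_modes))
    u = np.zeros_like(X)
    f = np.zeros_like(X)
    for m in range(1, n_modes + 1):
        for n in range(1, n_modes + 1):
            mode = np.sin(m * np.pi * X) * np.sin(n * np.pi * Y)
            lam = (m * np.pi) ** 2 + (n * np.pi) ** 2  # eigenvalue of -Δ
            u += c[m - 1, n - 1] * mode
            f += c[m - 1, n - 1] * lam * mode          # f = -Δu, exact
    return f, u

# Each call yields one exact (f_j, u_j) pair at derivative-evaluation cost.
f, u = sample_pair()
```

Because the series is an eigenfunction expansion of $-\Delta$, the pair is exact up to floating-point error, so a large dataset costs only as much as evaluating the series on the grid.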
