Solving multiscale elliptic problems by sparse radial basis function neural networks

1 September 2023
Zhiwen Wang, Minxin Chen, Jingrun Chen
arXiv:2309.03107
Abstract

Machine learning has been successfully applied to various fields of scientific computing in recent years. In this work, we propose a sparse radial basis function neural network method to solve elliptic partial differential equations (PDEs) with multiscale coefficients. Inspired by the deep mixed residual method, we rewrite the second-order problem as a first-order system and employ multiple radial basis function neural networks (RBFNNs) to approximate the unknown functions in the system. To avoid the overfitting due to the simplicity of RBFNNs, an additional regularization is introduced in the loss function. The loss function thus contains two parts: the $L_2$ loss for the residual of the first-order system and the boundary conditions, and the $\ell_1$ regularization term for the weights of the radial basis functions (RBFs). An algorithm for optimizing this specific loss function is introduced to accelerate the training process. The accuracy and effectiveness of the proposed method are demonstrated on a collection of multiscale problems with scale separation, discontinuity, and multiple scales, in one to three dimensions. Notably, the $\ell_1$ regularization achieves the goal of representing the solution with fewer RBFs. As a consequence, the total number of RBFs scales like $\mathcal{O}(\varepsilon^{-n\tau})$, where $\varepsilon$ is the smallest scale, $n$ is the dimensionality, and $\tau$ is typically smaller than $1$. It is worth mentioning that the proposed method not only exhibits numerical convergence, and thus provides a reliable numerical solution in three dimensions where a classical method is typically not affordable, but also outperforms most other available machine learning methods in terms of accuracy and robustness.
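The abstract only outlines the method at a high level; the following is a minimal sketch of the loss construction it describes, assuming a 1D model problem $-(a(x)u'(x))' = f(x)$ on $(0,1)$ with homogeneous Dirichlet boundary conditions, Gaussian RBFs, and a plain Adam optimizer rather than the specialized training algorithm the authors introduce. The coefficient $a$, the source $f$, the network sizes, and the regularization strength are illustrative choices, not values from the paper.

```python
# Minimal sketch (not the authors' code): rewrite -(a(x) u'(x))' = f(x) as the
# first-order system  p = a u',  -p' = f,  approximate u and p with small RBF
# networks, and minimize an L2 residual loss plus an l1 penalty on the RBF weights.
import torch

class RBFNN(torch.nn.Module):
    """u(x) ~ sum_i w_i * exp(-(x - c_i)^2 / (2 s_i^2)) with trainable c, s, w."""
    def __init__(self, n_centers=50):
        super().__init__()
        self.centers = torch.nn.Parameter(torch.rand(n_centers, 1))
        self.log_sigma = torch.nn.Parameter(torch.full((n_centers,), -2.0))
        self.weights = torch.nn.Parameter(torch.randn(n_centers) * 0.1)

    def forward(self, x):                          # x: (N, 1)
        d2 = (x - self.centers.T) ** 2             # (N, n_centers)
        phi = torch.exp(-d2 / (2 * torch.exp(self.log_sigma) ** 2))
        return phi @ self.weights                  # (N,)

# Illustrative multiscale coefficient and source term (assumed, not from the paper)
eps = 0.1
a = lambda x: 1.0 + 0.5 * torch.sin(2 * torch.pi * x / eps)
f = lambda x: torch.ones_like(x)

u_net, p_net = RBFNN(), RBFNN()
opt = torch.optim.Adam(list(u_net.parameters()) + list(p_net.parameters()), lr=1e-3)
lam = 1e-4                                         # l1 regularization strength (assumed)

for step in range(2000):
    x = torch.rand(256, 1, requires_grad=True)     # collocation points in (0, 1)
    u, p = u_net(x), p_net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0].squeeze(-1)
    dp = torch.autograd.grad(p.sum(), x, create_graph=True)[0].squeeze(-1)

    res1 = p - a(x).squeeze(-1) * du               # residual of  p = a u'
    res2 = -dp - f(x).squeeze(-1)                  # residual of  -p' = f
    xb = torch.tensor([[0.0], [1.0]])
    loss_bc = (u_net(xb) ** 2).mean()              # assumed u(0) = u(1) = 0
    loss_l1 = u_net.weights.abs().sum() + p_net.weights.abs().sum()

    loss = (res1 ** 2).mean() + (res2 ** 2).mean() + loss_bc + lam * loss_l1
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The $\ell_1$ term drives many of the RBF weights toward zero, which is the mechanism the abstract credits for representing the solution with fewer basis functions.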
