Manifold meta-learning for reduced-complexity neural system identification

16 April 2025
Marco Forgione
Ankush Chakrabarty
Dario Piga
Matteo Rufolo
Alberto Bemporad
Abstract

System identification has greatly benefited from deep learning techniques, particularly for modeling complex, nonlinear dynamical systems with partially unknown physics where traditional approaches may not be feasible. However, deep learning models often require large datasets and significant computational resources during training and inference due to their high-dimensional parameterizations. To address this challenge, we propose a meta-learning framework that discovers a low-dimensional manifold within the parameter space of an over-parameterized neural network architecture. This manifold is learned from a meta-dataset of input-output sequences generated by a class of related dynamical systems, enabling efficient model training while preserving the network's expressive power for the considered system class. Unlike bilevel meta-learning approaches, our method employs an auxiliary neural network to map datasets directly onto the learned manifold, eliminating the need for costly second-order gradient computations during meta-training and reducing the number of first-order updates required at inference time, which can be expensive for large models. We validate our approach on a family of Bouc-Wen oscillators, a well-studied nonlinear system identification benchmark, and demonstrate that accurate models can be learned even in small-data scenarios.
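The central idea described in the abstract, an auxiliary network that maps an input-output dataset directly to coordinates on a learned low-dimensional parameter manifold, avoiding second-order meta-gradients, can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: all names (SequenceEncoder, ManifoldDecoder, simulate), dimensions, and the placeholder data are hypothetical, and a tiny static input-output map stands in for the over-parameterized dynamical simulation model used in the paper.

    # Minimal sketch of the manifold meta-learning idea (hypothetical names and
    # dimensions, not the authors' code): an encoder maps an input-output sequence
    # to low-dimensional manifold coordinates z; a decoder maps z to the full flat
    # weight vector theta of an over-parameterized simulation network.
    import torch
    import torch.nn as nn

    class ManifoldDecoder(nn.Module):
        """Maps low-dimensional coordinates z to a flat parameter vector theta."""
        def __init__(self, latent_dim, theta_dim):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(latent_dim, 128), nn.Tanh(), nn.Linear(128, theta_dim)
            )
        def forward(self, z):
            return self.net(z)

    class SequenceEncoder(nn.Module):
        """Maps an input-output sequence (u, y) to manifold coordinates z."""
        def __init__(self, latent_dim, hidden=64):
            super().__init__()
            self.rnn = nn.GRU(input_size=2, hidden_size=hidden, batch_first=True)
            self.head = nn.Linear(hidden, latent_dim)
        def forward(self, u, y):
            seq = torch.stack([u, y], dim=-1)      # (batch, T, 2)
            _, h = self.rnn(seq)                   # final hidden state
            return self.head(h[-1])                # (batch, latent_dim)

    def simulate(theta, u, hidden=16):
        """Toy stand-in for the simulation model: a one-hidden-layer static map
        y_hat[t] = f_theta(u[t]) built from the flat parameter vector theta."""
        i = 0
        W1 = theta[i:i + hidden].view(hidden, 1); i += hidden
        b1 = theta[i:i + hidden]; i += hidden
        W2 = theta[i:i + hidden].view(1, hidden); i += hidden
        b2 = theta[i:i + 1]
        x = torch.tanh(u.unsqueeze(-1) @ W1.T + b1)   # (T, hidden)
        return (x @ W2.T + b2).squeeze(-1)            # (T,)

    # Meta-training loop: only first-order gradients of the simulation error flow
    # back through decoder and encoder; no inner-loop/second-order terms.
    latent_dim, hidden = 4, 16
    theta_dim = hidden + hidden + hidden + 1
    encoder = SequenceEncoder(latent_dim)
    decoder = ManifoldDecoder(latent_dim, theta_dim)
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3
    )

    for _ in range(10):                          # loop over meta-batches of related systems
        u = torch.randn(1, 200)                  # placeholder input sequence
        y = torch.sin(u.cumsum(dim=1))           # placeholder output sequence
        z = encoder(u, y)                        # dataset -> manifold coordinates
        theta = decoder(z)[0]                    # coordinates -> network weights
        y_hat = simulate(theta, u[0], hidden)
        loss = torch.mean((y_hat - y[0]) ** 2)   # simulation error
        opt.zero_grad(); loss.backward(); opt.step()

Under these assumptions, identifying a new system at inference time amounts to encoding its short input-output record into z (optionally followed by a few first-order updates of z), rather than training all of the network's weights from scratch.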

@article{forgione2025_2504.11811,
  title={Manifold meta-learning for reduced-complexity neural system identification},
  author={Marco Forgione and Ankush Chakrabarty and Dario Piga and Matteo Rufolo and Alberto Bemporad},
  journal={arXiv preprint arXiv:2504.11811},
  year={2025}
}