Divergence-free symmetric tensors (DFSTs) are fundamental in continuum mechanics, encoding conservation laws such as the conservation of mass and momentum. We introduce Riemann Tensor Neural Networks (RTNNs), a novel neural architecture that satisfies the DFST condition by construction, up to machine precision, providing a strong inductive bias for enforcing these conservation laws. We prove that RTNNs can approximate any sufficiently smooth DFST with arbitrary precision and demonstrate their effectiveness as surrogates for conservative PDEs, achieving improved accuracy across benchmarks. This work is the first to use DFSTs as an inductive bias in neural PDE surrogates and to explicitly enforce the conservation of both mass and momentum within a physics-constrained neural architecture.
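To illustrate the idea of a tensor field that is symmetric and divergence-free by construction (the property the abstract attributes to RTNNs), the following is a minimal sketch, not the authors' RTNN parameterization: in 2D, the classical Airy-stress-function construction builds a symmetric tensor from the Hessian of a scalar potential, and its divergence vanishes identically because mixed partial derivatives commute. The potential is taken here to be a small MLP, and all names (mlp_potential, dfst_2d, divergence) are illustrative, not from the paper.

```python
# Minimal sketch (assumed illustration, not the paper's architecture): a symmetric,
# divergence-free 2x2 tensor field built from a neural scalar potential phi via
#   sigma = [[ d2phi/dy2, -d2phi/dxdy],
#            [-d2phi/dxdy,  d2phi/dx2 ]].
import jax
import jax.numpy as jnp

def mlp_potential(params, x):
    """Scalar potential phi(x) given by a small MLP; x has shape (2,)."""
    h = x
    for W, b in params[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = params[-1]
    return (W @ h + b)[0]

def dfst_2d(params, x):
    """Symmetric, divergence-free 2x2 tensor built from the Hessian of phi."""
    H = jax.hessian(lambda y: mlp_potential(params, y))(x)  # H[i, j] = d2phi/dxi dxj
    return jnp.array([[ H[1, 1], -H[0, 1]],
                      [-H[0, 1],  H[0, 0]]])

def divergence(params, x):
    """Row-wise divergence (div sigma)_i = sum_j d sigma_ij / dx_j."""
    J = jax.jacfwd(lambda y: dfst_2d(params, y))(x)  # shape (2, 2, 2)
    return jnp.trace(J, axis1=1, axis2=2)

# Random initialization of the illustrative MLP.
key = jax.random.PRNGKey(0)
sizes = [2, 16, 16, 1]
params = []
for m, n in zip(sizes[:-1], sizes[1:]):
    key, k1, k2 = jax.random.split(key, 3)
    params.append((0.5 * jax.random.normal(k1, (n, m)),
                   0.1 * jax.random.normal(k2, (n,))))

x = jnp.array([0.3, -0.7])
print(dfst_2d(params, x))     # symmetric 2x2 tensor
print(divergence(params, x))  # ~0, up to floating-point round-off
```

The divergence is zero for any choice of network weights, so the constraint holds exactly during training rather than being penalized in the loss; the paper's Riemann-tensor construction generalizes this kind of potential-based parameterization beyond the 2D case shown here.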
@article{jnini2025_2503.00755,
  title   = {Riemann Tensor Neural Networks: Learning Conservative Systems with Physics-Constrained Networks},
  author  = {Anas Jnini and Lorenzo Breschi and Flavio Vella},
  journal = {arXiv preprint arXiv:2503.00755},
  year    = {2025}
}