Precise Gradient Discontinuities in Neural Fields for Subspace Physics

Main: 12 pages
15 figures
Bibliography: 3 pages
1 table
Appendix: 6 pages
Abstract

Discontinuities in spatial derivatives appear in a wide range of physical systems, from creased thin sheets to materials with sharp stiffness transitions. Accurately modeling these features is essential for simulation but remains challenging for traditional mesh-based methods, which require discontinuity-aligned remeshing -- entangling geometry with simulation and hindering generalization across shape families. Neural fields offer an appealing alternative by encoding basis functions as smooth, continuous functions over space, enabling simulation across varying shapes. However, their smoothness makes them poorly suited for representing gradient discontinuities. Prior work addresses discontinuities in function values, but capturing sharp changes in spatial derivatives while maintaining function continuity has received little attention. We introduce a neural field construction that captures gradient discontinuities without baking their location into the network weights. By augmenting input coordinates with a smoothly clamped distance function in a lifting framework, we enable encoding of gradient jumps at evolving interfaces. This design supports discretization-agnostic simulation of parametrized shape families with heterogeneous materials and evolving creases, enabling new reduced-order capabilities such as shape morphing, interactive crease editing, and simulation of soft-rigid hybrid structures. We further demonstrate that our method can be combined with previous lifting techniques to jointly capture both gradient and value discontinuities, supporting simultaneous cuts and creases within a unified model.
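The lifting idea sketched in the abstract can be illustrated in one dimension. The snippet below is a minimal toy, not the paper's construction: the crease location `c`, clamp radius `tau`, and the `tanh` smooth clamp are illustrative assumptions. A smooth basis in `x` alone smears a gradient jump, but adding a smoothly clamped distance-to-interface coordinate, which is non-smooth only at the interface, lets a model with smooth weights represent the kink exactly while leaving the crease location a free input rather than baking it into the weights.

```python
import numpy as np

# Illustrative constants (assumptions, not the paper's notation).
c = 0.2    # crease (interface) location; an input, not baked into the model
tau = 0.5  # clamp radius: the lifted feature saturates smoothly beyond tau

def lifted_coord(x):
    d = np.abs(x - c)              # unsigned distance to the interface
    return tau * np.tanh(d / tau)  # smooth clamp; the kink at d = 0 survives

# Target with a gradient discontinuity at x = c, built so it lies exactly
# in the span of {1, x, lifted_coord(x)}.
x = np.linspace(-1.0, 1.0, 401)
g = 0.35 * (x - c) + 1.15 * lifted_coord(x)

def fit_max_err(features, target):
    """Least-squares fit in the given features; return max pointwise error."""
    A = np.stack(features, axis=1)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.max(np.abs(A @ coef - target))

# Smooth polynomial features alone smear the crease ...
err_smooth = fit_max_err([np.ones_like(x), x, x**2, x**3], g)
# ... while one lifted coordinate captures it to machine precision.
err_lifted = fit_max_err([np.ones_like(x), x, lifted_coord(x)], g)
```

Because the lifted feature depends on the interface only through the distance `d`, moving the crease amounts to changing `c` at evaluation time, which mirrors the abstract's point about evolving interfaces.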

@article{liu2025_2505.20421,
  title={Precise Gradient Discontinuities in Neural Fields for Subspace Physics},
  author={Mengfei Liu and Yue Chang and Zhecheng Wang and Peter Yichen Chen and Eitan Grinspun},
  journal={arXiv preprint arXiv:2505.20421},
  year={2025}
}