Eigenanalysis of differential operators, such as the Laplace operator or the elastic energy Hessian, is typically restricted to a single shape and its discretization, limiting reduced-order modeling (ROM). We introduce the first eigenanalysis method for continuously parameterized shape families. Given a parametric shape, our method constructs spatial neural fields that represent eigenfunctions across the entire shape space. It is agnostic to the specific shape representation, requiring only an inside/outside indicator function that depends on the shape parameters. Eigenfunctions are computed via a variational principle, minimizing over nested function spaces subject to orthogonality constraints. Since eigenvalues may swap dominance at points of multiplicity, we jointly train multiple eigenfunctions while dynamically reordering them based on their eigenvalues at each step. Through causal gradient filtering, this reordering is reflected in backpropagation. Our method enables applications to operate over shape space, providing a single ROM that encapsulates vibration modes for all shapes, including previously unseen ones. Since our eigenanalysis is differentiable with respect to shape parameters, it facilitates eigenfunction-aware shape optimization. We evaluate our approach on shape optimization for sound synthesis and locomotion, as well as reduced-order modeling for elastodynamic simulation.
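To make the abstract's ingredients concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of training neural-field eigenfunctions of the Laplace operator on a parametric shape family. All names and choices here are assumptions for illustration: one small MLP per eigenfunction conditioned on a shape parameter p, an ellipse indicator as the inside/outside function, a Monte Carlo Rayleigh quotient as the eigenvalue estimate, quadratic penalties for normalization and orthogonality, eigenvalue-based reordering each step, and a detach() on lower modes as a stand-in for causal gradient filtering.

```python
# Hypothetical sketch: jointly training K neural-field eigenfunctions over a
# parametric shape family. Not the paper's code; an assumption-labeled illustration.
import torch
import torch.nn as nn

K = 4          # number of eigenfunctions trained jointly
DIM = 2        # spatial dimension
P_DIM = 1      # shape-parameter dimension

def inside(x, p):
    # Inside/outside indicator of a parametric shape: an ellipse whose
    # aspect ratio depends on the shape parameter p (assumed example).
    return (x[:, 0] ** 2 + (x[:, 1] / (0.5 + p)) ** 2) < 1.0

class EigenField(nn.Module):
    # One spatial neural field u_k(x; p), conditioned on the shape parameter.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(DIM + P_DIM, 64), nn.Tanh(),
            nn.Linear(64, 64), nn.Tanh(),
            nn.Linear(64, 1))

    def forward(self, x, p):
        return self.net(torch.cat([x, p.expand(x.shape[0], P_DIM)], dim=-1)).squeeze(-1)

fields = nn.ModuleList([EigenField() for _ in range(K)])
opt = torch.optim.Adam(fields.parameters(), lr=1e-3)

for step in range(200):
    p = torch.rand(1)                      # sample a shape from the family
    x = 2.0 * torch.rand(4096, DIM) - 1.0  # candidate points in a bounding box
    x = x[inside(x, p)].requires_grad_(True)

    # Monte Carlo Rayleigh quotients lambda_k = <grad u, grad u> / <u, u>.
    us, lams = [], []
    for f in fields:
        u = f(x, p)
        g = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
        lam = (g ** 2).sum(-1).mean() / (u ** 2).mean().clamp_min(1e-8)
        us.append(u)
        lams.append(lam)

    # Reorder modes by current eigenvalue estimates (dominance may swap).
    order = sorted(range(K), key=lambda k: lams[k].item())

    loss = torch.tensor(0.0)
    for rank, k in enumerate(order):
        loss = loss + lams[k]                         # minimize the eigenvalue
        loss = loss + ((us[k] ** 2).mean() - 1.0) ** 2  # unit-norm penalty
        # Orthogonality against lower modes only; detach() blocks gradients
        # from flowing back into them (stand-in for causal gradient filtering).
        for j in order[:rank]:
            loss = loss + (us[k] * us[j].detach()).mean() ** 2

    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this sketch the nested-space structure is mimicked by penalizing each mode against all lower-ranked modes, and the per-step reordering plus detach() approximates the abstract's dynamic reordering with causal gradient filtering; the paper's actual formulation may differ.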
@article{chang2025_2408.10099,
  title   = {Shape Space Spectra},
  author  = {Yue Chang and Otman Benchekroun and Maurizio M. Chiaramonte and Peter Yichen Chen and Eitan Grinspun},
  journal = {arXiv preprint arXiv:2408.10099},
  year    = {2025}
}