
Symmetry-Breaking Descent for Invariant Cost Functionals

Main: 17 pages
Bibliography: 3 pages
Appendix: 6 pages
Abstract

We study the problem of reducing a task cost functional $W(S)$, defined over Sobolev-class signals $S$, when the cost is invariant under a global symmetry group $G \subset \mathrm{Diff}(M)$ and accessible only as a black box. Such scenarios arise in machine learning, imaging, and inverse problems, where cost metrics reflect model outputs or performance scores but are non-differentiable and model-internal. We propose a variational method that exploits the symmetry structure to construct explicit, symmetry-breaking deformations of the input signal. A gauge field $\phi$, obtained by minimizing an auxiliary energy functional, induces a deformation $h = A_\phi[S]$ that generically lies transverse to the $G$-orbit of $S$. We prove that, under mild regularity, the cost $W$ strictly decreases along this direction, either via Clarke subdifferential descent or by escaping locally flat plateaus. The exceptional set of degeneracies has zero Gaussian measure. Our approach requires no access to model gradients or labels and operates entirely at test time. It provides a principled tool for optimizing invariant cost functionals via Lie-algebraic variational flows, with applications to black-box models and symmetry-constrained tasks.
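To make the test-time pipeline concrete, here is a minimal sketch under simplifying assumptions: the signal $S$ is a discretized 1-D signal, the symmetry group $G$ is the circle of cyclic shifts, the black-box cost is any shift-invariant score, the gauge field $\phi$ is obtained by crude gradient descent on a toy quadratic auxiliary energy, and the deformation $A_\phi[S]$ is a pointwise warp of the sampling grid. All function names, the auxiliary energy, and the deformation operator below are illustrative stand-ins, not the paper's actual constructions.

```python
import numpy as np

def cost_W(S):
    # Black-box, shift-invariant cost: depends only on the amplitude spectrum,
    # so it is constant along the orbit of cyclic shifts (stand-in for a model score).
    return float(np.sum(np.abs(np.fft.rfft(S))[5:]))

def auxiliary_energy(phi, S):
    # Toy auxiliary energy: smoothness of the gauge field plus a coupling to the
    # signal's local gradient (illustrative, not the paper's functional).
    return float(np.sum(np.gradient(phi) ** 2) + np.sum((phi * np.gradient(S)) ** 2))

def minimize_phi(S, steps=100, lr=1e-2, eps=1e-5, seed=0):
    # Crude finite-difference gradient descent on the auxiliary energy to obtain phi.
    phi = np.random.default_rng(seed).normal(scale=0.1, size=S.shape)
    for _ in range(steps):
        base = auxiliary_energy(phi, S)
        grad = np.zeros_like(phi)
        for i in range(len(phi)):
            perturbed = phi.copy()
            perturbed[i] += eps
            grad[i] = (auxiliary_energy(perturbed, S) - base) / eps
        phi -= lr * grad
    return phi

def deform(S, phi, tau):
    # A_phi[S]: warp the sampling grid by tau * phi and resample; for generic phi
    # this moves S off its shift orbit (a symmetry-breaking deformation).
    n = len(S)
    x = np.arange(n, dtype=float)
    return np.interp((x + tau * phi) % n, x, S, period=n)

# Usage: evaluate the invariant cost along the induced deformation path.
rng = np.random.default_rng(1)
S = np.sin(2 * np.pi * np.arange(128) / 128) + 0.3 * rng.normal(size=128)
phi = minimize_phi(S)
for tau in (0.0, 0.5, 1.0, 2.0):
    print(f"tau={tau:.1f}  W={cost_W(deform(S, phi, tau)):.4f}")
```

The final loop only reports the cost along the deformation direction; whether it decreases in this toy setting depends on the stand-in functional, whereas the paper's result concerns the actual gauge-field construction under the stated regularity assumptions.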
