
Symmetry-Breaking Descent for Invariant Cost Functionals

Main: 17 pages, Bibliography: 3 pages, Appendix: 6 pages
Abstract

We study the problem of reducing a task cost functional $W : H^s(M) \to \mathbb{R}$, not assumed continuous or differentiable, defined over Sobolev-class signals $S \in H^s(M)$, in the presence of a global symmetry group $G \subset \mathrm{Diff}(M)$. The group acts on signals by pullback, and the cost $W$ is invariant under this action. Such scenarios arise in machine learning and related optimization tasks, where performance metrics may be discontinuous or model-internal.

We propose a variational method that exploits the symmetry structure to construct explicit deformations of the input signal. A deformation control field $\phi : M \to \mathbb{R}^d$, obtained by minimizing an auxiliary energy functional, induces a flow that generically lies in the normal space (with respect to the $L^2$ inner product) to the $G$-orbit of $S$, and hence is a natural candidate to cross the decision boundary of the $G$-invariant cost.

We analyze two variants of the coupling term: (1) purely geometric, independent of $W$, and (2) weakly coupled to $W$. Under mild conditions, we show that symmetry-breaking deformations of the signal can reduce the cost.

Our approach requires no gradient backpropagation or training labels and operates entirely at test time. It provides a principled tool for optimizing discontinuous invariant cost functionals via Lie-algebraic variational flows.
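To make the geometric idea concrete, the following is a minimal sketch (not the paper's method) of the simplest instance: $M = S^1$, $G$ the translation subgroup of $\mathrm{Diff}(S^1)$ acting by pullback. The orbit tangent at $S$ is spanned by $\partial_x S$, so a candidate deformation field can be made normal to the $G$-orbit by removing its $L^2$ projection onto $\partial_x S$. The signal, the candidate field, and the grid size here are illustrative choices, not quantities from the paper.

```python
import numpy as np

# Illustrative setting: G = translations on the circle S^1.
# The infinitesimal generator of the orbit through S is dS/dx, so a
# deformation normal (w.r.t. the L^2 inner product) to the G-orbit is
# obtained by projecting out the dS/dx component of a candidate field.

n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
dx = x[1] - x[0]

S = np.sin(x) + 0.3 * np.cos(3.0 * x)   # example signal on S^1
dS = np.gradient(S, dx)                 # tangent direction of the translation orbit

phi = np.cos(5.0 * x)                   # arbitrary candidate deformation field

# L^2 inner products approximated by Riemann sums on the uniform grid.
coeff = np.sum(phi * dS) / np.sum(dS * dS)
phi_normal = phi - coeff * dS           # component L^2-orthogonal to the orbit

# Sanity check: the normal component pairs to (numerically) zero with dS/dx,
# so perturbing S along phi_normal moves off the G-orbit rather than along it.
residual = np.sum(phi_normal * dS) * dx
print(residual)
```

A translation-invariant cost $W$ is constant along the $\partial_x S$ direction, which is why only the normal component $\phi_\perp$ can move $S$ across a decision boundary of $W$; the projection above isolates exactly that component.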
