Symmetry-Breaking Descent for Invariant Cost Functionals

We study the problem of reducing a task cost functional, not assumed continuous or differentiable, defined over Sobolev-class signals $S \in H^s(M)$, in the presence of a global symmetry group $G$. The group acts on signals by pullback, and the cost is invariant under this action. Such scenarios arise in machine learning and related optimization tasks, where performance metrics may be discontinuous or model-internal.

We propose a variational method that exploits the symmetry structure to construct explicit deformations of the input signal. A deformation control field $\phi: M \to \mathbb{R}^d$, obtained by minimizing an auxiliary energy functional, induces a flow that generically lies in the normal space (with respect to the inner product) to the $G$-orbit of $S$, and is hence a natural candidate for crossing the decision boundary of the $G$-invariant cost.

We analyze two variants of the coupling term: (1) a purely geometric variant, independent of the cost, and (2) a variant weakly coupled to the cost. Under mild conditions, we show that symmetry-breaking deformations of the signal can reduce the cost.

Our approach requires no gradient backpropagation or training labels and operates entirely at test time. It provides a principled tool for optimizing discontinuous invariant cost functionals via Lie-algebraic variational flows.
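A minimal numerical sketch of the underlying idea follows, under simplifying assumptions that are ours rather than the paper's: $M = S^1$, $d = 1$, $G$ the group of circular translations acting by pullback, the $L^2$ inner product, and a toy discontinuous translation-invariant cost. All helper names (`toy_cost`, `pullback_warp`, the random field `w`) are illustrative only; the sketch constructs a control field $\phi$ whose induced first-order deformation $S'(x)\,\phi(x)$ is orthogonal to the orbit tangent $-S'(x)$, then evaluates the cost along the resulting warp.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 512
x = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
dx = x[1] - x[0]

# A smooth periodic signal playing the role of the H^s input S.
S = np.sin(x) + 0.5 * np.sin(3 * x + 0.7)

def toy_cost(sig, thresh=0.9):
    """Discontinuous, translation-invariant cost: 1 if the largest non-DC
    Fourier amplitude exceeds `thresh`, else 0 (Fourier magnitudes are
    invariant under circular translation)."""
    amp = np.abs(np.fft.rfft(sig)) * 2.0 / len(sig)
    return float(np.max(amp[1:]) > thresh)

def pullback_warp(sig, phi, eps):
    """Deformed signal S(x + eps * phi(x)), i.e. pullback along the flow of
    the control field phi, via periodic linear interpolation."""
    return np.interp((x + eps * phi) % (2 * np.pi), x, sig, period=2 * np.pi)

# Orbit tangent at S for the translation group: d/dt S(x - t)|_{t=0} = -S'(x).
dS = np.gradient(S, dx)

# A first-order deformation by phi moves S along v(x) = S'(x) * phi(x).
# Enforce <v, S'>_{L2} = 0, i.e. integral of S'(x)^2 * phi(x) dx = 0, so the
# induced direction is normal to the G-orbit: subtract the right constant.
w = np.fft.irfft(np.fft.rfft(rng.standard_normal(N)) *
                 (np.arange(N // 2 + 1) < 6), N)      # random smooth field
phi = w - np.sum(dS**2 * w) / np.sum(dS**2)           # orthogonality correction
phi /= np.max(np.abs(phi))                            # normalize amplitude

print("cost(S)                 =", toy_cost(S))
for eps in (0.05, 0.2, 0.5, 1.0):
    print(f"cost(warp(S, eps={eps:.2f})) =", toy_cost(pullback_warp(S, phi, eps)))
```

The printed values show whether a symmetry-breaking deformation of increasing amplitude crosses the cost's decision boundary; pure translations, by contrast, can never change this cost since it is constant on each $G$-orbit. The choice of a spectral-threshold cost and an $L^2$ orthogonality condition is purely for illustration.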