
Sparsity is Combinatorial Depth: Quantifying MoE Expressivity via Tropical Geometry

Ye Su
Huayi Tang
Zixuan Gong
Yong Liu
Main: 20 pages, 2 figures, 1 table; bibliography: 4 pages
Abstract

While Mixture-of-Experts (MoE) architectures define the state of the art, their success is often attributed to heuristic efficiency rather than geometric expressivity. In this work, we present the first analysis of MoE through the lens of tropical geometry, establishing that the Top-$k$ routing mechanism is algebraically isomorphic to the $k$-th elementary symmetric tropical polynomial. This isomorphism partitions the input space into the normal fan of a hypersimplex, revealing that \textbf{sparsity is combinatorial depth}, scaling geometric capacity by the binomial coefficient $\binom{N}{k}$. Moving beyond ambient bounds, we introduce the concept of \textit{effective capacity} under the manifold hypothesis. We prove that while dense networks suffer from capacity collapse on low-dimensional data, MoE architectures exhibit \textit{combinatorial resilience}, maintaining high expressivity via the transversality of routing cones. Our framework unifies the discrete geometry of the hypersimplex with the continuous geometry of neural functions, offering a rigorous theoretical justification for the topological supremacy of conditional computation.
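The core isomorphism in the abstract can be checked numerically: the $k$-th elementary symmetric tropical polynomial evaluates, in (max, +) arithmetic, to the maximum over all size-$k$ subsets of summed gate scores, which coincides with the sum of the Top-$k$ scores selected by the router. The sketch below is illustrative only (the gate scores and function names are hypothetical, not from the paper):

```python
from itertools import combinations
import math

def tropical_elem_sym(x, k):
    # k-th elementary symmetric tropical polynomial in (max, +) arithmetic:
    # max over all size-k subsets S of sum_{i in S} x_i
    return max(sum(x[i] for i in S) for S in combinations(range(len(x)), k))

def topk_routing_value(x, k):
    # Sum of the k largest gate scores, as picked by Top-k routing
    return sum(sorted(x, reverse=True)[:k])

scores = [2.0, -1.0, 3.5, 0.5, 1.0]  # hypothetical gate scores for N = 5 experts
k = 2

# The two evaluations agree on any input, which is the claimed isomorphism.
assert abs(tropical_elem_sym(scores, k) - topk_routing_value(scores, k)) < 1e-9

# The polynomial has one maximal linear region (routing cone) per size-k
# subset of experts, giving the binomial capacity factor from the abstract.
print(math.comb(len(scores), k))  # → 10
```

Each linear region corresponds to a fixed choice of which $k$ experts win the routing, which is why the region count is $\binom{N}{k}$ rather than $N$.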
