Interpretable Analytic Calabi-Yau Metrics via Symbolic Distillation

D Yang Eng
Main: 31 pages, 7 figures, 17 tables
Abstract

Calabi--Yau manifolds are essential for string theory, but their metrics are computationally intractable. Here we show that symbolic regression can distill neural approximations into simple, interpretable formulas. Our five-term expression matches neural accuracy ($R^2 = 0.9994$) with 3,000-fold fewer parameters. Multi-seed validation confirms that geometric constraints select essential features, specifically power sums and symmetric polynomials, while permitting structural diversity. The functional form can be maintained across the studied moduli range ($\psi \in [0, 0.8]$) with coefficients varying smoothly; we interpret these trends as empirical hypotheses within the accuracy regime of the locally-trained teachers ($\sigma \approx 8$--$9\%$ at $\psi \neq 0$). The formula reproduces physical observables -- volume integrals and Yukawa couplings -- validating that symbolic distillation recovers compact, interpretable models for quantities previously accessible only to black-box networks.
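The core idea of symbolic distillation can be sketched in miniature: fit a small linear combination of candidate symbolic features (here, power sums and elementary symmetric polynomials, the feature classes the abstract names) to the outputs of a "teacher" function standing in for the trained network. The teacher, features, data, and coefficients below are illustrative placeholders, not the paper's actual five-term formula or training setup.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.1, 1.0, size=(500, 3))  # toy coordinate samples

def teacher(X):
    # Stand-in for a trained neural approximation of the metric quantity;
    # coefficients are arbitrary, chosen only for illustration.
    p1 = X.sum(axis=1)                 # power sum p1
    p2 = (X**2).sum(axis=1)            # power sum p2
    e2 = X[:, 0]*X[:, 1] + X[:, 0]*X[:, 2] + X[:, 1]*X[:, 2]  # symmetric poly e2
    return 1.0 + 0.5*p1 - 0.2*p2 + 0.1*e2

y = teacher(X)

# Candidate symbolic library: constant, power sums p1, p2,
# and elementary symmetric polynomials e2, e3.
feats = np.column_stack([
    np.ones(len(X)),
    X.sum(axis=1),
    (X**2).sum(axis=1),
    X[:, 0]*X[:, 1] + X[:, 0]*X[:, 2] + X[:, 1]*X[:, 2],
    X.prod(axis=1),
])

# "Distill" the teacher: least-squares fit of the symbolic features.
coef, *_ = np.linalg.lstsq(feats, y, rcond=None)
pred = feats @ coef
r2 = 1.0 - ((y - pred)**2).sum() / ((y - y.mean())**2).sum()
print(round(r2, 4))  # → 1.0, since the teacher lies in the feature span
```

In practice symbolic regression searches over the functional form itself (e.g. via genetic programming) rather than fixing a linear library in advance; the fixed-library least-squares fit above is only the simplest special case of the distillation step.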
