
The Price equation reveals a universal force-metric-bias law of algorithmic learning and natural selection

Abstract

Diverse learning algorithms, optimization methods, and natural selection share a common mathematical structure, despite their apparent differences. Here I show that a simple notational partitioning of change by the Price equation reveals a universal force-metric-bias (FMB) law: $\Delta\mathbf{\theta} = \mathbf{M}\,\mathbf{f} + \mathbf{b} + \mathbf{\xi}$. The force $\mathbf{f}$ drives improvement in parameters, $\Delta\mathbf{\theta}$, in proportion to the slope of performance with respect to the parameters. The metric $\mathbf{M}$ rescales movement by inverse curvature. The bias $\mathbf{b}$ adds momentum or changes in the frame of reference. The noise $\mathbf{\xi}$ enables exploration. This framework unifies natural selection, Bayesian updating, Newton's method, stochastic gradient descent, stochastic Langevin dynamics, Adam optimization, and most other algorithms as special cases of the same underlying process. The Price equation also reveals why Fisher information, Kullback-Leibler divergence, and d'Alembert's principle arise naturally in learning dynamics. By exposing this common structure, the FMB law provides a principled foundation for understanding, comparing, and designing learning algorithms across disciplines.
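To make the FMB decomposition concrete, here is a minimal numerical sketch (not from the paper) of how a few familiar update rules can be written in the form $\Delta\mathbf{\theta} = \mathbf{M}\,\mathbf{f} + \mathbf{b} + \mathbf{\xi}$. The quadratic performance surface, the function name `fmb_update`, the learning rate, and the noise scale are illustrative assumptions chosen for the example.

```python
import numpy as np

def fmb_update(theta, f, M=None, b=None, xi=None):
    """One force-metric-bias step: delta_theta = M f + b + xi.

    f  : force (slope of performance with respect to the parameters)
    M  : metric rescaling movement (identity if None)
    b  : bias term, e.g. momentum or a frame shift (zero if None)
    xi : noise term enabling exploration (zero if None)
    """
    M = np.eye(len(theta)) if M is None else M
    b = np.zeros_like(theta) if b is None else b
    xi = np.zeros_like(theta) if xi is None else xi
    return theta + M @ f + b + xi

# Hypothetical quadratic performance U(theta) = -0.5 * theta^T A theta,
# used only to give the force and curvature explicit values.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
force = lambda th: -A @ th      # gradient of performance (the force f)
hess = lambda th: -A            # curvature of performance

theta = np.array([1.0, -2.0])
lr = 0.1

# Stochastic gradient descent: M = lr * I, no bias, gradient noise as xi.
theta_sgd = fmb_update(theta, force(theta), M=lr * np.eye(2),
                       xi=0.01 * np.random.randn(2))

# Newton's method: M is the inverse curvature, no bias or noise.
theta_newton = fmb_update(theta, force(theta),
                          M=np.linalg.inv(-hess(theta)))

# Heavy-ball momentum: the bias b carries a fraction of the previous step.
prev_step = np.zeros(2)
theta_momentum = fmb_update(theta, force(theta), M=lr * np.eye(2),
                            b=0.9 * prev_step)
```

In each case only the choices of $\mathbf{M}$, $\mathbf{b}$, and $\mathbf{\xi}$ differ; the force term is the same, which is the sense in which the abstract describes these methods as special cases of one underlying process.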
