Coupled Entropy: A Goldilocks Generalization?

Main: 10 pages
2 figures
Bibliography: 2 pages
Appendix: 2 pages
Abstract

Nonextensive Statistical Mechanics (NSM) has developed into a powerful toolset for modeling and analyzing complex systems. Despite its many successes, a puzzle arose early in its development. The constraints on the Tsallis entropy are in the form of an escort distribution with elements proportional to $p_i^q$, but this same factor within the Tsallis entropy function is not normalized. This led to consideration of the Normalized Tsallis Entropy (NTE); however, the normalization proved to make the function unstable. I will provide evidence that the coupled entropy, which divides NTE by $1 + d\kappa$, where $d$ is the dimension and $\kappa$ is the coupling, may provide the robustness necessary for applications such as machine learning. The definition of the coupled entropy and its maximizing distributions, the coupled exponential family, arises from clarifying how the number of independent random variables $(q)$ is composed of the nonlinear properties of complex systems, $q = 1 + \frac{\alpha\kappa}{1 + d\kappa}$, where $\alpha$ is the nonlinear parameter governing the shape of distributions near their location and $\kappa$ is the parameter determining the asymptotic tail decay. Foundationally, for complex systems, the coupling is the measure of the nonlinearity inducing non-exponential distributions and of the degree of nonadditivity of the entropy. As such, the coupling is a strong candidate as a measure of statistical complexity.
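The relationships stated in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it uses the standard discrete Tsallis entropy $S_q = (1 - \sum_i p_i^q)/(q-1)$, forms the NTE by dividing by the escort normalizer $\sum_i p_i^q$, then divides by $1 + d\kappa$ as the abstract describes, with $q$ composed from $\alpha$, $\kappa$, and $d$. The function names and argument conventions are illustrative assumptions.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Standard discrete Tsallis entropy S_q = (1 - sum(p_i^q)) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        # Boltzmann-Gibbs (Shannon) limit as q -> 1
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def coupled_entropy(p, alpha, kappa, d=1):
    """Sketch of the coupled entropy as described in the abstract:
    the Normalized Tsallis Entropy (Tsallis entropy divided by the
    escort normalizer sum(p_i^q)) further divided by (1 + d*kappa),
    with q = 1 + alpha*kappa / (1 + d*kappa)."""
    p = np.asarray(p, dtype=float)
    q = 1.0 + alpha * kappa / (1.0 + d * kappa)
    nte = tsallis_entropy(p, q) / np.sum(p ** q)
    return nte / (1.0 + d * kappa)
```

At zero coupling ($\kappa = 0$) the composition gives $q = 1$, the escort normalizer is 1, and the divisor $1 + d\kappa$ is 1, so the sketch reduces to the Shannon entropy, consistent with the additive limit.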
