Coupled Entropy: A Goldilocks Generalization?

Nonextensive Statistical Mechanics (NSM) has developed into a powerful toolset for modeling and analyzing complex systems. Despite its many successes, a puzzle arose early in its development. The constraints on the Tsallis entropy take the form of an escort distribution with elements proportional to $p_i^q$, but this same factor within the Tsallis entropy function is not normalized. This led to consideration of the Normalized Tsallis Entropy (NTE); however, the normalization proved to make the function unstable. I will provide evidence that the coupled entropy, which divides the NTE by $1 + d\kappa$, where $d$ is the dimension and $\kappa$ is the coupling, may provide the robustness necessary for applications like machine learning. The definition of the coupled entropy and its maximizing distributions, the coupled exponential family, arises from clarifying how the number of independent random variables combines with the nonlinear properties of complex systems, $q = 1 + \frac{\alpha\kappa}{1 + d\kappa}$, where $\alpha$ is the nonlinear parameter governing the shape of distributions near their location and $\kappa$ is the parameter determining the asymptotic tail decay. Foundationally, for complex systems, the coupling is the measure of nonlinearity inducing non-exponential distributions and the degree of nonadditivity of the entropy. As such, the coupling is a strong candidate for a measure of statistical complexity.
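To make these relationships concrete, here is a minimal numerical sketch in Python. It uses the standard discrete Tsallis entropy and the escort-normalized NTE, and follows the prescription above of dividing the NTE by $1 + d\kappa$, with $q$ obtained from the coupling via $q = 1 + \alpha\kappa/(1 + d\kappa)$. The function names and the defaults $\alpha = 1$, $d = 1$ are illustrative choices, not taken from the paper.

import numpy as np

def tsallis_entropy(p, q):
    # Standard discrete Tsallis entropy: S_q = (1 - sum_i p_i^q) / (q - 1)
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # Shannon limit as q -> 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def normalized_tsallis_entropy(p, q):
    # NTE: Tsallis entropy divided by the escort normalizer sum_i p_i^q
    p = np.asarray(p, dtype=float)
    return tsallis_entropy(p, q) / np.sum(p ** q)

def coupled_entropy(p, kappa, alpha=1, d=1):
    # Coupled entropy as described in the abstract: NTE / (1 + d*kappa),
    # with q determined by the coupling, q = 1 + alpha*kappa / (1 + d*kappa)
    q = 1.0 + alpha * kappa / (1.0 + d * kappa)
    return normalized_tsallis_entropy(p, q) / (1.0 + d * kappa)

p = [0.5, 0.3, 0.2]
for kappa in (0.0, 0.5, 1.0):
    print(kappa, coupled_entropy(p, kappa))

As a quick sanity check, at $\kappa = 0$ the coupling vanishes, $q = 1$, and the coupled entropy reduces to the Shannon entropy.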