
Coupled Entropy: A Goldilocks Generalization for Complex Systems

Main: 10 pages, 2 figures; Bibliography: 2 pages; Appendix: 2 pages
Abstract

The coupled entropy is proven to correct a flaw in the derivation of the Tsallis entropy and thereby solidify the theoretical foundations for analyzing the uncertainty of complex systems. The Tsallis entropy originated from considering power probabilities $p_i^q$, in which $q$ independent, identically distributed random variables share the same state. The maximum-entropy distribution was derived to be a $q$-exponential, which is a member of the shape ($\kappa$), scale ($\sigma$) distributions. Unfortunately, the $q$-exponential parameters were treated as though they were valid substitutes for the shape and scale. This flaw causes a misinterpretation of the generalized temperature and an imprecise derivation of the generalized entropy. The coupled entropy is derived from the generalized Pareto distribution (GPD) and the Student's t distribution, whose shape derives from nonlinear sources of uncertainty and whose scale derives from linear sources. The Tsallis entropy of the GPD converges to one as $\kappa \rightarrow \infty$, which makes it too cold. The normalized Tsallis entropy (NTE) introduces a nonlinear term multiplying the scale and the coupling, making it too hot. The coupled entropy provides the perfect balance, ranging from $\ln \sigma$ for $\kappa = 0$ to $\sigma$ as $\kappa \rightarrow \infty$. One could say the coupled entropy allows scientists, engineers, and analysts to eat their porridge, confident that its measure of uncertainty reflects the mathematical physics of the scale of non-exponential distributions while minimizing the dependence on the shape or nonlinear coupling. Examples of complex systems design, including a coupled variational inference algorithm, are reviewed.
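As a point of reference for the power-probability construction the abstract describes, the sketch below evaluates the standard discrete Tsallis entropy $S_q = \frac{1}{q-1}\left(1 - \sum_i p_i^q\right)$ and checks its Shannon limit as $q \to 1$. This is an illustrative aside using the textbook definition, not the paper's coupled entropy or its GPD-based derivation; the example distribution and the values of $q$ are arbitrary choices.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def tsallis_entropy(p, q):
    """Standard Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1).

    Built from the power probabilities p_i^q referred to in the abstract;
    S_q reduces to the Shannon entropy as q -> 1.
    """
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Example: an arbitrary three-state distribution.
p = [0.5, 0.3, 0.2]
print("Shannon:", shannon_entropy(p))
for q in (0.5, 0.99, 1.01, 2.0):
    print(f"Tsallis q={q}:", tsallis_entropy(p, q))
```

Running the loop shows the Tsallis values bracketing the Shannon entropy as $q$ passes through 1, which is the baseline behavior against which the paper's shape-scale critique and the coupled entropy's $\ln\sigma$-to-$\sigma$ range are framed.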
