Coupled Entropy: A Goldilocks Generalization for Complex Systems

The coupled entropy is proven to correct a flaw in the derivation of the Tsallis entropy, thereby solidifying the theoretical foundations for analyzing the uncertainty of complex systems. The Tsallis entropy originated from considering power probabilities in which \textit{q} independent, identically distributed random variables share the same state. The maximum entropy distribution was derived to be a \textit{q}-exponential, which is a member of the family of shape ($\kappa$), scale ($\sigma$) distributions. Unfortunately, the \textit{q}-exponential parameters were treated as though they were valid substitutes for the shape and scale. This flaw causes a misinterpretation of the generalized temperature and an imprecise derivation of the generalized entropy. The coupled entropy is instead derived from the generalized Pareto distribution (GPD) and the Student's t distribution, whose shape derives from nonlinear sources of uncertainty and whose scale derives from linear sources. The Tsallis entropy of the GPD converges to one as $\kappa \to \infty$, which makes it too cold. The normalized Tsallis entropy (NTE) introduces a nonlinear term multiplying the scale and the coupling, making it too hot. The coupled entropy provides perfect balance, ranging from the Boltzmann-Gibbs-Shannon entropy for $\kappa = 0$ to a scale-dependent value as $\kappa \to \infty$. One could say the coupled entropy allows scientists, engineers, and analysts to eat their porridge, confident that its measure of uncertainty reflects the mathematical physics of the scale of non-exponential distributions while minimizing the dependence on the shape or nonlinear coupling. Examples of complex-systems design, including a coupled variational inference algorithm, are reviewed.
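The following numerical sketch (an illustration, not the paper's code) shows the "too cold" and "too hot" behavior described above. It assumes the standard GPD density $f(x) = \sigma^{-1}(1+\kappa x/\sigma)^{-(1+1/\kappa)}$, one common identification of the GPD with a \textit{q}-exponential density via $q = (1+2\kappa)/(1+\kappa)$, and the Landsberg-Vedral normalization as a stand-in for the NTE; the coupled entropy itself is not reproduced here.

```python
# Numerical sketch (illustration only, not the paper's code): shows why the abstract
# calls the Tsallis entropy of the GPD "too cold" and the normalized Tsallis entropy
# "too hot" as the shape/coupling kappa grows while the scale sigma is fixed.
import numpy as np
from scipy import integrate


def gpd_pdf(x, kappa, sigma):
    """Generalized Pareto density, shape kappa > 0, scale sigma, support x >= 0."""
    return (1.0 / sigma) * (1.0 + kappa * x / sigma) ** -(1.0 + 1.0 / kappa)


def tsallis_entropy(kappa, sigma):
    """Continuous Tsallis entropy S_q = (1 - int p^q dx) / (q - 1) of the GPD.

    Assumes the identification q = (1 + 2*kappa) / (1 + kappa) of the GPD with a
    q-exponential density; other conventions exist.
    """
    q = (1.0 + 2.0 * kappa) / (1.0 + kappa)
    i_q, _ = integrate.quad(lambda x: gpd_pdf(x, kappa, sigma) ** q, 0.0, np.inf)
    return (1.0 - i_q) / (q - 1.0), i_q


def normalized_tsallis_entropy(kappa, sigma):
    """Landsberg-Vedral normalization S_q / int p^q dx, used as a stand-in for the NTE."""
    s_q, i_q = tsallis_entropy(kappa, sigma)
    return s_q / i_q


if __name__ == "__main__":
    sigma = 10.0
    for kappa in (0.01, 0.1, 1.0, 10.0, 100.0):
        s_q, _ = tsallis_entropy(kappa, sigma)
        nte = normalized_tsallis_entropy(kappa, sigma)
        print(f"kappa={kappa:7.2f}  Tsallis={s_q:8.4f}  NTE={nte:12.2f}")
    # The Tsallis entropy approaches 1 for large kappa, losing its dependence on the
    # scale sigma ("too cold"); the NTE grows with both sigma and kappa ("too hot").
```

Per the abstract, the coupled entropy sits between these two regimes, retaining the dependence on the scale while minimizing the dependence on the coupling.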