Tsallis and Rényi deformations linked via a new λ-duality

Ting-Kam Leonard Wong
Jun Zhang
Abstract

Tsallis and Rényi entropies, which are monotone transformations of each other, are deformations of the celebrated Shannon entropy. Maximization of these deformed entropies, under suitable constraints, leads to the q-exponential family, which has applications in non-extensive statistical physics, information theory and statistics. In previous information-geometric studies, the q-exponential family was analyzed using classical convex duality and Bregman divergence. In this paper, we show that a generalized λ-duality, where λ = 1 − q is the constant information-geometric curvature, leads to a generalized exponential family which is essentially equivalent to the q-exponential family and has deep connections with Rényi entropy and optimal transport. Using this generalized convex duality and its associated logarithmic divergence, we show that our λ-exponential family satisfies properties that parallel and generalize those of the exponential family. Under our framework, the Rényi entropy and divergence arise naturally, and we give a new proof of the Tsallis/Rényi entropy maximizing property of the q-exponential family. We also introduce a λ-mixture family which may be regarded as the dual of the λ-exponential family, and connect it with other mixture-type families. Finally, we discuss a duality between the λ-exponential family and the λ-logarithmic divergence, and study its statistical consequences.
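To make the "monotone transformations of each other" claim concrete, here are the standard definitions of the two entropies for a discrete distribution p and the well-known identity relating them (the paper's own notation and normalization may differ):

```latex
% Tsallis entropy of order q (q \neq 1):
S_q(p) = \frac{1}{q-1}\Bigl(1 - \sum_i p_i^q\Bigr),
\qquad
% Rényi entropy of order q (q \neq 1):
H_q(p) = \frac{1}{1-q}\log \sum_i p_i^q.

% Since \sum_i p_i^q = 1 + (1-q)\,S_q(p), the two are linked by the
% monotone transformation
H_q(p) = \frac{1}{1-q}\log\bigl(1 + (1-q)\,S_q(p)\bigr),
% and both recover the Shannon entropy in the limit q \to 1.
```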
