Tsallis and Rényi deformations linked via a new λ-duality

Tsallis and Rényi entropies, which are monotone transformations of each other, are deformations of the celebrated Shannon entropy. Maximization of these deformed entropies, under suitable constraints, leads to the q-exponential family, which has applications in non-extensive statistical physics, information theory and statistics. In previous information-geometric studies, the q-exponential family was analyzed using classical convex duality and Bregman divergence. In this paper, we show that a generalized λ-duality, where λ = 1 − q is the constant information-geometric curvature, leads to a generalized exponential family which is essentially equivalent to the q-exponential family and has deep connections with Rényi entropy and optimal transport. Using this generalized convex duality and its associated logarithmic divergence, we show that our λ-exponential family satisfies properties that parallel and generalize those of the exponential family. Under our framework, the Rényi entropy and divergence arise naturally, and we give a new proof of the Tsallis/Rényi entropy maximizing property of the q-exponential family. We also introduce a λ-mixture family which may be regarded as the dual of the λ-exponential family, and connect it with other mixture-type families. Finally, we discuss a duality between the λ-exponential family and the λ-logarithmic divergence, and study its statistical consequences.
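The monotone relation between the two entropies mentioned above can be checked numerically. The following is a minimal sketch using the standard textbook definitions of the Tsallis entropy S_q and the Rényi entropy H_q for a finite probability vector; the function names are illustrative and not from the paper.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy: S_q(p) = (1 - sum_i p_i^q) / (q - 1), q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def renyi_entropy(p, q):
    """Rényi entropy: H_q(p) = log(sum_i p_i^q) / (1 - q), q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

p = [0.5, 0.3, 0.2]  # an arbitrary probability vector
q = 0.7              # deformation parameter (lambda = 1 - q in the paper's notation)

S = tsallis_entropy(p, q)
H = renyi_entropy(p, q)

# Monotone transformation linking the two:
#   H_q = log(1 + (1 - q) * S_q) / (1 - q)
assert abs(H - math.log(1.0 + (1.0 - q) * S) / (1.0 - q)) < 1e-12
```

Since the map S_q ↦ log(1 + (1 − q) S_q)/(1 − q) is strictly increasing, maximizing either entropy under the same constraints yields the same family of distributions.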