ResearchTrend.AI

arXiv: 2101.12459

On f-divergences between Cauchy distributions

29 January 2021
Frank Nielsen
K. Okamura
Abstract

We prove that the f-divergences between univariate Cauchy distributions are all symmetric, and can be expressed as strictly increasing scalar functions of the symmetric chi-squared divergence. We report the corresponding scalar functions for the total variation distance, the Kullback-Leibler divergence, the squared Hellinger divergence, and the Jensen-Shannon divergence, among others. Next, we give conditions to expand the f-divergences as converging infinite series of higher-order power chi divergences, and illustrate the criterion for converging Taylor series expressing the f-divergences between Cauchy distributions. We then show that the symmetric property of f-divergences holds for multivariate location-scale families with prescribed matrix scales provided that the standard density is even, a condition that includes the multivariate normal and Cauchy families. However, the f-divergences between multivariate Cauchy densities with different scale matrices are shown to be asymmetric. Finally, we present several metrizations of f-divergences between univariate Cauchy distributions and further report geometric embedding properties of these metrics.
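The symmetry claim can be checked numerically. The sketch below compares KL(p:q) and KL(q:p) for two univariate Cauchy densities via numerical integration (using the substitution x = tan(t) to map the real line onto a compact interval), and compares both against the known closed form KL = log(((s1+s2)^2 + (l1-l2)^2)/(4 s1 s2)) = log(1 + chi2/2), where chi2 is the symmetric chi-squared divergence. The parameter values are illustrative, and the integration scheme is our own minimal choice, not the paper's method.

```python
import math

def cauchy_pdf(x, loc, scale):
    """Density of the Cauchy distribution with given location and scale."""
    return scale / (math.pi * (scale**2 + (x - loc)**2))

def kl_numeric(l1, s1, l2, s2, n=200_000):
    """KL(p1 || p2) by midpoint rule after the substitution x = tan(t),
    which maps the real line onto the compact interval (-pi/2, pi/2)."""
    a, b = -math.pi / 2, math.pi / 2
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        t = a + (i + 0.5) * h
        x = math.tan(t)
        jac = 1.0 / math.cos(t)**2  # dx = sec^2(t) dt
        p = cauchy_pdf(x, l1, s1)
        q = cauchy_pdf(x, l2, s2)
        total += p * math.log(p / q) * jac
    return total * h

def kl_closed_form(l1, s1, l2, s2):
    """Closed form for the KL divergence between Cauchy densities:
    log(((s1+s2)^2 + (l1-l2)^2) / (4*s1*s2)), i.e. log(1 + chi2/2)
    with chi2 = ((l1-l2)^2 + (s1-s2)^2) / (2*s1*s2).
    Note the expression is symmetric under swapping (l1,s1) and (l2,s2)."""
    return math.log(((s1 + s2)**2 + (l1 - l2)**2) / (4 * s1 * s2))

if __name__ == "__main__":
    # Two illustrative Cauchy densities: (loc=0, scale=1) and (loc=1, scale=2).
    print(kl_numeric(0, 1, 1, 2))       # KL(p:q)
    print(kl_numeric(1, 2, 0, 1))       # KL(q:p) -- agrees with KL(p:q)
    print(kl_closed_form(0, 1, 1, 2))   # log(10/8) = log(1.25)
```

Swapping the two parameter pairs leaves the closed form unchanged, so the two numerical integrals should agree to the quadrature tolerance, illustrating the symmetry result for this particular f-divergence.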
