ResearchTrend.AI
On the Properties of Kullback-Leibler Divergence Between Multivariate Gaussian Distributions

10 February 2021
Yufeng Zhang
Wanwei Liu
Zhenbang Chen
Ji Wang
Kenli Li
arXiv:2102.05485
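For reference, the quantity the paper studies — the Kullback-Leibler divergence between two multivariate Gaussian distributions — has a well-known closed form. The sketch below is illustrative, not taken from the paper; the function name `gaussian_kl` is an assumption.

```python
import numpy as np

def gaussian_kl(mu0, sigma0, mu1, sigma1):
    """Closed-form KL(N(mu0, sigma0) || N(mu1, sigma1)) for k-dimensional Gaussians:
    0.5 * [tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0) - k + ln(det S1 / det S0)].
    """
    k = mu0.shape[0]
    sigma1_inv = np.linalg.inv(sigma1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(sigma1_inv @ sigma0)       # tr(S1^-1 S0)
        + diff @ sigma1_inv @ diff          # Mahalanobis term
        - k                                 # dimension
        + np.log(np.linalg.det(sigma1) / np.linalg.det(sigma0))
    )
```

For identical distributions the divergence is zero, and in one dimension KL(N(0,1) || N(1,1)) = 0.5.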

Papers citing "On the Properties of Kullback-Leibler Divergence Between Multivariate Gaussian Distributions"

2 papers shown

Fast approximations of the Jeffreys divergence between univariate Gaussian mixture models via exponential polynomial densities
Frank Nielsen
13 Jul 2021
Are generative deep models for novelty detection truly better?
V. Škvára
Tomás Pevný
Václav Smídl
13 Jul 2018