ResearchTrend.AI
Estimating Kullback-Leibler Divergence Using Kernel Machines
Kartik Ahuja
arXiv:1905.00586, 2 May 2019
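The paper above, like several of the works citing it, builds on variational lower bounds of the KL divergence in which a critic function is optimized over a restricted hypothesis space such as an RKHS. As a minimal illustrative sketch (not the paper's actual estimator), the snippet below evaluates the Donsker-Varadhan bound KL(P||Q) >= E_P[f] - log E_Q[exp(f)] by Monte Carlo, plugging in the analytically optimal critic f = log(p/q) for a pair of unit-variance Gaussians; the distributions, sample sizes, and critic are assumptions made for the example.

```python
import numpy as np

# Donsker-Varadhan (DV) lower bound on KL(P || Q):
#   KL(P || Q) >= E_P[f] - log E_Q[exp(f)],  tight at f = log(p/q).
# Kernel-machine estimators restrict f to an RKHS and optimize it;
# here, purely for illustration, we plug in the known optimal critic
# for P = N(0, 1) and Q = N(1, 1):  log(p/q)(x) = 0.5 - x,
# for which the true divergence is KL(P || Q) = 0.5.

rng = np.random.default_rng(0)
n = 200_000
xp = rng.normal(0.0, 1.0, n)  # samples from P
xq = rng.normal(1.0, 1.0, n)  # samples from Q

def critic(x):
    # Optimal critic log(p/q) for this Gaussian pair (an assumption
    # of the example; in practice the critic is learned).
    return 0.5 - x

dv_estimate = critic(xp).mean() - np.log(np.exp(critic(xq)).mean())
print(round(dv_estimate, 3))  # Monte Carlo estimate, close to 0.5
```

With a learned kernel critic the bound is maximized over the RKHS ball instead of being evaluated at the closed-form optimum, which is what makes the estimator applicable when the densities are unknown.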

Papers citing "Estimating Kullback-Leibler Divergence Using Kernel Machines" (4 papers):

  1. The Conditional Cauchy-Schwarz Divergence with Applications to Time-Series Data and Sequential Decision Making
     Shujian Yu, Hongming Li, Sigurd Løkse, Robert Jenssen, José C. Príncipe
     21 Jan 2023
  2. Reliable Estimation of KL Divergence using a Discriminator in Reproducing Kernel Hilbert Space
     S. Ghimire, A. Masoomi, Jennifer Dy
     29 Sep 2021
  3. Neural Joint Entropy Estimation
     Yuval Shalev, Amichai Painsky, I. Ben-Gal
     21 Dec 2020
  4. Reducing the Variance of Variational Estimates of Mutual Information by Limiting the Critic's Hypothesis Space to RKHS
     P. A. Sreekar, Ujjwal Tiwari, A. Namboodiri
     17 Nov 2020