Homeomorphic-Invariance of EM: Non-Asymptotic Convergence in KL Divergence for Exponential Families via Mirror Descent

Frederik Kunstner, Raunak Kumar, Mark W. Schmidt
arXiv:2011.01170, 2 November 2020

Papers citing "Homeomorphic-Invariance of EM: Non-Asymptotic Convergence in KL Divergence for Exponential Families via Mirror Descent" (3 papers)
  • Federated Expectation Maximization with heterogeneity mitigation and variance reduction. Aymeric Dieuleveut, G. Fort, Eric Moulines, Geneviève Robin. FedML. 03 Nov 2021.
  • The Bayesian Learning Rule. Mohammad Emtiyaz Khan, Håvard Rue. BDL. 09 Jul 2021.
  • Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning. Julien Mairal. 18 Feb 2014.