ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

t³-Variational Autoencoder: Learning Heavy-tailed Data with Student's t and Power Divergence

2 December 2023
Juno Kim
Jaehyuk Kwon
Mincheol Cho
Hyunjong Lee
Joong-Ho Won
Abstract

The variational autoencoder (VAE) typically employs a standard normal prior as a regularizer for the probabilistic latent encoder. However, the Gaussian tail often decays too quickly to effectively accommodate the encoded points, failing to preserve crucial structures hidden in the data. In this paper, we explore the use of heavy-tailed models to combat over-regularization. Drawing upon insights from information geometry, we propose t³VAE, a modified VAE framework that incorporates Student's t-distributions for the prior, encoder, and decoder. This results in a joint model distribution of a power form which we argue can better fit real-world datasets. We derive a new objective by reformulating the evidence lower bound as joint optimization of KL divergence between two statistical manifolds and replacing it with γ-power divergence, a natural alternative for power families. t³VAE demonstrates superior generation of low-density regions when trained on heavy-tailed synthetic data. Furthermore, we show that t³VAE significantly outperforms other models on CelebA and imbalanced CIFAR-100 datasets.
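The key ingredient the abstract describes is replacing Gaussian latent distributions with Student's t-distributions, whose heavier tails accommodate encoded points in low-density regions. As a minimal illustration (not the paper's implementation), the sketch below samples from a location-scale Student's t latent via the classic Gaussian/chi-square representation, using only the standard library; the function name and interface are hypothetical:

```python
import math
import random


def sample_student_t(mu, sigma, nu, n=1, rng=random):
    """Draw n samples from a location-scale Student's t with nu degrees
    of freedom, using t = g / sqrt(chi2_nu / nu) where g ~ N(0, 1) and
    chi2_nu is a sum of nu squared standard normals. Small nu gives
    heavier tails than a Gaussian; nu -> infinity recovers N(mu, sigma^2).
    """
    samples = []
    for _ in range(n):
        g = rng.gauss(0.0, 1.0)
        chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(nu))
        samples.append(mu + sigma * g / math.sqrt(chi2 / nu))
    return samples


# Heavy-tailed latent draws, e.g. nu = 5 as an illustrative choice
random.seed(0)
z = sample_student_t(0.0, 1.0, nu=5, n=1000)
```

A Student's t prior with small ν assigns far more mass to extreme latent codes than a standard normal, which is the property the paper exploits to avoid over-regularizing heavy-tailed data.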
