ResearchTrend.AI
Designing a Conditional Prior Distribution for Flow-Based Generative Models

13 February 2025
Noam Issachar
Mohammad Salama
Raanan Fattal
Sagie Benaim
arXiv · PDF · HTML
Abstract

Flow-based generative models have recently shown impressive performance on conditional generation tasks, such as text-to-image generation. However, current methods transform a general unimodal noise distribution to a specific mode of the target data distribution, so every point in the initial source distribution can be mapped to every point in the target distribution, resulting in long average transport paths. To address this, we tap into an unexploited property of conditional flow-based models: the ability to design a non-trivial prior distribution. Given an input condition, such as a text prompt, we first map it to a point in data space representing an "average" data point, one with minimal average distance to all data points of the same conditional mode (e.g., class). We then use the flow matching formulation to map samples from a parametric distribution centered around this point to the conditional target distribution. Experimentally, our method significantly improves training times and generation quality (FID, KID, and CLIP alignment scores) compared to baselines, producing high-quality samples using fewer sampling steps.
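The core idea can be illustrated with a small NumPy sketch on toy 2-D data (the data, names, and prior width below are illustrative assumptions, not the paper's implementation): each class's "average" point is its mean, the conditional prior is a Gaussian centered on that mean, and samples follow the standard linear flow-matching path toward the data. Comparing average path lengths against a shared standard-normal prior shows why conditioning the prior shortens transport:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D data with two well-separated conditional modes ("classes").
num_per_class, sigma_prior = 256, 0.5
x1_a = rng.standard_normal((num_per_class, 2)) + np.array([5.0, 0.0])
x1_b = rng.standard_normal((num_per_class, 2)) + np.array([-5.0, 0.0])
data = np.concatenate([x1_a, x1_b])
labels = np.concatenate([np.zeros(num_per_class, int), np.ones(num_per_class, int)])

# "Average" data point per condition: the per-class mean minimizes the
# average squared distance to all points of that class.
means = np.stack([data[labels == c].mean(axis=0) for c in range(2)])

# Conditional prior: x0 ~ N(mu_c, sigma^2 I), centered on the class mean.
x0_cond = means[labels] + sigma_prior * rng.standard_normal(data.shape)

# Baseline prior: a single standard normal shared across all conditions.
x0_uncond = rng.standard_normal(data.shape)

# Flow-matching setup: linear path x_t = (1 - t) x0 + t x1 with
# constant velocity regression target v = x1 - x0.
t = rng.uniform(size=(data.shape[0], 1))
xt_cond = (1.0 - t) * x0_cond + t * data
v_cond = data - x0_cond

# Average transport path length under each prior; shorter paths under the
# conditional prior are the motivation stated in the abstract.
len_cond = np.linalg.norm(data - x0_cond, axis=1).mean()
len_uncond = np.linalg.norm(data - x0_uncond, axis=1).mean()
```

On this toy setup `len_cond` is far smaller than `len_uncond`, since conditional prior samples start near their class mode rather than at the origin between modes.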

View on arXiv
@article{issachar2025_2502.09611,
  title={Designing a Conditional Prior Distribution for Flow-Based Generative Models},
  author={Noam Issachar and Mohammad Salama and Raanan Fattal and Sagie Benaim},
  journal={arXiv preprint arXiv:2502.09611},
  year={2025}
}