Understanding the Quality-Diversity Trade-off in Diffusion Language Models

11 March 2025
Zak Buzzard
Communities: DiffM
Abstract

Diffusion models have seen immense success in modelling continuous data across a range of domains such as vision and audio. Despite the challenges of adapting diffusion models to discrete data, recent work explores their application to text generation by working in the continuous embedding space. However, these models lack a natural means to control the inherent trade-off between quality and diversity as afforded by the temperature hyperparameter in autoregressive models, hindering understanding of model performance and restricting generation quality. This work proposes the use of classifier-free guidance and stochastic clamping for manipulating the quality-diversity trade-off on sequence-to-sequence tasks, demonstrating that these techniques may be used to improve the performance of a diffusion language model.
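
The two control knobs named in the abstract, classifier-free guidance and stochastic clamping, can be illustrated with a minimal sketch. This is not the paper's code: the denoiser interface, the guidance scale, and the clamp probability below are assumptions made to illustrate an embedding-space diffusion language model.

import torch

def guided_prediction(denoiser, x_t, t, cond, guidance_scale=2.0):
    # Classifier-free guidance: blend the conditional and unconditional
    # predictions. Scales above 1 trade diversity for quality; 1.0 recovers
    # the plain conditional model. (Illustrative signature, not the paper's.)
    pred_cond = denoiser(x_t, t, cond)    # prediction given the source sequence
    pred_uncond = denoiser(x_t, t, None)  # prediction with conditioning dropped
    return pred_uncond + guidance_scale * (pred_cond - pred_uncond)

def stochastic_clamp(x0_hat, vocab_emb, clamp_prob=0.5):
    # Stochastic clamping: snap each predicted embedding to its nearest
    # vocabulary embedding with probability clamp_prob. Higher values push
    # generations toward valid token embeddings (quality); lower values keep
    # more diversity. (Assumed formulation, for illustration only.)
    batch, seq, dim = x0_hat.shape
    flat = x0_hat.reshape(-1, dim)                             # (batch*seq, dim)
    nearest_idx = torch.cdist(flat, vocab_emb).argmin(dim=-1)  # nearest vocab entry
    nearest = vocab_emb[nearest_idx].reshape(batch, seq, dim)
    mask = torch.rand(batch, seq, 1, device=x0_hat.device) < clamp_prob
    return torch.where(mask, nearest, x0_hat)

In this reading, both knobs play the role that temperature plays for autoregressive models: a single scalar that moves the sampler along the quality-diversity curve.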

View on arXiv
@article{buzzard2025_2503.10683,
  title={Understanding the Quality-Diversity Trade-off in Diffusion Language Models},
  author={Zak Buzzard},
  journal={arXiv preprint arXiv:2503.10683},
  year={2025}
}