Constrained Language Generation with Discrete Diffusion Models

12 March 2025
Michael Cardei
Jacob K Christopher
Thomas Hartvigsen
Brian R. Bartoldson
Bhavya Kailkhura
Ferdinando Fioretto
Abstract

Constraints are critical in text generation, as LLM outputs are often unreliable when it comes to ensuring that generated text adheres to user-defined instructions or general safety guidelines. To address this gap, we present Constrained Discrete Diffusion (CDD), a novel method for enforcing constraints on natural language by integrating discrete diffusion models with differentiable optimization. Unlike conventional text generators, which often rely on post-hoc filtering or model retraining for controllable generation, we impose constraints directly within the discrete diffusion sampling process. We illustrate how this technique can be applied to satisfy a variety of natural language constraints, including (i) toxicity mitigation, by preventing harmful content from emerging; (ii) character- and sequence-level lexical constraints; and (iii) novel molecule sequence generation with adherence to specific properties. Experimental results show that our constraint-aware procedure achieves high fidelity in meeting these requirements while preserving fluency and semantic coherence, outperforming autoregressive and existing discrete diffusion approaches.
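To make the sampling-time guidance concrete, below is a minimal sketch of constraint-guided discrete diffusion decoding. It is an illustration under stated assumptions, not the paper's implementation: the model interface model(x, t), the step count, and the banned-token lexical constraint are all hypothetical. What it demonstrates is the general idea named in the abstract: compute a differentiable penalty on the relaxed (softmax) token distribution at each reverse-diffusion step and nudge the logits with its gradient before sampling, so constraints shape generation itself rather than being applied as a post-hoc filter.

import torch
import torch.nn.functional as F

def constrained_sample(model, banned_ids, seq_len, vocab_size,
                       num_steps=50, step_size=1.0, device="cpu"):
    # Start from a fully "noised" sequence (here: uniform-random tokens).
    x = torch.randint(vocab_size, (1, seq_len), device=device)
    for t in reversed(range(num_steps)):
        # Hypothetical denoiser interface: per-position token logits.
        logits = model(x, t)                      # (1, seq_len, vocab_size)
        logits = logits.detach().requires_grad_(True)

        # Toy differentiable lexical constraint: total probability mass
        # the relaxed distribution places on a banned-token list.
        probs = F.softmax(logits, dim=-1)
        penalty = probs[..., banned_ids].sum()
        penalty.backward()

        # Gradient step on the logits to reduce constraint violation,
        # then sample the next (less noisy) sequence from the result.
        guided = logits - step_size * logits.grad
        x = torch.distributions.Categorical(logits=guided).sample()
    return x

In this reading, the gradient step stands in for the differentiable-optimization component; swapping the toy banned-token penalty for another differentiable score (e.g., a toxicity classifier or a molecular-property predictor) would correspond to the other constraint families listed above, though the paper's exact formulation may differ.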

@article{cardei2025_2503.09790,
  title={Constrained Language Generation with Discrete Diffusion Models},
  author={Michael Cardei and Jacob K Christopher and Thomas Hartvigsen and Brian R. Bartoldson and Bhavya Kailkhura and Ferdinando Fioretto},
  journal={arXiv preprint arXiv:2503.09790},
  year={2025}
}