Waste Not, Want Not; Recycled Gumbel Noise Improves Consistency in Natural Language Generation
Consistency in the output of language models is critical for their reliability and practical utility. Due to their training objective, language models learn to model the full space of possible continuations, leading to outputs that can vary significantly in style and content, even for similar or repeated inputs. To address this, we propose a novel decoding algorithm that enhances response consistency across different prompts with no degradation in response quality. By incorporating a latent variable into the next-token sampling process based on the Gumbel reparametrisation trick, our method outperforms standard sampling by up to 10% across semantic and stylistic consistency benchmarks. Additionally, our approach integrates seamlessly with existing sampling methods with negligible computational overhead, providing a practical solution for improving the reliability of language model outputs.
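The abstract does not detail the algorithm, but the core idea it names is the Gumbel-max trick: sampling a token as the argmax of logits plus Gumbel(0, 1) noise is an exact draw from the softmax distribution. A minimal illustrative sketch, assuming the latent variable is a fixed Gumbel noise vector "recycled" across generations (all names, values, and the noise-sharing scheme here are assumptions, not the paper's implementation), might look like:

import numpy as np

def gumbel_max_sample(logits, gumbel_noise):
    # Gumbel-max trick: argmax(logits + g), with g ~ Gumbel(0, 1),
    # is an exact sample from softmax(logits).
    return int(np.argmax(logits + gumbel_noise))

rng = np.random.default_rng(seed=0)         # fixed seed acts as the latent variable
vocab_size = 8
shared_noise = rng.gumbel(size=vocab_size)  # one noise vector, reused across prompts

# Two prompts with similar next-token distributions: because they share
# the same Gumbel noise, their sampled tokens tend to agree, which is the
# intuition behind the consistency gain.
logits_a = np.array([2.0, 1.5, 0.3, 0.1, -1.0, -1.2, -2.0, -3.0])
logits_b = logits_a + rng.normal(scale=0.1, size=vocab_size)  # slightly perturbed prompt

print(gumbel_max_sample(logits_a, shared_noise))
print(gumbel_max_sample(logits_b, shared_noise))

Because each sample taken in isolation still follows the model's softmax distribution exactly, such a scheme is compatible with standard sampling and adds essentially no computational overhead, consistent with the abstract's claims.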
@article{mijolla2025_2503.00831,
  title   = {Waste Not, Want Not; Recycled Gumbel Noise Improves Consistency in Natural Language Generation},
  author  = {Damien de Mijolla and Hannan Saddiq and Kim Moore},
  journal = {arXiv preprint arXiv:2503.00831},
  year    = {2025}
}