New Intent Discovery with Pre-training and Contrastive Learning

25 May 2022
Yuwei Zhang
Haode Zhang
Li-Ming Zhan
Albert Y.S. Lam
Xiao-Ming Wu
Communities: SSL, VLM
Abstract

New intent discovery aims to uncover novel intent categories from user utterances to expand the set of supported intent classes. It is a critical task for the development and service expansion of a practical dialogue system. Despite its importance, this problem remains under-explored in the literature. Existing approaches typically rely on a large number of labeled utterances and employ pseudo-labeling methods for representation learning and clustering, which are label-intensive, inefficient, and inaccurate. In this paper, we provide new solutions to two important research questions for new intent discovery: (1) how to learn semantic utterance representations and (2) how to better cluster utterances. In particular, we first propose a multi-task pre-training strategy to leverage rich unlabeled data along with external labeled data for representation learning. Then, we design a new contrastive loss to exploit self-supervisory signals in unlabeled data for clustering. Extensive experiments on three intent recognition benchmarks demonstrate the high effectiveness of our proposed method, which outperforms state-of-the-art methods by a large margin in both unsupervised and semi-supervised scenarios. The source code will be available at this https URL.
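The two components named in the abstract can be illustrated with short sketches. First, a minimal sketch of a multi-task pre-training objective, assuming it pairs supervised intent classification on external labeled utterances with masked language modeling (MLM) on unlabeled in-domain utterances; the function name, the tensor layout, and the weighting term `lam` are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn.functional as F

def multitask_pretrain_loss(cls_logits: torch.Tensor,  # (B, num_intents) from external labeled data
                            cls_labels: torch.Tensor,  # (B,) gold intent labels
                            mlm_logits: torch.Tensor,  # (B, T, vocab) from unlabeled in-domain data
                            mlm_labels: torch.Tensor,  # (B, T); -100 marks unmasked positions
                            lam: float = 1.0) -> torch.Tensor:
    # Supervised cross-entropy on external labeled utterances.
    ce = F.cross_entropy(cls_logits, cls_labels)
    # MLM cross-entropy on unlabeled utterances; unmasked tokens are ignored.
    mlm = F.cross_entropy(mlm_logits.view(-1, mlm_logits.size(-1)),
                          mlm_labels.view(-1), ignore_index=-100)
    return ce + lam * mlm
```

Second, a sketch of an instance-level contrastive loss in the standard NT-Xent style over two augmented views of each utterance. The paper designs its own contrastive loss around self-supervisory signals in the unlabeled data, so this shows only the general pattern; the temperature value and the way the two views are produced are assumptions.

```python
def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor,
                     temperature: float = 0.07) -> torch.Tensor:
    """z1, z2: (batch, dim) embeddings of two views of the same utterances."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)        # (2n, dim)
    sim = z @ z.t() / temperature         # pairwise cosine similarities
    sim.fill_diagonal_(float('-inf'))     # an example is never its own positive
    # The positive for row i is its other view: i+n for the first half, i-n after.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)
```

Calling `contrastive_loss(encoder(x_aug1), encoder(x_aug2))` on two stochastic encodings of the same batch pulls each utterance's views together and pushes other utterances apart, the clustering-friendly geometry the abstract aims for.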

@article{zhang2022_2205.12914,
  title={New Intent Discovery with Pre-training and Contrastive Learning},
  author={Yuwei Zhang and Haode Zhang and Li-Ming Zhan and Albert Y.S. Lam and Xiao-Ming Wu},
  journal={arXiv preprint arXiv:2205.12914},
  year={2022}
}