
DEUCE: Dual-diversity Enhancement and Uncertainty-awareness for Cold-start Active Learning

Abstract

Cold-start active learning (CSAL) selects valuable instances from an unlabeled dataset for manual annotation, providing high-quality data at a low annotation cost for label-scarce text classification. However, existing CSAL methods overlook weak classes and hard representative examples, resulting in biased learning. To address these issues, this paper proposes a novel dual-diversity enhancing and uncertainty-aware (DEUCE) framework for CSAL. Specifically, DEUCE leverages a pretrained language model (PLM) to efficiently extract textual representations, class predictions, and predictive uncertainty. It then constructs a Dual-Neighbor Graph (DNG) that combines information on both textual diversity and class diversity, ensuring a balanced data distribution. It further propagates uncertainty information via density-based clustering to select hard representative instances. By combining dual-diversity and informativeness, DEUCE selects class-balanced and hard representative data. Experiments on six NLP datasets demonstrate the superiority and efficiency of DEUCE.

@article{guo2025_2502.00305,
  title={DEUCE: Dual-diversity Enhancement and Uncertainty-awareness for Cold-start Active Learning},
  author={Jiaxin Guo and C. L. Philip Chen and Shuzhen Li and Tong Zhang},
  journal={arXiv preprint arXiv:2502.00305},
  year={2025}
}