The Efficiency of Pre-training with Objective Masking in Pseudo Labeling for Semi-Supervised Text Classification

10 May 2025
Arezoo Hatefi
Xuan-Son Vu
Monowar Bhuyan
Frank Drewes
Abstract

We extend and study a semi-supervised model for text classification, proposed earlier by Hatefi et al., for classification tasks in which document classes are described by a small number of gold-labeled examples while the majority of training examples are unlabeled. The model leverages the teacher-student architecture of Meta Pseudo Labels, in which a "teacher" generates labels for originally unlabeled training data to train the "student" and iteratively updates its own model based on the student's performance on the gold-labeled portion of the data. We extend the original model of Hatefi et al. with an unsupervised pre-training phase based on objective masking and conduct in-depth performance evaluations of the original model, our extension, and various independent baselines. Experiments are performed on three different datasets in two languages (English and Swedish).
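
The abstract describes two training phases: unsupervised pre-training with a masking objective, followed by Meta Pseudo Labels-style teacher-student training on a small gold-labeled set plus a larger unlabeled set. The sketch below is a minimal, self-contained PyTorch illustration of both phases on dummy data; it is not the authors' implementation. The toy encoder, the data shapes, and the first-order "feedback" stand-in for the MPL meta-gradient are all illustrative assumptions.

```python
# Illustrative sketch only (not the paper's code): masked pre-training
# followed by a simplified Meta Pseudo Labels teacher-student loop.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM, CLASSES, MASK_ID = 1000, 64, 4, 0  # assumed toy sizes

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)
        self.rnn = nn.GRU(DIM, DIM, batch_first=True)
    def forward(self, x):
        h, _ = self.rnn(self.emb(x))
        return h  # (batch, seq, DIM)

class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = Encoder()
        self.head = nn.Linear(DIM, CLASSES)
    def forward(self, x):
        return self.head(self.enc(x).mean(dim=1))

def pretrain_with_masking(model, unlabeled, steps=100, mask_prob=0.15):
    """Unsupervised pre-training: mask random tokens and train the encoder
    to reconstruct them (a masked-language-modelling style objective)."""
    lm_head = nn.Linear(DIM, VOCAB)
    opt = torch.optim.Adam(list(model.enc.parameters()) + list(lm_head.parameters()), lr=1e-3)
    for _ in range(steps):
        x = unlabeled[torch.randint(len(unlabeled), (8,))]
        mask = torch.rand(x.shape) < mask_prob
        corrupted = x.masked_fill(mask, MASK_ID)
        logits = lm_head(model.enc(corrupted))
        loss = F.cross_entropy(logits[mask], x[mask])
        opt.zero_grad(); loss.backward(); opt.step()

def meta_pseudo_labels(teacher, student, labeled_x, labeled_y, unlabeled, steps=100):
    """The teacher labels unlabeled batches for the student; the teacher is then
    nudged by how well the updated student does on the gold-labeled data.
    Here that feedback is a scalar weight on the teacher's supervised loss,
    a first-order simplification of the actual MPL meta-gradient."""
    t_opt = torch.optim.Adam(teacher.parameters(), lr=1e-3)
    s_opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    for _ in range(steps):
        u = unlabeled[torch.randint(len(unlabeled), (8,))]
        pseudo = teacher(u).argmax(dim=-1).detach()
        # 1) student step on teacher-generated pseudo labels
        s_loss = F.cross_entropy(student(u), pseudo)
        s_opt.zero_grad(); s_loss.backward(); s_opt.step()
        # 2) teacher step, scaled by the student's post-update loss on gold labels
        with torch.no_grad():
            feedback = F.cross_entropy(student(labeled_x), labeled_y).item()
        t_loss = feedback * F.cross_entropy(teacher(labeled_x), labeled_y)
        t_opt.zero_grad(); t_loss.backward(); t_opt.step()

if __name__ == "__main__":
    unlabeled = torch.randint(1, VOCAB, (256, 16))
    labeled_x = torch.randint(1, VOCAB, (32, 16))
    labeled_y = torch.randint(0, CLASSES, (32,))
    teacher, student = Classifier(), Classifier()
    pretrain_with_masking(student, unlabeled)   # the extension studied in the paper
    meta_pseudo_labels(teacher, student, labeled_x, labeled_y, unlabeled)
```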

@article{hatefi2025_2505.06624,
  title={The Efficiency of Pre-training with Objective Masking in Pseudo Labeling for Semi-Supervised Text Classification},
  author={Arezoo Hatefi and Xuan-Son Vu and Monowar Bhuyan and Frank Drewes},
  journal={arXiv preprint arXiv:2505.06624},
  year={2025}
}