
CSS-LM: A Contrastive Framework for Semi-supervised Fine-tuning of Pre-trained Language Models


IEEE/ACM Transactions on Audio Speech and Language Processing (TASLP), 2021
7 February 2021
Yusheng Su
Xu Han
Yankai Lin
Zhengyan Zhang
Zhiyuan Liu
Peng Li
Jie Zhou
Maosong Sun

Papers citing "CSS-LM: A Contrastive Framework for Semi-supervised Fine-tuning of Pre-trained Language Models"

5 papers shown
Span-level Emotion-Cause-Category Triplet Extraction with Instruction Tuning LLMs and Data Augmentation
Applied Soft Computing (ASC), 2025
Xuelong Li
Dong Yang
Xiaogang Zhu
Faliang Huang
Peng Zhang
Zhongying Zhao
13 Apr 2025
Slot Induction via Pre-trained Language Model Probing and Multi-level Contrastive Learning
SIGDIAL Conferences (SIGDIAL), 2023
Hoang Nguyen
Chenwei Zhang
Ye Liu
Philip S. Yu
09 Aug 2023
Vesper: A Compact and Effective Pretrained Model for Speech Emotion Recognition
IEEE Transactions on Affective Computing (IEEE Trans. Affective Comput.), 2023
Weidong Chen
Xiaofen Xing
Peihao Chen
Xiangmin Xu
20 Jul 2023
Transfer-Free Data-Efficient Multilingual Slot Labeling
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023
E. Razumovskaia
Ivan Vulić
Anna Korhonen
22 May 2023
A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives
ACM Computing Surveys (CSUR), 2021
Nils Rethmeier
Isabelle Augenstein
25 Feb 2021