Pairwise Supervised Contrastive Learning of Sentence Representations

12 September 2021
Dejiao Zhang
Shang-Wen Li
Wei Xiao
Henghui Zhu
Ramesh Nallapati
Andrew O. Arnold
Bing Xiang
Abstract

Many recent successes in sentence representation learning have been achieved by simply fine-tuning on the Natural Language Inference (NLI) datasets with triplet loss or siamese loss. Nevertheless, they share a common weakness: sentences in a contradiction pair are not necessarily from different semantic categories. Therefore, optimizing the semantic entailment and contradiction reasoning objective alone is inadequate to capture the high-level semantic structure. The drawback is compounded by the fact that the vanilla siamese or triplet losses only learn from individual sentence pairs or triplets, which often suffer from bad local optima. In this paper, we propose PairSupCon, an instance discrimination based approach aiming to bridge semantic entailment and contradiction understanding with high-level categorical concept encoding. We evaluate PairSupCon on various downstream tasks that involve understanding sentence semantics at different granularities. We outperform the previous state-of-the-art method with 10%--13% averaged improvement on eight clustering tasks, and 5%--6% averaged improvement on seven semantic textual similarity (STS) tasks.
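To make the instance-discrimination idea concrete, the sketch below shows a generic InfoNCE-style contrastive loss over NLI entailment pairs, where each premise treats its paired hypothesis as the positive and all other in-batch sentences as negatives. This is an illustrative assumption, not the authors' exact PairSupCon objective; the function name, temperature value, and encoder inputs are placeholders.

```python
# Illustrative sketch of an in-batch instance-discrimination loss over
# aligned NLI entailment pairs. Not the paper's exact formulation.
import torch
import torch.nn.functional as F

def pairwise_contrastive_loss(premise_emb, hypothesis_emb, temperature=0.05):
    """premise_emb, hypothesis_emb: (batch, dim) encoder outputs, where
    row i of each tensor comes from the same entailment pair."""
    # L2-normalize so dot products are cosine similarities
    z1 = F.normalize(premise_emb, dim=-1)
    z2 = F.normalize(hypothesis_emb, dim=-1)
    # Similarity of every premise to every hypothesis in the batch
    logits = z1 @ z2.t() / temperature            # (batch, batch)
    # The matching hypothesis (the diagonal) is the positive for each premise;
    # every other hypothesis in the batch serves as a negative.
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)

# Usage with random tensors standing in for sentence-encoder outputs
batch, dim = 8, 768
loss = pairwise_contrastive_loss(torch.randn(batch, dim), torch.randn(batch, dim))
```

In the paper's setting, contradiction hypotheses additionally act as hard negatives that push semantically conflicting sentences apart; the sketch above only shows the baseline in-batch negative sampling.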
