Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models

20 May 2020 · arXiv:2005.10389
Dan Iter, Kelvin Guu, L. Lansing, Dan Jurafsky

Papers citing "Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models"

Showing 5 of 55 citing papers:

CoDA: Contrast-enhanced and Diversity-promoting Data Augmentation for Natural Language Understanding
Yanru Qu, Dinghan Shen, Yelong Shen, Sandra Sajeev, Jiawei Han, Weizhu Chen
16 Oct 2020

Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers
Shusheng Xu, Xingxing Zhang, Yi Wu, Furu Wei, Ming Zhou
16 Oct 2020

Enhancing Dialogue Generation via Multi-Level Contrastive Learning
Xin Li, Piji Li, Yan Wang, Xiaojiang Liu, Wai Lam
19 Sep 2020

On Learning Universal Representations Across Languages
Xiangpeng Wei, Rongxiang Weng, Yue Hu, Luxi Xing, Heng Yu, Weihua Luo
31 Jul 2020 · SSL, VLM

Stay Hungry, Stay Focused: Generating Informative and Specific Questions in Information-Seeking Conversations
Peng Qi, Yuhao Zhang, Christopher D. Manning
30 Apr 2020