Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models
Dan Iter, Kelvin Guu, L. Lansing, Dan Jurafsky
20 May 2020 · arXiv:2005.10389
Papers citing "Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models"
5 / 55 papers shown
CoDA: Contrast-enhanced and Diversity-promoting Data Augmentation for Natural Language Understanding
Yanru Qu, Dinghan Shen, Yelong Shen, Sandra Sajeev, Jiawei Han, Weizhu Chen
16 Oct 2020

Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers
Shusheng Xu, Xingxing Zhang, Yi Wu, Furu Wei, Ming Zhou
16 Oct 2020

Enhancing Dialogue Generation via Multi-Level Contrastive Learning
Xin Li, Piji Li, Yan Wang, Xiaojiang Liu, Wai Lam
19 Sep 2020

On Learning Universal Representations Across Languages
Xiangpeng Wei, Rongxiang Weng, Yue Hu, Luxi Xing, Heng Yu, Weihua Luo
Topics: SSL, VLM
31 Jul 2020

Stay Hungry, Stay Focused: Generating Informative and Specific Questions in Information-Seeking Conversations
Peng Qi, Yuhao Zhang, Christopher D. Manning
30 Apr 2020