Unified Language Model Pre-training for Natural Language Understanding and Generation
arXiv:1905.03197 · 8 May 2019 · ELM, AI4CE
Li Dong, Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, Ming Zhou, Hsiao-Wuen Hon

Papers citing "Unified Language Model Pre-training for Natural Language Understanding and Generation" (50 of 845 shown)

Benchmarking Robustness of Machine Reading Comprehension Models
Chenglei Si, Ziqing Yang, Yiming Cui, Wentao Ma, Ting Liu, Shijin Wang · 29 Apr 2020 · ELM, AAML

Data Augmentation for Spoken Language Understanding via Pretrained Language Models
Baolin Peng, Chenguang Zhu, Michael Zeng, Jianfeng Gao · 29 Apr 2020

VD-BERT: A Unified Vision and Dialog Transformer with BERT
Yue Wang, Shafiq R. Joty, Michael R. Lyu, Irwin King, Caiming Xiong, S. Hoi · 28 Apr 2020

MATINF: A Jointly Labeled Large-Scale Dataset for Classification, Question Answering and Summarization
Canwen Xu, Jiaxin Pei, Hongtao Wu, Yiyu Liu, Chenliang Li · 26 Apr 2020 · MLLM, VLM

Probabilistically Masked Language Model Capable of Autoregressive Generation in Arbitrary Word Order
Yi-Lun Liao, Xin Jiang, Qun Liu · 24 Apr 2020

Self-Attention Attribution: Interpreting Information Interactions Inside Transformer
Y. Hao, Li Dong, Furu Wei, Ke Xu · 23 Apr 2020 · ViT

QURIOUS: Question Generation Pretraining for Text Generation
Shashi Narayan, Gonçalo Simões, Ji Ma, Hannah Craighead, Ryan T. McDonald · 23 Apr 2020

VisualCOMET: Reasoning about the Dynamic Context of a Still Image
J. S. Park, Chandra Bhagavatula, Roozbeh Mottaghi, Ali Farhadi, Yejin Choi · 22 Apr 2020 · ReLM, LRM

MPNet: Masked and Permuted Pre-training for Language Understanding
Kaitao Song, Xu Tan, Tao Qin, Jianfeng Lu, Tie-Yan Liu · 20 Apr 2020

Adversarial Training for Large Neural Language Models
Xiaodong Liu, Hao Cheng, Pengcheng He, Weizhu Chen, Yu Wang, Hoifung Poon, Jianfeng Gao · 20 Apr 2020 · AAML

Syn-QG: Syntactic and Shallow Semantic Rules for Question Generation
Kaustubh D. Dhole, Christopher D. Manning · 18 Apr 2020

TOD-BERT: Pre-trained Natural Language Understanding for Task-Oriented Dialogue
Chien-Sheng Wu, S. Hoi, R. Socher, Caiming Xiong · 15 Apr 2020

Pre-training Text Representations as Meta Learning
Shangwen Lv, Yuechen Wang, Daya Guo, Duyu Tang, Nan Duan, ..., Ryan Ma, Daxin Jiang, Guihong Cao, Ming Zhou, Songlin Hu · 12 Apr 2020 · AIMat, SSL, AI4CE

Improving Readability for Automatic Speech Recognition Transcription
Junwei Liao, Sefik Emre Eskimez, Liyang Lu, Yu Shi, Ming Gong, Linjun Shou, Hong Qu, Michael Zeng · 9 Apr 2020

Exploring Versatile Generative Language Model Via Parameter-Efficient Transfer Learning
Zhaojiang Lin, Andrea Madotto, Pascale Fung · 8 Apr 2020

Improving the Robustness of QA Models to Challenge Sets with Variational Question-Answer Pair Generation
Kazutoshi Shinoda, Saku Sugawara, Akiko Aizawa · 7 Apr 2020 · OOD

Deep Learning Based Text Classification: A Comprehensive Review
Shervin Minaee, Nal Kalchbrenner, Erik Cambria, Narjes Nikzad, M. Asgari-Chenaghlu, Jianfeng Gao · 6 Apr 2020 · AILaw, VLM, AI4TS

Optimus: Organizing Sentences via Pre-trained Modeling of a Latent Space
Chunyuan Li, Xiang Gao, Yuan Li, Baolin Peng, Xiujun Li, Yizhe Zhang, Jianfeng Gao · 5 Apr 2020 · SSL, DRL

Learning a Simple and Effective Model for Multi-turn Response Generation with Auxiliary Tasks
Yufan Zhao, Can Xu, Wei Yu Wu, Lei Yu · 4 Apr 2020

Conversational Question Reformulation via Sequence-to-Sequence Architectures and Pretrained Language Models
Sheng-Chieh Lin, Jheng-Hong Yang, Rodrigo Nogueira, Ming-Feng Tsai, Chuan-Ju Wang, Jimmy J. Lin · 4 Apr 2020

CG-BERT: Conditional Text Generation with BERT for Generalized Few-shot Intent Detection
Congying Xia, Chenwei Zhang, Hoang Nguyen, Jiawei Zhang, Philip Yu · 4 Apr 2020

Pre-training for Abstractive Document Summarization by Reinstating Source Text
Yanyan Zou, Xingxing Zhang, Wei Lu, Furu Wei, Ming Zhou · 4 Apr 2020

XGLUE: A New Benchmark Dataset for Cross-lingual Pre-training, Understanding and Generation
Yaobo Liang, Nan Duan, Yeyun Gong, Ning Wu, Fenfei Guo, ..., Shuguang Liu, Fan Yang, Daniel Fernando Campos, Rangan Majumder, Ming Zhou · 3 Apr 2020 · ELM, VLM

Code Prediction by Feeding Trees to Transformers
Seohyun Kim, Jinman Zhao, Yuchi Tian, S. Chandra · 30 Mar 2020

Abstractive Summarization with Combination of Pre-trained Sequence-to-Sequence and Saliency Models
Itsumi Saito, Kyosuke Nishida, Kosuke Nishida, J. Tomita · 29 Mar 2020

Enhancing Factual Consistency of Abstractive Summarization
Chenguang Zhu, William Fu-Hinthorn, Ruochen Xu, Qingkai Zeng, Michael Zeng, Xuedong Huang, Meng-Long Jiang · 19 Mar 2020 · HILM, KELM

Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang · 18 Mar 2020 · LM&MA, VLM

A Survey on Contextual Embeddings
Qi Liu, Matt J. Kusner, Phil Blunsom · 16 Mar 2020

Document Ranking with a Pretrained Sequence-to-Sequence Model
Rodrigo Nogueira, Zhiying Jiang, Jimmy J. Lin · 14 Mar 2020

An Empirical Investigation of Pre-Trained Transformer Language Models for Open-Domain Dialogue Generation
Piji Li · 9 Mar 2020

XGPT: Cross-modal Generative Pre-Training for Image Captioning
Qiaolin Xia, Haoyang Huang, Nan Duan, Dongdong Zhang, Lei Ji, Zhifang Sui, Edward Cui, Taroon Bharti, Xin Liu, Ming Zhou · 3 Mar 2020 · MLLM, VLM

UniLMv2: Pseudo-Masked Language Models for Unified Language Model Pre-Training
Hangbo Bao, Li Dong, Furu Wei, Wenhui Wang, Nan Yang, ..., Yu Wang, Songhao Piao, Jianfeng Gao, Ming Zhou, Hsiao-Wuen Hon · 28 Feb 2020 · AI4CE

MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers
Wenhui Wang, Furu Wei, Li Dong, Hangbo Bao, Nan Yang, Ming Zhou · 25 Feb 2020 · VLM

What BERT Sees: Cross-Modal Transfer for Visual Question Generation
Thomas Scialom, Patrick Bordes, Paul-Alexis Dray, Jacopo Staiano, Patrick Gallinari · 25 Feb 2020

Discriminative Adversarial Search for Abstractive Summarization
Thomas Scialom, Paul-Alexis Dray, Sylvain Lamprier, Benjamin Piwowarski, Jacopo Staiano · 24 Feb 2020

Improving BERT Fine-Tuning via Self-Ensemble and Self-Distillation
Yige Xu, Xipeng Qiu, L. Zhou, Xuanjing Huang · 24 Feb 2020

Fill in the BLANC: Human-free quality estimation of document summaries
Oleg V. Vasilyev, Vedant Dharnidharka, John Bohannon · 23 Feb 2020 · 3DH

ScopeIt: Scoping Task Relevant Sentences in Documents
Vishwas Suryanarayanan, Barun Patra, P. Bhattacharya, C. Fufa, Charles Lee · 23 Feb 2020

Training Question Answering Models From Synthetic Data
Raul Puri, Ryan Spring, M. Patwary, M. Shoeybi, Bryan Catanzaro · 22 Feb 2020 · ELM

The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding
Xiaodong Liu, Yu Wang, Jianshu Ji, Hao Cheng, Xueyun Zhu, ..., Pengcheng He, Weizhu Chen, Hoifung Poon, Guihong Cao, Jianfeng Gao · 19 Feb 2020 · AI4CE

UniVL: A Unified Video and Language Pre-Training Model for Multimodal Understanding and Generation
Huaishao Luo, Lei Ji, Botian Shi, Haoyang Huang, Nan Duan, Tianrui Li, Jason Li, Xilin Chen, Ming Zhou · 15 Feb 2020 · VLM

Deep Learning for Source Code Modeling and Generation: Models, Applications and Challenges
T. H. Le, Hao Chen, Muhammad Ali Babar · 13 Feb 2020 · VLM

BERT-of-Theseus: Compressing BERT by Progressive Module Replacing
Canwen Xu, Wangchunshu Zhou, Tao Ge, Furu Wei, Ming Zhou · 7 Feb 2020

Aligning the Pretraining and Finetuning Objectives of Language Models
Nuo Wang Pierse, Jing Lu · 5 Feb 2020 · AI4CE

ERNIE-GEN: An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation
Dongling Xiao, Han Zhang, Yukun Li, Yu Sun, Hao Tian, Hua-Hong Wu, Haifeng Wang · 26 Jan 2020

Multilingual Denoising Pre-training for Neural Machine Translation
Yinhan Liu, Jiatao Gu, Naman Goyal, Xian Li, Sergey Edunov, Marjan Ghazvininejad, M. Lewis, Luke Zettlemoyer · 22 Jan 2020 · AI4CE, AIMat

Length-controllable Abstractive Summarization by Guiding with Summary Prototype
Itsumi Saito, Kyosuke Nishida, Kosuke Nishida, Atsushi Otsuka, Hisako Asano, J. Tomita, Hiroyuki Shindo, Yuji Matsumoto · 21 Jan 2020

A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
Jian-Yu Guan, Fei Huang, Zhihao Zhao, Xiaoyan Zhu, Minlie Huang · 15 Jan 2020 · LRM, SyDa

ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training
Weizhen Qi, Yu Yan, Yeyun Gong, Dayiheng Liu, Nan Duan, Jiusheng Chen, Ruofei Zhang, Ming Zhou · 13 Jan 2020 · AI4TS

TED: A Pretrained Unsupervised Summarization Model with Theme Modeling and Denoising
Ziyi Yang, Chenguang Zhu, R. Gmyr, Michael Zeng, Xuedong Huang, Eric Darve · 3 Jan 2020