CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation
arXiv: 2109.05729 · 13 September 2021
Yunfan Shao, Zhichao Geng, Yitao Liu, Junqi Dai, Hang Yan, Fei Yang, Li Zhe, Hujun Bao, Xipeng Qiu
Papers citing "CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation" (2 papers)
Making Pre-trained Language Models Better Few-shot Learners
Tianyu Gao, Adam Fisch, Danqi Chen · 31 Dec 2020
Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang · 18 Mar 2020