
CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation
arXiv:2109.05729

13 September 2021
Yunfan Shao, Zhichao Geng, Yitao Liu, Junqi Dai, Hang Yan, Fei Yang, Li Zhe, Hujun Bao, Xipeng Qiu
MedIm

Papers citing "CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation"

3 / 3 papers shown
CSED: A Chinese Semantic Error Diagnosis Corpus
Bo Sun, Baoxin Wang, Yixuan Wang, Wanxiang Che, Dayong Wu, Shijin Wang, Ting Liu
09 May 2023
Making Pre-trained Language Models Better Few-shot Learners
Tianyu Gao, Adam Fisch, Danqi Chen
31 Dec 2020
Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
LM&MA, VLM
18 Mar 2020