PERT: Pre-training BERT with Permuted Language Model
arXiv 2203.06906, 14 March 2022
Yiming Cui, Ziqing Yang, Ting Liu
Papers citing "PERT: Pre-training BERT with Permuted Language Model" (7 of 7 papers shown)
Language Models at the Syntax-Semantics Interface: A Case Study of the Long-Distance Binding of Chinese Reflexive ziji
Xiulin Yang
02 Apr 2025
MERBench: A Unified Evaluation Benchmark for Multimodal Emotion Recognition
Zheng Lian, Licai Sun, Yong Ren, Hao Gu, Haiyang Sun, Lan Chen, Bin Liu, Jianhua Tao
07 Jan 2024
Knowing-how & Knowing-that: A New Task for Machine Comprehension of User Manuals
Hongru Liang, Jia-Wei Liu, Weihong Du, Dingnan Jin, Wenqiang Lei, Zujie Wen, Jiancheng Lv
07 Jun 2023
LERT: A Linguistically-motivated Pre-trained Language Model
Yiming Cui, Wanxiang Che, Shijin Wang, Ting Liu
10 Nov 2022
Knowing Where and What: Unified Word Block Pretraining for Document Understanding
Song Tao, Zijian Wang, Tiantian Fan, Canjie Luo, Can Huang
28 Jul 2022
Position Prediction as an Effective Pretraining Strategy
Shuangfei Zhai, Navdeep Jaitly, Jason Ramapuram, Dan Busbridge, Tatiana Likhomanenko, Joseph Y. Cheng, Walter A. Talbott, Chen Huang, Hanlin Goh, J. Susskind
15 Jul 2022
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
26 Sep 2016