
ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding

arXiv:2010.12148 · 23 October 2020
Dongling Xiao, Yu-Kun Li, Han Zhang, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang
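The title names the paper's core technique: masking contiguous n-grams as whole units during pre-training, rather than masking tokens independently. Since this page carries no implementation details, the following is only a minimal illustrative sketch of n-gram span masking, not the authors' method; the 15% masking budget, the 1-3 span-length range, and all function names are assumptions.

    # Illustrative n-gram masking sketch (not the authors' released code):
    # contiguous n-grams are masked as whole units instead of as independent
    # tokens. The 15% budget and 1-3 span lengths are common MLM conventions
    # assumed here for demonstration.
    import random

    MASK = "[MASK]"

    def ngram_mask(tokens, mask_ratio=0.15, max_n=3, seed=0):
        """Mask whole n-gram spans; return (masked tokens, {position: original})."""
        rng = random.Random(seed)
        budget = max(1, int(len(tokens) * mask_ratio))
        masked, targets = list(tokens), {}
        attempts = 0
        while len(targets) < budget and attempts < 100:
            attempts += 1
            n = rng.randint(1, max_n)                    # sample an n-gram length
            start = rng.randrange(max(1, len(tokens) - n + 1))
            span = range(start, min(start + n, len(tokens)))
            if any(i in targets for i in span):          # keep spans non-overlapping
                continue
            for i in span:                               # mask the n-gram as one unit
                targets[i] = tokens[i]
                masked[i] = MASK
        return masked, targets

    if __name__ == "__main__":
        toks = "explicit n gram masking treats spans as single prediction units".split()
        masked, targets = ngram_mask(toks)
        print(" ".join(masked))
        print(targets)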

Papers citing "ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding" (3 of 3 shown):
Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling
Peijie Jiang, Dingkun Long, Yanzhao Zhang, Pengjun Xie, Meishan Zhang, M. Zhang
27 Oct 2022 · SSL
Analysing the Effect of Masking Length Distribution of MLM: An Evaluation Framework and Case Study on Chinese MRC Datasets
Changchang Zeng, Shaobo Li
29 Sep 2021
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018 · ELM