ResearchTrend.AI
Pre-training Universal Language Representation
30 May 2021
Yian Li, Hai Zhao
arXiv:2105.14478

Papers citing "Pre-training Universal Language Representation"

All 4 citing papers shown:

- "Language Model Pre-training on True Negatives". Zhuosheng Zhang, Hai Zhao, Masao Utiyama, Eiichiro Sumita. 01 Dec 2022. 2 citations.
- "Learning Better Masking for Better Language Model Pre-training". Dongjie Yang, Zhuosheng Zhang, Hai Zhao. 23 Aug 2022. 15 citations.
- "GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding". Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman. 20 Apr 2018. 6,943 citations.
- "Efficient Estimation of Word Representations in Vector Space". Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean. 16 Jan 2013. 31,150 citations.