arXiv:2205.10687
Revisiting Pre-trained Language Models and their Evaluation for Arabic Natural Language Understanding
21 May 2022
Abbas Ghaddar, Yimeng Wu, Sunyam Bagga, Ahmad Rashid, Khalil Bibi, Mehdi Rezagholizadeh, Chao Xing, Yasheng Wang, Duan Xinyu, Zhefeng Wang, Baoxing Huai, Xin Jiang, Qun Liu, Philippe Langlais
Papers citing "Revisiting Pre-trained Language Models and their Evaluation for Arabic Natural Language Understanding" (5 papers)
AraT5: Text-to-Text Transformers for Arabic Language Generation
El Moatez Billah Nagoudi, AbdelRahim Elmadany, Muhammad Abdul-Mageed
31 Aug 2021
ARBERT & MARBERT: Deep Bidirectional Transformers for Arabic
Muhammad Abdul-Mageed, AbdelRahim Elmadany, El Moatez Billah Nagoudi
27 Dec 2020
CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters
Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Hiroshi Noji, Pierre Zweigenbaum, Junichi Tsujii
20 Oct 2020
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
26 Sep 2016