Self-Training Pre-Trained Language Models for Zero- and Few-Shot Multi-Dialectal Arabic Sequence Labeling
Muhammad Khalifa, Muhammad Abdul-Mageed, Khaled Shaalan
arXiv:2101.04758, 12 January 2021

Papers citing "Self-Training Pre-Trained Language Models for Zero- and Few-Shot Multi-Dialectal Arabic Sequence Labeling" (4 papers):

UM6P-CS at SemEval-2022 Task 11: Enhancing Multilingual and Code-Mixed Complex Named Entity Recognition via Pseudo Labels using Multilingual Transformer
Abdellah El Mekki, Abdelkader El Mahdaouy, Mohammed Akallouch, Ismail Berrada, A. Khoumsi
28 Apr 2022

Morphosyntactic Tagging with Pre-trained Language Models for Arabic and its Dialects
Go Inoue, Salam Khalifa, Nizar Habash
13 Oct 2021

ARBERT & MARBERT: Deep Bidirectional Transformers for Arabic
Muhammad Abdul-Mageed, AbdelRahim Elmadany, El Moatez Billah Nagoudi
27 Dec 2020

Efficient Estimation of Word Representations in Vector Space
Tomas Mikolov, Kai Chen, G. Corrado, J. Dean
16 Jan 2013