ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Self-Training Pre-Trained Language Models for Zero- and Few-Shot Multi-Dialectal Arabic Sequence Labeling

12 January 2021
Muhammad Khalifa, Muhammad Abdul-Mageed, Khaled Shaalan

Papers citing "Self-Training Pre-Trained Language Models for Zero- and Few-Shot Multi-Dialectal Arabic Sequence Labeling"

4 papers shown
UM6P-CS at SemEval-2022 Task 11: Enhancing Multilingual and Code-Mixed Complex Named Entity Recognition via Pseudo Labels using Multilingual Transformer
Abdellah El Mekki, Abdelkader El Mahdaouy, Mohammed Akallouch, Ismail Berrada, A. Khoumsi
28 Apr 2022
Morphosyntactic Tagging with Pre-trained Language Models for Arabic and its Dialects
Go Inoue, Salam Khalifa, Nizar Habash
13 Oct 2021
ARBERT & MARBERT: Deep Bidirectional Transformers for Arabic
Muhammad Abdul-Mageed, AbdelRahim Elmadany, El Moatez Billah Nagoudi
27 Dec 2020
Efficient Estimation of Word Representations in Vector Space
Tomas Mikolov, Kai Chen, G. Corrado, J. Dean
16 Jan 2013