arXiv: 2004.06499

What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models
14 April 2020
Wietse de Vries
Andreas van Cranenburgh
Malvina Nissim

Papers citing "What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models" (9 papers):

Natural Language Processing RELIES on Linguistics
Juri Opitz
Shira Wein
Nathan Schneider
09 May 2024

Understanding Domain Learning in Language Models Through Subpopulation Analysis
Zheng Zhao
Yftah Ziser
Shay B. Cohen
22 Oct 2022

A context-aware knowledge transferring strategy for CTC-based ASR
Keda Lu
Kuan-Yu Chen
12 Oct 2022

Feature Aggregation in Zero-Shot Cross-Lingual Transfer Using Multilingual BERT
Beiduo Chen
Wu Guo
Quan Liu
Kun Tao
17 May 2022

Mono vs Multilingual BERT for Hate Speech Detection and Text Classification: A Case Study in Marathi
Abhishek Velankar
H. Patil
Raviraj Joshi
19 Apr 2022

Multi-Level Contrastive Learning for Cross-Lingual Alignment
Beiduo Chen
Wu Guo
Bin Gu
Quan Liu
Yongchao Wang
26 Feb 2022

Not All Models Localize Linguistic Knowledge in the Same Place: A Layer-wise Probing on BERToids' Representations
Mohsen Fayyaz
Ehsan Aghazadeh
Ali Modarressi
Hosein Mohebbi
Mohammad Taher Pilehvar
13 Sep 2021

A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni
Gowtham Ramesh
Mitesh M. Khapra
Anoop Kunchukuttan
Pratyush Kumar
01 Jul 2021

What the [MASK]? Making Sense of Language-Specific BERT Models
Debora Nozza
Federico Bianchi
Dirk Hovy
05 Mar 2020