Distilling the Knowledge of Romanian BERTs Using Multiple Teachers
arXiv: 2112.12650
23 December 2021
Andrei-Marius Avram
Darius Catrina
Dumitru-Clementin Cercel
Mihai Dascălu
Traian Rebedea
Vasile Păiș
Dan Tufiș
Papers citing "Distilling the Knowledge of Romanian BERTs Using Multiple Teachers"
Catch Me if You Search: When Contextual Web Search Results Affect the Detection of Hallucinations
Mahjabin Nahar, Eun-Ju Lee, Jin Won Park, Dongwon Lee
01 Apr 2025
RoMemes: A Multimodal Meme Corpus for the Romanian Language
Vasile Păiș, Sara Niță, Alexandru-Iulius Jerpelea, Luca Pană, Eric Curea
20 Oct 2024
Big Bird: Transformers for Longer Sequences
Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Joshua Ainslie, Chris Alberti, ..., Philip Pham, Anirudh Ravula, Qifan Wang, Li Yang, Amr Ahmed
28 Jul 2020
Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT
Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer
12 Sep 2019
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018