arXiv:2004.04010
Analyzing Redundancy in Pretrained Transformer Models
8 April 2020
Fahim Dalvi, Hassan Sajjad, Nadir Durrani, Yonatan Belinkov
Papers citing "Analyzing Redundancy in Pretrained Transformer Models" (3 papers)
Interpreting the Effects of Quantization on LLMs
Manpreet Singh, Hassan Sajjad
22 Aug 2025
Silence is Sweeter Than Speech: Self-Supervised Model Using Silence to Store Speaker Information
Chiyu Feng, Po-Chun Hsu, Hung-yi Lee
08 May 2022
The Parallel Meaning Bank: A Framework for Semantically Annotating Multiple Languages
Lasha Abzianidze, Rik van Noord, Chunliu Wang, Johan Bos
29 Dec 2020