Pruning Redundant Mappings in Transformer Models via Spectral-Normalized Identity Prior (arXiv:2010.01791)
5 October 2020
Zi Lin, Jeremiah Zhe Liu, Ziao Yang, Nan Hua, Dan Roth
Papers citing "Pruning Redundant Mappings in Transformer Models via Spectral-Normalized Identity Prior" (5 of 5 papers shown)
Accurate Retraining-free Pruning for Pretrained Encoder-based Language Models
Seungcheol Park, Ho-Jin Choi, U. Kang (07 Aug 2023)
Learning a Consensus Sub-Network with Polarization Regularization and One Pass Training
Xiaoying Zhi, Varun Babbar, P. Sun, Fran Silavong, Ruibo Shi, Sean J. Moran (17 Feb 2023)
From Dense to Sparse: Contrastive Pruning for Better Pre-trained Language Model Compression
Runxin Xu, Fuli Luo, Chengyu Wang, Baobao Chang, Jun Huang, Songfang Huang, Fei Huang (14 Dec 2021)
Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT
Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer (12 Sep 2019)
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman (20 Apr 2018)