arXiv: 2110.04711
SuperShaper: Task-Agnostic Super Pre-training of BERT Models with Variable Hidden Dimensions
10 October 2021
Vinod Ganesan, Gowtham Ramesh, Pratyush Kumar
Papers citing "SuperShaper: Task-Agnostic Super Pre-training of BERT Models with Variable Hidden Dimensions" (7 of 7 papers shown):
1. An investigation on the use of Large Language Models for hyperparameter tuning in Evolutionary Algorithms
   Leonardo Lucio Custode, Fabio Caraffini, Anil Yaman, Giovanni Iacca
   05 Aug 2024 · 35 · 2 · 0

2. A Comprehensive Analysis of Adapter Efficiency
   Nandini Mundra, Sumanth Doddapaneni, Raj Dabre, Anoop Kunchukuttan, Ratish Puduppully, Mitesh M. Khapra
   12 May 2023 · 18 · 10 · 0

3. The Lottery Ticket Hypothesis for Pre-trained BERT Networks
   Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin
   23 Jul 2020 · 148 · 345 · 0

4. BERT-of-Theseus: Compressing BERT by Progressive Module Replacing
   Canwen Xu, Wangchunshu Zhou, Tao Ge, Furu Wei, Ming Zhou
   07 Feb 2020 · 221 · 197 · 0

5. Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT
   Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer
   MQ · 12 Sep 2019 · 225 · 574 · 0

6. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   ELM · 20 Apr 2018 · 294 · 6,950 · 0

7. Neural Architecture Search with Reinforcement Learning
   Barret Zoph, Quoc V. Le
   05 Nov 2016 · 264 · 5,326 · 0