Data-Free Distillation of Language Model by Text-to-Text Transfer
arXiv: 2311.01689
3 November 2023
Authors: Zheyuan Bai, Xinduo Liu, Hailin Hu, Tianyu Guo, Qinghua Zhang, Yunhe Wang
Papers citing "Data-Free Distillation of Language Model by Text-to-Text Transfer" (3 papers):

1. A Survey on Transformer Compression
   Yehui Tang, Yunhe Wang, Jianyuan Guo, Zhijun Tu, Kai Han, Hailin Hu, Dacheng Tao
   05 Feb 2024

2. Training language models to follow instructions with human feedback
   Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
   04 Mar 2022

3. The Power of Scale for Parameter-Efficient Prompt Tuning
   Brian Lester, Rami Al-Rfou, Noah Constant
   18 Apr 2021