RPTQ: Reorder-based Post-training Quantization for Large Language Models
arXiv: 2304.01089 · 3 April 2023
Zhihang Yuan, Lin Niu, Jia-Wen Liu, Wenyu Liu, Xinggang Wang, Yuzhang Shang, Guangyu Sun, Qiang Wu, Jiaxiang Wu, Bingzhe Wu
Tags: MQ
Papers citing "RPTQ: Reorder-based Post-training Quantization for Large Language Models" (3 of 53 shown)
FlexGen: High-Throughput Generative Inference of Large Language Models with a Single GPU
Ying Sheng, Lianmin Zheng, Binhang Yuan, Zhuohan Li, Max Ryabinin, ..., Joseph E. Gonzalez, Percy Liang, Christopher Ré, Ion Stoica, Ce Zhang
13 Mar 2023
GLM-130B: An Open Bilingual Pre-trained Model
Aohan Zeng, Xiao Liu, Zhengxiao Du, Zihan Wang, Hanyu Lai, ..., Jidong Zhai, Wenguang Chen, Peng-Zhen Zhang, Yuxiao Dong, Jie Tang
Tags: BDL, LRM
05 Oct 2022
PTQ-SL: Exploring the Sub-layerwise Post-training Quantization
Zhihang Yuan, Yiqi Chen, Chenhao Xue, Chenguang Zhang, Qiankun Wang, Guangyu Sun
Tags: MQ
15 Oct 2021