CLAQ: Pushing the Limits of Low-Bit Post-Training Quantization for LLMs
arXiv:2405.17233 (v2, latest) · 27 May 2024
Haoyu Wang, Bei Liu, Hang Shao, Bo Xiao, Ke Zeng, Guanglu Wan, Yanmin Qian
MQ
ArXiv (abs) · PDF · HTML
Papers citing "CLAQ: Pushing the Limits of Low-Bit Post-Training Quantization for LLMs" (2 of 2 papers shown)
BTC-LLM: Efficient Sub-1-Bit LLM Quantization via Learnable Transformation and Binary Codebook
Hao Gu, Lujun Li, Zheyu Wang, B. Liu, Qiyuan Zhu, Sirui Han, Wenhan Luo
MQ · 24 May 2025
Yi: Open Foundation Models by 01.AI
01.AI: Alex Young, Bei Chen, Chao Li, ..., Yue Wang, Yuxuan Cai, Zhenyu Gu, Zhiyuan Liu, Zonghong Dai
OSLM · LRM · 07 Mar 2024