CLAQ: Pushing the Limits of Low-Bit Post-Training Quantization for LLMs

27 May 2024
Haoyu Wang, Bei Liu, Hang Shao, Bo Xiao, Ke Zeng, Guanglu Wan, Yanmin Qian
MQ
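As context for the title: post-training quantization (PTQ) compresses a trained model's weights to low-bit integers without retraining. The sketch below is a generic round-to-nearest PTQ baseline in NumPy, purely illustrative of the general idea; it is not CLAQ's actual algorithm, and the function names are hypothetical.

import numpy as np

def quantize_rtn(W, bits=4):
    # Symmetric, per-output-channel round-to-nearest quantization.
    # A standard PTQ baseline, not CLAQ's method.
    qmax = 2 ** (bits - 1) - 1                            # e.g. 7 for signed 4-bit
    scale = np.abs(W).max(axis=1, keepdims=True) / qmax   # one scale per row
    Q = np.clip(np.round(W / scale), -qmax - 1, qmax)     # integer codes
    return Q.astype(np.int8), scale

def dequantize(Q, scale):
    # Reconstruct approximate float weights from codes and scales.
    return Q.astype(np.float32) * scale

W = np.random.randn(4, 8).astype(np.float32)
Q, s = quantize_rtn(W, bits=4)
print("max abs error:", np.abs(W - dequantize(Q, s)).max())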

Papers citing "CLAQ: Pushing the Limits of Low-Bit Post-Training Quantization for LLMs"

2 papers shown
BTC-LLM: Efficient Sub-1-Bit LLM Quantization via Learnable Transformation and Binary Codebook
Hao Gu, Lujun Li, Zheyu Wang, B. Liu, Qiyuan Zhu, Sirui Han, Wenhan Luo
MQ
24 May 2025
Yi: Open Foundation Models by 01.AI
01.AI, Alex Young, Bei Chen, Chao Li, ..., Yue Wang, Yuxuan Cai, Zhenyu Gu, Zhiyuan Liu, Zonghong Dai
OSLM, LRM
07 Mar 2024