RoLoRA: Fine-tuning Rotated Outlier-free LLMs for Effective Weight-Activation Quantization
arXiv:2407.08044 · 10 July 2024
Xijie Huang, Zechun Liu, Shih-yang Liu, Kwang-Ting Cheng

Papers citing "RoLoRA: Fine-tuning Rotated Outlier-free LLMs for Effective Weight-Activation Quantization" (3 papers shown)

Fast and Low-Cost Genomic Foundation Models via Outlier Removal
Haozheng Luo, Chenghao Qiu, Maojiang Su, Zhihan Zhou, Zoe Mehta, Guo Ye, Jerry Yao-Chieh Hu, Han Liu
01 May 2025

EoRA: Training-free Compensation for Compressed LLM with Eigenspace Low-Rank Approximation
Shih-yang Liu, Huck Yang, Nai Chit Fung, Hongxu Yin, ..., Jan Kautz, Yu-Chun Wang, Pavlo Molchanov, Min-Hung Chen
28 Oct 2024

SliceGPT: Compress Large Language Models by Deleting Rows and Columns
Saleh Ashkboos, Maximilian L. Croci, Marcelo Gennari do Nascimento, Torsten Hoefler, James Hensman
26 Jan 2024