arXiv: 2410.16454
Catastrophic Failure of LLM Unlearning via Quantization
21 October 2024
Zhiwei Zhang
Fali Wang
Xiaomin Li
Zongyu Wu
Xianfeng Tang
Hui Liu
Qi He
Wenpeng Yin
Suhang Wang
Tags: MU
Links: ArXiv · PDF · HTML
Papers citing
"Catastrophic Failure of LLM Unlearning via Quantization"
1 / 1 papers shown
Model Tampering Attacks Enable More Rigorous Evaluations of LLM Capabilities
Zora Che
Stephen Casper
Robert Kirk
Anirudh Satheesh
Stewart Slocum
...
Zikui Cai
Bilal Chughtai
Y. Gal
Furong Huang
Dylan Hadfield-Menell
Tags: MU · AAML · ELM
03 Feb 2025