arXiv: 2307.05972
Self-Distilled Quantization: Achieving High Compression Rates in Transformer-Based Language Models
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
12 July 2023
James O'Neill
Sourav Dutta
Papers citing "Self-Distilled Quantization: Achieving High Compression Rates in Transformer-Based Language Models"
No citing papers found.