arXiv:2106.08181 (v2, latest)
Direction is what you need: Improving Word Embedding Compression in Large Language Models
15 June 2021
Klaudia Bałazy
Mohammadreza Banaei
R. Lebret
Jacek Tabor
Karl Aberer
Papers citing "Direction is what you need: Improving Word Embedding Compression in Large Language Models" (3 papers)
Semantic Compression for Word and Sentence Embeddings using Discrete Wavelet Transform
Rana Aref Salama, Abdou Youssef, Mona Diab
31 Jul 2025
TensorSLM: Energy-efficient Embedding Compression of Sub-billion Parameter Language Models on Low-end Devices
Mingxue Xu, Y. Xu, Danilo Mandic
16 Jun 2025
Revisiting Offline Compression: Going Beyond Factorization-based Methods for Transformer Language Models
Findings, 2023
Mohammadreza Banaei, Klaudia Bałazy, Artur Kasymov, R. Lebret, Jacek Tabor, Karl Aberer
08 Feb 2023