LORD: Low Rank Decomposition Of Monolingual Code LLMs For One-Shot Compression

arXiv:2309.14021 · 25 September 2023

Ayush Kaushal, Tejas Vaidhya, Irina Rish

Papers citing "LORD: Low Rank Decomposition Of Monolingual Code LLMs For One-Shot Compression" (2 papers)
ZeroQuant-V2: Exploring Post-training Quantization in LLMs from Comprehensive Study to Low Rank Compensation
Z. Yao, Xiaoxia Wu, Cheng-rong Li, Stephen Youn, Yuxiong He
15 Mar 2023
Pruning and Quantization for Deep Neural Network Acceleration: A Survey
Tailin Liang, C. Glossner, Lei Wang, Shaobo Shi, Xiaotong Zhang
24 Jan 2021