Reclaiming Residual Knowledge: A Novel Paradigm to Low-Bit Quantization

Róisín Luo, Alexandru Drimbarean, Walsh Simon, Colm O'Riordan
arXiv:2408.00923 · 1 August 2024 · MQ

Papers citing "Reclaiming Residual Knowledge: A Novel Paradigm to Low-Bit Quantization"

3 of 3 papers shown
Convolution Meets LoRA: Parameter Efficient Finetuning for Segment Anything Model
Zihan Zhong, Zhiqiang Tang, Tong He, Haoyang Fang, Chun Yuan
31 Jan 2024
Decoupled Dynamic Filter Networks
Jingkai Zhou, Varun Jampani, Zhixiong Pi, Qiong Liu, Ming-Hsuan Yang
29 Apr 2021
Pruning and Quantization for Deep Neural Network Acceleration: A Survey
Tailin Liang, C. Glossner, Lei Wang, Shaobo Shi, Xiaotong Zhang
MQ · 24 Jan 2021