Neuron Merging: Compensating for Pruned Neurons

Woojeong Kim, Suhyun Kim, Mincheol Park, Geonseok Jeon
25 October 2020 · arXiv: 2010.13160
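
The sketch below illustrates, for readers browsing this citation list, the core idea suggested by the paper's title: when a hidden neuron is pruned, its contribution is merged into a similar remaining neuron rather than simply discarded, so the layer's output is compensated without retraining. It is a minimal NumPy illustration written from that general description; the two-layer ReLU setup, the cosine-similarity matching, and the scaling rule are assumptions made for the example, not the authors' exact algorithm.

```python
# Minimal neuron-merging sketch (assumptions: ReLU MLP layer pair, cosine-similarity
# matching, norm-ratio scaling). Not the paper's exact method.
import numpy as np

def merge_pruned_neuron(W1, b1, W2, prune_idx):
    """Prune hidden neuron `prune_idx` of a ReLU layer pair and compensate by
    folding its outgoing weights into the most similar remaining neuron.

    W1: (hidden, in) incoming weights, b1: (hidden,) biases, W2: (out, hidden).
    """
    w_p = W1[prune_idx]                       # incoming weights of the pruned neuron
    keep = [i for i in range(W1.shape[0]) if i != prune_idx]

    # Most similar remaining neuron by cosine similarity of incoming weights.
    sims = [w_p @ W1[i] / (np.linalg.norm(w_p) * np.linalg.norm(W1[i]) + 1e-12)
            for i in keep]
    q = keep[int(np.argmax(sims))]

    # ReLU is positively homogeneous: if w_p ≈ s * w_q (and biases match up to s),
    # the pruned neuron's activation is ≈ s times neuron q's, so its outgoing
    # weights can be routed through neuron q.
    s = np.linalg.norm(w_p) / (np.linalg.norm(W1[q]) + 1e-12)
    W2_new = W2[:, keep].copy()
    W2_new[:, keep.index(q)] += s * W2[:, prune_idx]

    return W1[keep], b1[keep], W2_new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(4, 3)); b1 = np.zeros(4); W2 = rng.normal(size=(2, 4))
    W1[3] = 0.5 * W1[1]                       # neuron 3 is a scaled copy of neuron 1
    x = rng.normal(size=(5, 3))
    full = np.maximum(x @ W1.T + b1, 0) @ W2.T
    W1s, b1s, W2s = merge_pruned_neuron(W1, b1, W2, prune_idx=3)
    merged = np.maximum(x @ W1s.T + b1s, 0) @ W2s.T
    # Biases are zero here, so compensation is exact: max deviation is ~0.
    print(np.max(np.abs(full - merged)))
```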

Papers citing "Neuron Merging: Compensating for Pruned Neurons"

6 papers shown

NOLA: Compressing LoRA using Linear Combination of Random Basis
Soroush Abbasi Koohpayegani, K. Navaneet, Parsa Nooralinejad, Soheil Kolouri, Hamed Pirsiavash
04 Oct 2023

Accurate Retraining-free Pruning for Pretrained Encoder-based Language Models
Seungcheol Park, Ho-Jin Choi, U. Kang
07 Aug 2023 · VLM

Data-Free Quantization via Mixed-Precision Compensation without Fine-Tuning
Jun Chen, Shipeng Bai, Tianxin Huang, Mengmeng Wang, Guanzhong Tian, Y. Liu
02 Jul 2023 · MQ

Sharp asymptotics on the compression of two-layer neural networks
Mohammad Hossein Amani, Simone Bombari, Marco Mondelli, Rattana Pukdee, Stefano Rini
17 May 2022 · MLT

Automatic Neural Network Pruning that Efficiently Preserves the Model Accuracy
Thibault Castells, Seul-Ki Yeom
18 Nov 2021 · 3DV

RED++: Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging
Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kévin Bailly
30 Sep 2021