Redistribution of Weights and Activations for AdderNet Quantization

20 December 2022
Ying Nie, Kai Han, Haikang Diao, Chuanjian Liu, Enhua Wu, Yunhe Wang
MQ
Papers citing "Redistribution of Weights and Activations for AdderNet Quantization" (4 papers)
Post-Training Sparsity-Aware Quantization
Gil Shomron, F. Gabbay, Samer Kurzum, U. Weiser
MQ · 23 May 2021
GhostSR: Learning Ghost Features for Efficient Image Super-Resolution
Ying Nie, Kai Han, Zhenhua Liu, Chunjing Xu, Yunhe Wang
OOD · 21 Jan 2021
ShiftAddNet: A Hardware-Inspired Deep Network
Haoran You, Xiaohan Chen, Yongan Zhang, Chaojian Li, Sicheng Li, Zihao Liu, Zhangyang Wang, Yingyan Lin
OOD, MQ · 24 Oct 2020
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
3DH · 17 Apr 2017