A Main/Subsidiary Network Framework for Simplifying Binary Neural Network

11 December 2018
Yinghao Xu, Xin Dong, Yudian Li, Hao Su

Papers citing "A Main/Subsidiary Network Framework for Simplifying Binary Neural Network"

5 / 5 papers shown
Optimizing data-flow in Binary Neural Networks
Lorenzo Vorabbi, Davide Maltoni, Stefano Santi
MQ
03 Apr 2023
CompConv: A Compact Convolution Module for Efficient Feature Learning
Chen Zhang, Yinghao Xu, Yujun Shen
VLM, SSL
19 Jun 2021
A Winning Hand: Compressing Deep Networks Can Improve Out-Of-Distribution Robustness
James Diffenderfer, Brian Bartoldson, Shreya Chaganti, Jize Zhang, B. Kailkhura
OOD
16 Jun 2021
Binary Neural Networks: A Survey
Haotong Qin, Ruihao Gong, Xianglong Liu, Xiao Bai, Jingkuan Song, N. Sebe
MQ
31 Mar 2020
Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
MQ
10 Feb 2017