On the Universal Approximability and Complexity Bounds of Quantized ReLU Neural Networks
arXiv:1802.03646, 10 February 2018
Yukun Ding, Jinglan Liu, Jinjun Xiong, Yiyu Shi
Community: MQ
Papers citing "On the Universal Approximability and Complexity Bounds of Quantized ReLU Neural Networks" (6 papers shown):
- "Seeking Interpretability and Explainability in Binary Activated Neural Networks", Benjamin Leblanc, Pascal Germain (FAtt), 07 Sep 2022
- "Why Quantization Improves Generalization: NTK of Binary Weight Neural Networks", Kaiqi Zhang, Ming Yin, Yu-Xiang Wang (MQ), 13 Jun 2022
- "Dynamic Group Convolution for Accelerating Convolutional Neural Networks", Z. Su, Linpu Fang, Wenxiong Kang, D. Hu, M. Pietikäinen, Li Liu, 08 Jul 2020
- "The universal approximation power of finite-width deep ReLU networks", Dmytro Perekrestenko, Philipp Grohs, Dennis Elbrächter, Helmut Bölcskei, 05 Jun 2018
- "PBGen: Partial Binarization of Deconvolution-Based Generators for Edge Intelligence", Jinglan Liu, Jiaxin Zhang, Yukun Ding, Xiaowei Xu, Meng-Long Jiang, Yiyu Shi, 26 Feb 2018
- "Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights", Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen (MQ), 10 Feb 2017