Adaptive Precision Training: Quantify Back Propagation in Neural Networks with Fixed-point Numbers
Xishan Zhang
Shaoli Liu
Rui Zhang
Chang-Shu Liu
Di Huang
Shiyi Zhou
Jiaming Guo
Yu Kang
Qi Guo
Zidong Du
Yunji Chen
Abstract
Recently emerged quantization techniques have been applied to the inference of deep neural networks for fast and efficient execution. However, directly applying quantization during training can cause significant accuracy loss, so it remains an open challenge.
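To make the quantization idea concrete, here is a minimal sketch (not the paper's method) of rounding a tensor to a signed fixed-point grid; the function name, bit widths, and NumPy implementation are illustrative assumptions:

```python
import numpy as np

def quantize_fixed_point(x, word_bits=8, frac_bits=4):
    """Illustrative fixed-point quantization (not the paper's scheme):
    round x to a grid with `frac_bits` fractional bits, clamping to the
    representable range of signed `word_bits`-bit words."""
    scale = 2.0 ** frac_bits
    qmin = -(2 ** (word_bits - 1))        # most negative integer code
    qmax = 2 ** (word_bits - 1) - 1       # most positive integer code
    q = np.clip(np.round(x * scale), qmin, qmax)
    return q / scale                       # dequantized value

x = np.array([0.11, -1.37, 3.9, 20.0])
print(quantize_fixed_point(x))  # -> [ 0.125  -1.375   3.875   7.9375]
```

Note how 20.0 saturates at the top of the 8-bit range (7.9375); such clipping and rounding error is exactly the kind of noise that makes naive quantized back-propagation lose accuracy.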