L2BN: Enhancing Batch Normalization by Equalizing the Norms of Features
In this paper, we show that differences in the norms of sample features can hinder batch normalization from producing more discriminative inter-class features and more compact intra-class features. To address this issue, we propose an intuitive but effective method to equalize the norms of sample features. Concretely, we L2-normalize each sample feature before batch normalization, so that all features have the same magnitude. Since the proposed method combines L2 normalization with batch normalization, we name it L2BN. L2BN strengthens the compactness of intra-class features and enlarges the discrepancy between inter-class features. It is easy to implement and takes effect without any additional parameters or hyper-parameters, so it can serve as a basic normalization method for neural networks. We evaluate the effectiveness of L2BN through extensive experiments with various models on image classification and acoustic scene classification tasks. The experimental results demonstrate that L2BN can boost the generalization ability of various neural network models and achieve considerable performance improvements.
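The core operation described in the abstract — L2-normalizing each sample feature and then applying batch normalization — can be sketched as follows. This is a minimal illustration in plain Python, not the authors' implementation; the function names and the training-mode batch normalization with unit scale and zero shift are assumptions for clarity.

```python
import math

def l2_normalize(x, eps=1e-12):
    """Scale each sample (row) to unit L2 norm, equalizing feature magnitudes."""
    out = []
    for row in x:
        norm = math.sqrt(sum(v * v for v in row)) + eps
        out.append([v / norm for v in row])
    return out

def batch_norm(x, eps=1e-5):
    """Training-mode batch normalization per feature (column),
    with gamma = 1 and beta = 0 for simplicity."""
    n, d = len(x), len(x[0])
    means = [sum(row[j] for row in x) / n for j in range(d)]
    vars_ = [sum((row[j] - means[j]) ** 2 for row in x) / n for j in range(d)]
    return [
        [(row[j] - means[j]) / math.sqrt(vars_[j] + eps) for j in range(d)]
        for row in x
    ]

def l2bn(x):
    # Hypothetical sketch of the described pipeline:
    # L2-normalize each sample first, then batch-normalize across the batch.
    return batch_norm(l2_normalize(x))

# Samples with very different norms: after L2 normalization they are
# all unit-length, so batch statistics are no longer dominated by
# large-norm samples.
batch = [[3.0, 4.0], [6.0, 8.0], [1.0, 0.0]]
out = l2bn(batch)
```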