
$L_2$BN: Enhancing Batch Normalization by Equalizing the $L_2$ Norms of Features

Abstract

In this paper, we show that differences in the $l_2$ norms of sample features can hinder batch normalization from producing more distinguishable inter-class features and more compact intra-class features. To address this issue, we propose an intuitive but effective method to equalize the $l_2$ norms of sample features. Concretely, we $l_2$-normalize each sample feature before batch normalization, so that all features have the same magnitude. Since the proposed method combines $l_2$ normalization with batch normalization, we name it $L_2$BN. The $L_2$BN strengthens the compactness of intra-class features and enlarges the discrepancy between inter-class features. It is easy to implement and takes effect without any additional parameters or hyper-parameters, so it can serve as a basic normalization method for neural networks. We evaluate the effectiveness of $L_2$BN through extensive experiments with various models on image classification and acoustic scene classification tasks. The experimental results demonstrate that $L_2$BN boosts the generalization ability of various neural network models and achieves considerable performance improvements.
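As an illustration of the idea described in the abstract (not the authors' reference implementation), the following PyTorch-style sketch applies a per-sample $l_2$ normalization to the feature vector and then a standard batch normalization layer. The class name L2BN, the use of BatchNorm1d, and the eps value are assumptions made here for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class L2BN(nn.Module):
    """Sketch of the L2BN idea: l2-normalize each sample's feature vector,
    then apply standard batch normalization (names/details are assumptions)."""

    def __init__(self, num_features: int, eps: float = 1e-5):
        super().__init__()
        self.bn = nn.BatchNorm1d(num_features, eps=eps)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_features); flatten spatial dimensions beforehand if present.
        x = F.normalize(x, p=2, dim=1)  # equalize the l2 norm of every sample feature
        return self.bn(x)               # batch normalization on the equalized features


# Example usage on a toy batch of 8 samples with 128-dimensional features.
if __name__ == "__main__":
    layer = L2BN(num_features=128)
    features = torch.randn(8, 128)
    out = layer(features)
    print(out.shape)  # torch.Size([8, 128])
```

Because the normalization is applied per sample, it only rescales feature vectors and adds no learnable parameters beyond those already present in the batch normalization layer, which is consistent with the abstract's claim of no additional parameters or hyper-parameters.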
