
LogAvgExp Provides a Principled and Performant Global Pooling Operator

Abstract

We seek to improve the pooling operation in neural networks, by applying a more theoretically justified operator. We demonstrate that LogSumExp provides a natural OR operator for logits. When one corrects for the number of elements inside the pooling operator, this becomes $\text{LogAvgExp} := \log(\text{mean}(\exp(x)))$. By introducing a single temperature parameter, LogAvgExp smoothly transitions from the max of its operands to the mean (found at the limiting cases $t \to 0^+$ and $t \to +\infty$). We experimentally tested LogAvgExp, both with and without a learnable temperature parameter, in a variety of deep neural network architectures for computer vision.
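
A minimal sketch of the pooling operator as the abstract describes it, assuming the common temperature parameterization $t \cdot \log(\text{mean}(\exp(x/t)))$ and a PyTorch-style implementation; the function name `log_avg_exp` and the usage example are illustrative, not the authors' released code.

```python
import math
import torch

def log_avg_exp(x, dim, t=1.0, keepdim=False):
    """LogAvgExp pooling: t * log(mean(exp(x / t))) over `dim`.

    As t -> 0+ this approaches max(x) over `dim`; as t -> +inf it
    approaches mean(x), matching the limits stated in the abstract.
    """
    n = x.shape[dim]
    # logsumexp gives numerical stability; subtracting log(n) converts
    # the sum inside the log into a mean.
    return t * (torch.logsumexp(x / t, dim=dim, keepdim=keepdim) - math.log(n))

# Example (hypothetical): global pooling over the spatial axis of a
# (batch, channels, positions) feature tensor.
x = torch.randn(8, 64, 49)
pooled = log_avg_exp(x, dim=-1, t=0.1)  # small t -> behaves like max pooling
print(pooled.shape)                     # torch.Size([8, 64])
```

A learnable temperature, as tested in the paper, could be obtained by making `t` an `nn.Parameter` and constraining it to stay positive, though the exact parameterization used by the authors is not given in the abstract.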
