ResearchTrend.AI

arXiv:1811.03436
Alpha-Integration Pooling for Convolutional Neural Networks

8 November 2018
O. Elbagalati
Mustafa Hajij
Abstract

Convolutional neural networks (CNNs) have achieved remarkable performance in many applications, especially in image recognition tasks. As a crucial component of CNNs, sub-sampling plays an important role in efficient training and in providing invariance, and max-pooling and arithmetic average-pooling are the most commonly used sub-sampling methods. Beyond these two, however, many other pooling types are possible, such as the geometric average and the harmonic average. Since it is not easy for algorithms to find the best pooling method, the pooling type is usually fixed a priori, which may not be optimal for a given task. In line with the deep learning philosophy, the type of pooling can instead be learned from data. In this paper, we propose α-integration pooling (αI-pooling), which has a trainable parameter α that determines the type of pooling. αI-pooling is a general pooling method that includes max-pooling and arithmetic average-pooling as special cases, depending on the parameter α. Experiments show that αI-pooling outperforms other pooling methods, including max-pooling, on image recognition tasks. It also turns out that each layer has a different optimal pooling type.
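The abstract does not give the exact formula, but α-integration in the sense of Amari defines an α-mean that interpolates between the common averages: arithmetic (α = -1), geometric (α = 1), harmonic (α = 3), and max (α → -∞). A minimal sketch of such a pooling operation over one window of positive activations (e.g. post-ReLU values shifted by a small epsilon), assuming this standard α-mean parameterization rather than the paper's exact one, might look like:

```python
import numpy as np

def alpha_pool(x, alpha):
    """Alpha-mean of positive values x (one pooling window).

    Uses Amari's alpha-representation f(x) = x^((1 - alpha) / 2)
    for alpha != 1 and f(x) = log(x) for alpha == 1, returning
    f^{-1}(mean(f(x))). Special cases:
      alpha = -1  -> arithmetic mean
      alpha =  1  -> geometric mean
      alpha =  3  -> harmonic mean
      alpha -> -inf approaches max
    Note: the paper's exact parameterization may differ; this is
    an illustrative sketch, not the authors' implementation.
    """
    x = np.asarray(x, dtype=float)
    if alpha == 1:
        # Geometric mean via the log representation.
        return np.exp(np.mean(np.log(x)))
    p = (1.0 - alpha) / 2.0
    # Power mean: transform, average, invert the transform.
    return np.mean(x ** p) ** (1.0 / p)
```

In a CNN, α would be a learnable parameter (per layer, or per channel) updated by backpropagation along with the weights, which is what lets each layer settle on its own pooling type.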

View on arXiv