Exploring Feature Reuse in DenseNet Architectures

5 June 2018
Andy Hess
arXiv:1806.01935
Abstract

Densely Connected Convolutional Networks (DenseNets) have been shown to achieve state-of-the-art results on image classification tasks while using fewer parameters and computation than competing methods. Since each layer in this architecture has full access to the feature maps of all previous layers, the network is freed from the burden of having to relearn previously useful features, thus alleviating issues with vanishing gradients. In this work we explore the question: To what extent is it necessary to connect to all previous layers in order to reap the benefits of feature reuse? To this end, we introduce the notion of local dense connectivity and present evidence that less connectivity, allowing for increased growth rate at a fixed network capacity, can achieve a more efficient reuse of features and lead to higher accuracy in dense architectures.
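
The trade-off described in the abstract can be made concrete with a short sketch. The following is a minimal PyTorch illustration, assuming that "local dense connectivity" means each layer concatenates only a fixed window of the most recent feature-map groups rather than all previous ones; the class name LocalDenseBlock, the window parameter, and the BN-ReLU-Conv layer layout are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

def conv_layer(in_ch, growth_rate):
    # BN-ReLU-Conv composite function, as in the original DenseNet.
    return nn.Sequential(
        nn.BatchNorm2d(in_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(in_ch, growth_rate, kernel_size=3, padding=1, bias=False),
    )

class LocalDenseBlock(nn.Module):
    # Dense block with local connectivity: each layer sees only the
    # `window` most recent feature-map groups instead of all of them.
    def __init__(self, in_channels, growth_rate, num_layers, window):
        super().__init__()
        self.window = window
        self.layers = nn.ModuleList()
        channels = [in_channels]  # channel count of each stored group
        for _ in range(num_layers):
            in_ch = sum(channels[-window:])  # only the local window
            self.layers.append(conv_layer(in_ch, growth_rate))
            channels.append(growth_rate)

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features[-self.window:], dim=1))
            features.append(out)
        # Downstream layers receive whatever is still inside the window.
        return torch.cat(features[-self.window:], dim=1)

# Smoke test: with window=3 no layer ever sees more than three feature
# groups, so the growth rate can be raised without inflating the
# per-layer input width.
block = LocalDenseBlock(in_channels=16, growth_rate=12, num_layers=6, window=3)
out = block(torch.randn(1, 16, 32, 32))

Setting window to num_layers + 1 recovers the fully connected DenseNet block, so under this reading the window size interpolates between local and full dense connectivity.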

View on arXiv: https://arxiv.org/abs/1806.01935