arXiv:1908.09162

Don't ignore Dropout in Fully Convolutional Networks

24 August 2019
T. Spilsbury, Paavo Camps
Abstract

Data for image segmentation models can be costly to obtain due to the precision required of human annotators. We run a series of experiments showing the effect of different kinds of Dropout training on the DeepLabv3+ image segmentation model when trained on a small dataset. We find that when appropriate forms of Dropout are applied in the right place in the model architecture, a non-trivial improvement in Mean Intersection over Union (mIoU) score can be observed. In our best case, applying Dropout scheduling in conjunction with SpatialDropout improves the baseline mIoU from 0.49 to 0.59. This result shows that even when a model architecture makes extensive use of Batch Normalization, Dropout can still be an effective way of improving performance in low-data situations.
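The abstract combines two ingredients: SpatialDropout and a schedule for the dropout rate, applied at an appropriate point in the architecture. The sketch below is a minimal illustration of that idea in PyTorch; it assumes nn.Dropout2d as the SpatialDropout layer, a hypothetical linear ramp as the schedule, and a toy decoder head standing in for DeepLabv3+'s decoder. It is not the authors' exact configuration.

```python
# Minimal sketch (assumptions: PyTorch; nn.Dropout2d as SpatialDropout;
# a toy head standing in for the DeepLabv3+ decoder). Illustrative only.
import torch
import torch.nn as nn


class TinySegHead(nn.Module):
    """Toy segmentation head with SpatialDropout placed before the classifier."""

    def __init__(self, in_ch: int, num_classes: int, p_drop: float = 0.5):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 64, kernel_size=3, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
        )
        # SpatialDropout: drops entire feature maps rather than individual activations.
        self.spatial_dropout = nn.Dropout2d(p=p_drop)
        self.classifier = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        x = self.body(x)
        x = self.spatial_dropout(x)
        return self.classifier(x)


def scheduled_dropout_rate(step: int, total_steps: int, p_max: float = 0.5) -> float:
    """One possible schedule: ramp the dropout rate linearly from 0 to p_max."""
    return p_max * min(step / max(total_steps, 1), 1.0)


# Usage: update the dropout probability at each training step.
model = TinySegHead(in_ch=256, num_classes=21)
total_steps = 10_000
for step in range(3):  # a few steps for illustration
    model.spatial_dropout.p = scheduled_dropout_rate(step, total_steps)
    logits = model(torch.randn(2, 256, 33, 33))
```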
