Using mixup as regularization and tuning hyper-parameters for ResNets

23 November 2021
Venkata Bhanu Teja Pallakonda
arXiv:2111.11616
Abstract

While novel computer vision architectures are gaining traction, much of their impact is tied to accompanying changes in, and exploration of, training methods. Identity-mapping-based architectures such as ResNets and DenseNets have delivered path-breaking results on the image classification task and remain go-to methods even now when the available data is fairly limited. Given the ease of training them with limited resources, this work revisits ResNets and improves ResNet50 \cite{resnets} by using mixup data augmentation as regularization and by tuning the hyper-parameters.
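For readers unfamiliar with the technique, the following is a minimal sketch of mixup-style augmentation in PyTorch. It is illustrative only and does not reproduce the paper's exact training setup; the function name, the alpha value, and the loss combination shown are assumptions.

import numpy as np
import torch

def mixup_batch(x, y, alpha=0.2):
    # Draw a mixing coefficient from Beta(alpha, alpha); alpha=0.2 is a
    # common illustrative choice, not necessarily the value tuned in the paper.
    lam = float(np.random.beta(alpha, alpha))
    # Pair each example with a randomly chosen partner from the same batch.
    index = torch.randperm(x.size(0))
    mixed_x = lam * x + (1.0 - lam) * x[index]
    # Return both sets of targets so the loss can be mixed with the same weight.
    return mixed_x, y, y[index], lam

# Usage inside a training step (sketch; "resnet50" and "criterion" are placeholders):
#   mixed_x, y_a, y_b, lam = mixup_batch(inputs, targets)
#   outputs = resnet50(mixed_x)
#   loss = lam * criterion(outputs, y_a) + (1.0 - lam) * criterion(outputs, y_b)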
