Scalable NAS with Factorizable Architectural Parameters

31 December 2019
Lanfei Wang
Lingxi Xie
Tianyi Zhang
Jun Guo
Qi Tian
arXiv:1912.13256
Abstract

Neural Architecture Search (NAS) is an emerging topic in machine learning and computer vision. The fundamental idea of NAS is to use an automatic mechanism, rather than manual design, to explore powerful network architectures. A key factor in NAS is scaling up the search space, e.g., increasing the number of candidate operators, so that more possibilities are covered; however, existing search algorithms often get lost in a large number of operators. To avoid heavy computation and competition among similar operators in the same pool, this paper presents a scalable algorithm that factorizes a large set of candidate operators into smaller subspaces. As a practical example, this allows us to search for effective activation functions alongside the regular operators, including convolution, pooling, skip-connect, etc. With a small increase in search cost and no extra cost in re-training, we find interesting architectures that were not explored before and achieve state-of-the-art performance on CIFAR10 and ImageNet, two standard image classification benchmarks.
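The factorization idea described in the abstract can be illustrated with a short sketch. The code below is a minimal, hypothetical example of factorized architectural parameters in a DARTS-style mixed operation (PyTorch is assumed; the class name FactorizedMixedOp and the specific candidate operators and activations are illustrative and not taken from the paper's implementation): instead of one set of parameters over every (operator, activation) combination, each subspace keeps its own softmax, and a pair's effective weight is the product of the two subspace weights.

```python
# Hedged sketch: a DARTS-style mixed op whose candidate set is factorized into
# two smaller subspaces (regular operators and activation functions).
# All names and candidate choices here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FactorizedMixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Subspace 1: regular operators (convolution, pooling, skip-connect).
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.AvgPool2d(3, stride=1, padding=1),
            nn.Identity(),  # skip-connect
        ])
        # Subspace 2: candidate activation functions.
        self.acts = [F.relu, torch.tanh, torch.sigmoid]
        # Separate architectural parameters per subspace.
        self.alpha_op = nn.Parameter(1e-3 * torch.randn(len(self.ops)))
        self.alpha_act = nn.Parameter(1e-3 * torch.randn(len(self.acts)))

    def forward(self, x):
        w_op = F.softmax(self.alpha_op, dim=0)
        w_act = F.softmax(self.alpha_act, dim=0)
        # Each (operator, activation) pair is weighted by the product
        # w_op[i] * w_act[j], so only |ops| + |acts| architectural
        # parameters are learned rather than one per combination.
        out = 0
        for wi, op in zip(w_op, self.ops):
            h = op(x)
            for wj, act in zip(w_act, self.acts):
                out = out + wi * wj * act(h)
        return out
```

In this sketch the number of learned architectural parameters grows with the sum of the subspace sizes rather than their product, which is the scalability benefit the abstract refers to.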
