  3. 2012.13212
12
11

Memory-Efficient Hierarchical Neural Architecture Search for Image Restoration

24 December 2020
Haokui Zhang
Ying Li
Hao Chen
Chengrong Gong
Zongwen Bai
Chunhua Shen
Abstract

Recently, much attention has been devoted to neural architecture search (NAS), which aims to outperform manually designed neural architectures on high-level vision recognition tasks. Inspired by this success, here we attempt to leverage NAS techniques to automatically design efficient network architectures for low-level image restoration tasks. In particular, we propose a memory-efficient hierarchical NAS (termed HiNAS) and apply it to two such tasks: image denoising and image super-resolution. HiNAS adopts a gradient-based search strategy and builds a flexible hierarchical search space consisting of an inner search space and an outer search space, which are in charge of designing cell architectures and deciding cell widths, respectively. For the inner search space, we propose a layer-wise architecture sharing strategy (LWAS), which yields more flexible architectures and better performance. For the outer search space, we design a cell-sharing strategy that saves memory and considerably accelerates the search. The proposed HiNAS is thus both memory and computation efficient: with a single GTX 1080Ti GPU, it takes only about 1 hour to search for the denoising architecture on the BSD500 dataset and 3.5 hours to search for the super-resolution architecture on the DIV2K dataset. Experiments show that the architectures found by HiNAS have fewer parameters and run faster at inference, while achieving highly competitive performance compared with state-of-the-art methods. Code is available at: https://github.com/hkzhang91/HiNAS
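
To make the gradient-based search idea concrete, below is a minimal PyTorch sketch of the DARTS-style continuous relaxation that HiNAS builds on: each layer mixes candidate operations with softmax-weighted architecture parameters, and giving each layer its own parameter vector loosely echoes the layer-wise sharing idea described in the abstract. The operation set, class names, and toy sizes are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

```python
# Minimal sketch of gradient-based cell search (DARTS-style relaxation).
# All names and the candidate-op set are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Candidate operations for the inner (cell-level) search space.
CANDIDATE_OPS = {
    "conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=1),
    "conv5x5": lambda c: nn.Conv2d(c, c, 5, padding=2),
    "dil_conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=2, dilation=2),
    "skip": lambda c: nn.Identity(),
}

class MixedOp(nn.Module):
    """Weighted sum of all candidate ops; the weights are learned by gradient descent."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList(op(channels) for op in CANDIDATE_OPS.values())

    def forward(self, x, alphas):
        # alphas: one logit per candidate op, relaxed to a distribution.
        weights = F.softmax(alphas, dim=-1)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

class SearchCell(nn.Module):
    """A chain of mixed ops. Each layer has its own alpha vector, so different
    layers may settle on different operations, loosely mirroring the paper's
    layer-wise architecture sharing (LWAS) idea."""
    def __init__(self, channels, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList(MixedOp(channels) for _ in range(num_layers))
        # One alpha vector per layer; sharing a single vector across all
        # layers would be the conventional fully-shared cell setup.
        self.alphas = nn.Parameter(1e-3 * torch.randn(num_layers, len(CANDIDATE_OPS)))

    def forward(self, x):
        for layer, a in zip(self.layers, self.alphas):
            x = layer(x, a)
        return x

if __name__ == "__main__":
    cell = SearchCell(channels=32)
    x = torch.randn(1, 32, 48, 48)
    print(cell(x).shape)  # torch.Size([1, 32, 48, 48])
```

The outer search space over cell widths could, under the same relaxation, mix cells of different widths with a second set of softmax-weighted parameters; the paper's cell-sharing strategy reuses computation across those width options to save memory, which the sketch above does not attempt to reproduce.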
