ResearchTrend.AI

arXiv:2507.04671 (v2, latest)

DANCE: Resource-Efficient Neural Architecture Search with Data-Aware and Continuous Adaptation

7 July 2025
Xinjian Zhao
Tianshuo Wei
Sheng Zhang
Ruocheng Guo
Wanyu Wang
Shanshan Ye
Lixin Zou
Xuetao Wei
Xiangyu Zhao
Main: 7 pages · 5 figures · 3 tables · Bibliography: 2 pages · Appendix: 2 pages
Abstract

Neural Architecture Search (NAS) has emerged as a powerful approach for automating neural network design. However, existing NAS methods face critical limitations in real-world deployments: architectures lack adaptability across scenarios, each deployment context requires costly separate searches, and performance consistency across diverse platforms remains challenging. We propose DANCE (Dynamic Architectures with Neural Continuous Evolution), which reformulates architecture search as a continuous evolution problem through learning distributions over architectural components. DANCE introduces three key innovations: a continuous architecture distribution enabling smooth adaptation, a unified architecture space with learned selection gates for efficient sampling, and a multi-stage training strategy for effective deployment optimization. Extensive experiments across five datasets demonstrate DANCE's effectiveness. Our method consistently outperforms state-of-the-art NAS approaches in terms of accuracy while significantly reducing search costs. Under varying computational constraints, DANCE maintains robust performance while smoothly adapting architectures to different hardware requirements. The code and appendix can be found at this https URL.
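The "learned selection gates" idea from the abstract can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the authors' implementation: we assume each candidate operation in a search cell has a learnable gate logit, and we use a Gumbel-softmax-style relaxation so that sampling an operation stays differentiable with respect to those logits. The operation names and logit values are hypothetical.

```python
import math
import random

def softmax(logits):
    # Numerically stable softmax over a list of floats.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def gumbel_softmax_sample(logits, tau=1.0, rng=random):
    # Perturb each logit with Gumbel noise, then apply a
    # temperature-scaled softmax; lower tau -> sharper (more discrete) samples.
    noisy = [l - math.log(-math.log(rng.random())) for l in logits]
    return softmax([n / tau for n in noisy])

# Hypothetical candidate operations for one cell of a search space.
OPS = ["conv3x3", "conv5x5", "skip", "maxpool"]
gate_logits = [0.2, 1.5, -0.3, 0.1]  # learnable parameters in a real system

weights = gumbel_softmax_sample(gate_logits, tau=0.5)
chosen = OPS[max(range(len(OPS)), key=lambda i: weights[i])]
```

In a full NAS system the soft `weights` would mix candidate operations' outputs during training, while `chosen` gives the discrete architecture at deployment time; adjusting `tau` over training is one common way to anneal from soft mixing toward a discrete selection.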
