ProARD: progressive adversarial robustness distillation: provide wide range of robust students

9 June 2025
Seyedhamidreza Mousavi
Seyedali Mousavi
Masoud Daneshtalab
Main: 9 pages, Bibliography: 5 pages, 10 figures, 1 table
Abstract

Adversarial Robustness Distillation (ARD) has emerged as an effective method for enhancing the robustness of lightweight deep neural networks against adversarial attacks. Existing ARD approaches leverage a large robust teacher network to train a single robust lightweight student. However, because edge devices differ widely in their resource constraints, these approaches must train a new student network from scratch for each deployment target, leading to substantial computational cost and increased CO2 emissions. This paper proposes Progressive Adversarial Robustness Distillation (ProARD), which enables efficient one-time training of a dynamic network that supports a diverse range of accurate and robust student networks without retraining. We first construct a dynamic deep neural network from dynamic layers that encompass variations in width, depth, and expansion ratio at each design stage, so that a wide range of architectures is supported. We then treat the student network with the largest size as the dynamic teacher. ProARD trains this dynamic network with a weight-sharing mechanism, jointly optimizing the dynamic teacher and its internal student networks. However, because computing exact gradients for all students within the dynamic network is prohibitively expensive, a sampling mechanism is required to select a subset of students at each iteration. We show that random student sampling at each iteration fails to produce accurate and robust students.
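The sketch below illustrates the weight-sharing idea described in the abstract: a dynamic network whose layers can shrink in width and depth, with the largest configuration acting as the teacher and randomly sampled sub-networks trained as students via a distillation loss. It is a minimal illustration under stated assumptions, not the authors' implementation; all names here (DynamicLinear, DynamicNet, sample_config, distill_step) are hypothetical, and the adversarial example generation and the paper's improved sampling strategy are omitted.

```python
# Minimal sketch of weight-shared dynamic training with sampled student
# sub-networks, loosely following the ProARD description in the abstract.
# All names are illustrative assumptions, not the authors' code.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicLinear(nn.Module):
    """Linear layer whose active output width is chosen per forward pass.

    Sub-networks reuse the leading slice of the full weight matrix, so a
    gradient step on a small student also updates part of the teacher.
    """

    def __init__(self, max_in_features, max_out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(max_out_features, max_in_features) * 0.02)
        self.bias = nn.Parameter(torch.zeros(max_out_features))

    def forward(self, x, active_out):
        # Slice both input and output dimensions to the currently active sizes.
        w = self.weight[:active_out, : x.size(-1)]
        return F.linear(x, w, self.bias[:active_out])


class DynamicNet(nn.Module):
    """Stack of dynamic layers; depth and width are picked per forward pass."""

    def __init__(self, in_dim, hidden_widths, num_classes, max_depth):
        super().__init__()
        self.max_depth = max_depth
        self.max_width = max(hidden_widths)
        self.hidden_widths = hidden_widths
        self.blocks = nn.ModuleList(
            DynamicLinear(in_dim if i == 0 else self.max_width, self.max_width)
            for i in range(max_depth)
        )
        self.head = nn.Linear(self.max_width, num_classes)

    def forward(self, x, config):
        depth, width = config
        for i in range(depth):
            x = F.relu(self.blocks[i](x, width))
        # Zero-pad so the shared classification head sees a fixed width.
        if width < self.max_width:
            x = F.pad(x, (0, self.max_width - width))
        return self.head(x)

    def largest_config(self):
        return (self.max_depth, self.max_width)

    def sample_config(self):
        # Baseline random sampling; the paper argues this alone is insufficient.
        return (random.randint(1, self.max_depth), random.choice(self.hidden_widths))


def distill_step(net, optimizer, x, y, num_students=2, alpha=0.5, T=4.0):
    """One weight-sharing update: largest config as teacher plus sampled students.

    x would normally be an adversarially perturbed batch (e.g. PGD); that step
    is omitted here to keep the sketch short.
    """
    optimizer.zero_grad()
    teacher_logits = net(x, net.largest_config())
    loss = F.cross_entropy(teacher_logits, y)
    for _ in range(num_students):
        student_logits = net(x, net.sample_config())
        kd = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits.detach() / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        loss = loss + alpha * F.cross_entropy(student_logits, y) + (1 - alpha) * kd
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    net = DynamicNet(in_dim=32, hidden_widths=[64, 128, 256], num_classes=10, max_depth=4)
    opt = torch.optim.SGD(net.parameters(), lr=0.01)
    x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
    print(distill_step(net, opt, x, y))
```

Because every sampled student is a slice of the same parameter tensors, each update trains the teacher and several students jointly, which is the property that lets a single training run serve many deployment targets.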

@article{mousavi2025_2506.07666,
  title={ProARD: progressive adversarial robustness distillation: provide wide range of robust students},
  author={Seyedhamidreza Mousavi and Seyedali Mousavi and Masoud Daneshtalab},
  journal={arXiv preprint arXiv:2506.07666},
  year={2025}
}