This paper studies fast adversarial training against sparse adversarial perturbations bounded by the $\ell_0$ norm. We demonstrate the challenges of employing 1-step attacks on $\ell_0$ bounded perturbations for fast adversarial training, including degraded performance and the occurrence of catastrophic overfitting (CO). We highlight that CO in $\ell_0$ adversarial training is caused by sub-optimal perturbation locations of the 1-step attack. Theoretical and empirical analyses reveal that the loss landscape of $\ell_0$ adversarial training is craggier than its $\ell_\infty$, $\ell_2$, and $\ell_1$ counterparts. Moreover, we corroborate that the craggy loss landscape can aggravate CO. To address these issues, we propose Fast-LS-$\ell_0$, which incorporates soft labels and a trade-off loss function to smooth the adversarial loss landscape. Extensive experiments demonstrate that our method can overcome the challenge of catastrophic overfitting, achieve state-of-the-art performance, and narrow the performance gap between 1-step and multi-step adversarial training against sparse attacks.
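The abstract names two smoothing ingredients: soft labels and a trade-off loss. Below is a minimal PyTorch sketch of how these two pieces could be combined, assuming a label-smoothing scheme for the soft labels and a TRADES-style KL trade-off term between clean and adversarial predictions; the function names (`soft_labels`, `trade_off_loss`) and parameters (`smoothing`, `beta`) are illustrative assumptions, not the paper's actual API or exact formulation.

```python
import torch
import torch.nn.functional as F

def soft_labels(targets, num_classes, smoothing=0.1):
    """Convert hard integer labels to smoothed (soft) label distributions."""
    with torch.no_grad():
        dist = torch.full((targets.size(0), num_classes),
                          smoothing / (num_classes - 1),
                          device=targets.device)
        # Place the bulk of the probability mass on the true class.
        dist.scatter_(1, targets.unsqueeze(1), 1.0 - smoothing)
    return dist

def trade_off_loss(model, x_clean, x_adv, targets, num_classes,
                   smoothing=0.1, beta=6.0):
    """Trade-off loss: cross-entropy on clean inputs with soft labels,
    plus a KL term pulling adversarial predictions toward clean ones
    (a TRADES-style assumption; the paper's exact loss may differ)."""
    soft = soft_labels(targets, num_classes, smoothing)
    logits_clean = model(x_clean)
    logits_adv = model(x_adv)
    # Cross-entropy against soft labels on clean inputs.
    loss_clean = -(soft * F.log_softmax(logits_clean, dim=1)).sum(dim=1).mean()
    # KL divergence between adversarial and clean predictive distributions.
    loss_kl = F.kl_div(F.log_softmax(logits_adv, dim=1),
                       F.softmax(logits_clean, dim=1),
                       reduction="batchmean")
    return loss_clean + beta * loss_kl
```

In this sketch, `x_adv` would come from a 1-step sparse attack, and `beta` controls the balance between clean accuracy and robustness; both label smoothing and the KL term flatten the target distributions the model is fit to, which is one plausible reading of how the method smooths the adversarial loss landscape.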
@article{zhong2025_2502.21041,
  title={Fast Adversarial Training against Sparse Attacks Requires Loss Smoothing},
  author={Xuyang Zhong and Yixiao Huang and Chen Liu},
  journal={arXiv preprint arXiv:2502.21041},
  year={2025}
}