arXiv:2003.04286
Manifold Regularization for Locally Stable Deep Neural Networks

9 March 2020
Charles Jin
Martin Rinard
Abstract

We apply concepts from manifold regularization to develop new regularization techniques for training locally stable deep neural networks. Our regularizers are based on a sparsification of the graph Laplacian which holds with high probability when the data is sparse in high dimensions, as is common in deep learning. Empirically, our networks exhibit stability in a diverse set of perturbation models, including ℓ₂, ℓ∞, and Wasserstein-based perturbations; in particular, we achieve 40% adversarial accuracy on CIFAR-10 against an adaptive PGD attack using ℓ∞ perturbations of size ε = 8/255, and state-of-the-art verified accuracy of 21% in the same perturbation model. Furthermore, our techniques are efficient, incurring overhead on par with two additional parallel forward passes through the network.
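The abstract's core idea can be sketched in a few lines: a manifold-regularization penalty sums ||f(x_i) − f(x_j)||² over edges of a neighborhood graph, and when data points are mutually far apart in high dimensions the graph effectively sparsifies so that each input is paired only with perturbed copies of itself, each copy costing one extra forward pass. The sketch below is an illustrative assumption of that structure (the function name, random-neighbor pairing, and toy model are hypothetical, not the authors' exact construction):

```python
import numpy as np

def manifold_smoothness_penalty(f, x, eps=8/255, n_neighbors=2, seed=0):
    """Hypothetical sketch of a sparsified graph-Laplacian regularizer.

    Approximates sum_{i,j} w_ij ||f(x_i) - f(x_j)||^2 by pairing each
    input only with random perturbed copies of itself -- the sparse
    graph assumed when data is spread out in high dimensions.
    Each perturbed copy costs one extra forward pass through f.
    """
    rng = np.random.default_rng(seed)
    fx = f(x)                        # one clean forward pass
    penalty = 0.0
    for _ in range(n_neighbors):     # e.g. two extra (parallelizable) passes
        delta = rng.uniform(-eps, eps, size=x.shape)   # l_inf-ball neighbor
        penalty += np.sum((f(x + delta) - fx) ** 2)    # smoothness term
    return penalty / n_neighbors

# Toy "network": a fixed linear map; a smoother map yields a smaller penalty.
W = np.array([[0.5, -0.2], [0.1, 0.3]])
f = lambda x: x @ W.T
x = np.ones((4, 2))
p = manifold_smoothness_penalty(f, x)
```

In training, this penalty would be added to the task loss; the overhead is just the `n_neighbors` extra forward passes, matching the efficiency claim above.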
