1-Lipschitz Layers Compared: Memory, Speed, and Certifiable Robustness
arXiv:2311.16833 · 28 November 2023
Bernd Prach, Fabio Brau, Giorgio Buttazzo, Christoph H. Lampert

Papers citing "1-Lipschitz Layers Compared: Memory, Speed, and Certifiable Robustness"

5 of 5 citing papers shown:

  • 1-Lipschitz Neural Networks are more expressive with N-Activations
    Bernd Prach, Christoph H. Lampert (AAML, FAtt) · 10 Nov 2023
  • Robust-by-Design Classification via Unitary-Gradient Neural Networks
    Fabio Brau, Giulio Rossolini, Alessandro Biondi, Giorgio Buttazzo (AAML) · 09 Sep 2022
  • Globally-Robust Neural Networks
    Klas Leino, Zifan Wang, Matt Fredrikson (AAML, OOD) · 16 Feb 2021
  • Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
    Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington · 14 Jun 2018
  • Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks
    Guy Katz, Clark W. Barrett, D. Dill, Kyle D. Julian, Mykel Kochenderfer (AAML) · 03 Feb 2017