Certifiers Make Neural Networks Vulnerable to Availability Attacks
arXiv:2108.11299 · 25 August 2021
Tobias Lorenz, Marta Kwiatkowska, Mario Fritz
Topics: AAML, SILM
Papers citing "Certifiers Make Neural Networks Vulnerable to Availability Attacks" (5 of 5 shown)
Globally-Robust Neural Networks
Klas Leino, Zifan Wang, Matt Fredrikson
AAML, OOD · 16 Feb 2021

Deep Continuous Fusion for Multi-Sensor 3D Object Detection
Ming Liang, Binh Yang, Shenlong Wang, R. Urtasun
3DPC · 20 Dec 2020

CNN-Cert: An Efficient Framework for Certifying Robustness of Convolutional Neural Networks
Akhilan Boopathy, Tsui-Wei Weng, Pin-Yu Chen, Sijia Liu, Luca Daniel
AAML · 29 Nov 2018

Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks
Guy Katz, Clark W. Barrett, D. Dill, Kyle D. Julian, Mykel Kochenderfer
AAML · 03 Feb 2017

Safety Verification of Deep Neural Networks
Xiaowei Huang, Marta Kwiatkowska, Sen Wang, Min Wu
AAML · 21 Oct 2016