Norm-preserving Orthogonal Permutation Linear Unit Activation Functions (OPLU)
arXiv: 1604.02313
8 April 2016
Artem Chernodub, D. Nowicki
Papers citing "Norm-preserving Orthogonal Permutation Linear Unit Activation Functions (OPLU)" (10 papers)
1. Enhancing Certified Robustness via Block Reflector Orthogonal Layers and Logit Annealing Loss
   Bo-Han Lai, Pin-Han Huang, Bo-Han Kung, Shang-Tse Chen
   21 May 2025

2. Approximation theory for 1-Lipschitz ResNets
   Davide Murari, Takashi Furuya, Carola-Bibiane Schönlieb
   17 May 2025

3. Convolutional Neural Networks as 2-D systems
   Dennis Gramlich, Patricia Pauli, C. Scherer, Frank Allgöwer, C. Ebenbauer
   06 Mar 2023

4. Almost-Orthogonal Layers for Efficient General-Purpose Lipschitz Networks
   Bernd Prach, Christoph H. Lampert
   05 Aug 2022

5. A Simple Approach to Improve Single-Model Deep Uncertainty via Distance-Awareness
   J. Liu, Shreyas Padhy, Jie Jessie Ren, Zi Lin, Yeming Wen, Ghassen Jerfel, Zachary Nado, Jasper Snoek, Dustin Tran, Balaji Lakshminarayanan
   01 May 2022

6. Approximation of Lipschitz Functions using Deep Spline Neural Networks
   Sebastian Neumayer, Alexis Goujon, Pakshal Bohra, M. Unser
   13 Apr 2022

7. Logical Activation Functions: Logit-space equivalents of Probabilistic Boolean Operators
   S. Lowe, Robert C. Earle, Jason d'Eon, Thomas Trappenberg, Sageev Oore
   22 Oct 2021

8. Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness
   Jeremiah Zhe Liu, Zi Lin, Shreyas Padhy, Dustin Tran, Tania Bedrax-Weiss, Balaji Lakshminarayanan
   17 Jun 2020

9. On orthogonality and learning recurrent networks with long term dependencies
   Eugene Vorontsov, C. Trabelsi, Samuel Kadoury, C. Pal
   31 Jan 2017

10. Efficient Orthogonal Parametrisation of Recurrent Neural Networks Using Householder Reflections
    Zakaria Mhammedi, Andrew D. Hellicar, Ashfaqur Rahman, James Bailey
    01 Dec 2016