Proxy-Normalizing Activations to Match Batch Normalization while Removing Batch Dependence
A. Labatie, Dominic Masters, Zach Eaton-Rosen, Carlo Luschi
arXiv: 2106.03743, 7 June 2021
Cited By

Papers citing "Proxy-Normalizing Activations to Match Batch Normalization while Removing Batch Dependence" (6 of 6 papers shown):
Understanding the Generalization Benefit of Normalization Layers: Sharpness Reduction [FAtt]
Kaifeng Lyu, Zhiyuan Li, Sanjeev Arora. 14 Jun 2022.

On the Pitfalls of Batch Normalization for End-to-End Video Learning: A Study on Surgical Workflow Analysis
Dominik Rivoir, Isabel Funke, Stefanie Speidel. 15 Mar 2022.

High-Performance Large-Scale Image Recognition Without Normalization [VLM]
Andrew Brock, Soham De, Samuel L. Smith, Karen Simonyan. 11 Feb 2021.

Is Batch Norm unique? An empirical investigation and prescription to emulate the best properties of common normalizers without batch dependence [BDL]
Vinay Rao, Jascha Narain Sohl-Dickstein. 21 Oct 2020.

Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He. 16 Nov 2016.

A Learned Representation For Artistic Style [GAN]
Vincent Dumoulin, Jonathon Shlens, M. Kudlur. 24 Oct 2016.