arXiv: 2410.16958
Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks
22 October 2024
C. Linse, Erhardt Barth, Thomas Martinetz
Papers citing "Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks" (2 of 2 shown)
Convolutional Neural Networks Do Work with Pre-Defined Filters
C. Linse, Erhardt Barth, T. Martinetz
27 Nov 2024
ImageNet Large Scale Visual Recognition Challenge
Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
VLM, ObjD
01 Sep 2014