ResearchTrend.AI

Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks

22 October 2024
C. Linse, Erhardt Barth, Thomas Martinetz
arXiv: 2410.16958 · PDF · HTML

Papers citing "Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks"

2 papers shown

1. Convolutional Neural Networks Do Work with Pre-Defined Filters
   C. Linse, Erhardt Barth, T. Martinetz
   27 Nov 2024 · 5 citations

2. ImageNet Large Scale Visual Recognition Challenge
   Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
   Topics: VLM, ObjD
   01 Sep 2014 · 39,198 citations