ResearchTrend.AI

Learning Activation Functions to Improve Deep Neural Networks

21 December 2014
Forest Agostinelli, Matthew Hoffman, Peter Sadowski, Pierre Baldi
ODL
ArXiv / PDF / HTML
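
The paper above introduces adaptive piecewise linear (APL) activation units, which learn each nonlinearity during training as a ReLU plus a sum of S learned hinge functions, h(x) = max(0, x) + sum_s a_s * max(0, -x + b_s). The snippet below is a minimal sketch of that parameterization in PyTorch, not the authors' code; the class name APLUnit, the default hinge count, and the zero initialization are illustrative assumptions.

```python
# Minimal sketch of an adaptive piecewise linear (APL) activation unit,
# following the formulation h(x) = max(0, x) + sum_s a_s * max(0, -x + b_s).
# Class name, hinge count, and initialization are assumptions for illustration.
import torch
import torch.nn as nn


class APLUnit(nn.Module):
    """One learnable piecewise linear activation per feature."""

    def __init__(self, num_features: int, num_hinges: int = 2):
        super().__init__()
        # Learned slopes a and offsets b, one set per feature and per hinge.
        # Zero-initialized slopes make the unit start out as a plain ReLU.
        self.a = nn.Parameter(torch.zeros(num_hinges, num_features))
        self.b = nn.Parameter(torch.zeros(num_hinges, num_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, num_features); parameters broadcast over the batch.
        out = torch.relu(x)
        for s in range(self.a.shape[0]):
            out = out + self.a[s] * torch.relu(-x + self.b[s])
        return out


# Usage: swap in wherever a fixed nonlinearity such as ReLU would go.
layer = nn.Sequential(nn.Linear(64, 128), APLUnit(128))
y = layer(torch.randn(32, 64))  # shape (32, 128)
```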

Papers citing "Learning Activation Functions to Improve Deep Neural Networks"

6 / 56 papers shown
Title | Authors | Tags | Citations | Date
Deep Networks with Stochastic Depth | Gao Huang, Yu Sun, Zhuang Liu, Daniel Sedra, Kilian Q. Weinberger | - | 2,336 | 30 Mar 2016
Normalization Propagation: A Parametric Technique for Removing Internal Covariate Shift in Deep Networks | Devansh Arpit, Yingbo Zhou, Bhargava U. Kota, V. Govindaraju | - | 126 | 04 Mar 2016
Cascaded Subpatch Networks for Effective CNNs | Xiaoheng Jiang, Yanwei Pang, Manli Sun, Xuelong Li | - | 39 | 01 Mar 2016
Convolutional neural networks with low-rank regularization | Cheng Tai, Tong Xiao, Yi Zhang, Xiaogang Wang, E. Weinan | BDL | 460 | 19 Nov 2015
Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification | Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun | VLM | 18,455 | 06 Feb 2015
Improving neural networks by preventing co-adaptation of feature detectors | Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov | VLM | 7,638 | 03 Jul 2012