TanhExp: A Smooth Activation Function with High Convergence Speed for Lightweight Neural Networks
22 March 2020
Xinyu Liu, Xiaoguang Di
arXiv: 2003.09855 (PDF, HTML)
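For reference, the TanhExp activation proposed in this paper is defined as f(x) = x · tanh(e^x). The snippet below is a minimal NumPy sketch of that formula, not code from the paper; the clipping constant and the sample inputs are assumptions of this sketch.

```python
import numpy as np

def tanhexp(x: np.ndarray) -> np.ndarray:
    """TanhExp activation: f(x) = x * tanh(exp(x)).

    exp(x) is clipped (an implementation choice of this sketch, not from
    the paper) to avoid overflow warnings; tanh already saturates to 1
    well before the clip point, so the result is unaffected.
    """
    return x * np.tanh(np.exp(np.minimum(x, 20.0)))

# Quick check: near-identity for large positive inputs, small bounded dip for negatives.
x = np.linspace(-4.0, 4.0, 9)
print(np.round(tanhexp(x), 4))
```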
Papers citing "TanhExp: A Smooth Activation Function with High Convergence Speed for Lightweight Neural Networks"
7 / 7 papers shown
Neural Density-Distance Fields
Itsuki Ueda, Yoshihiro Fukuhara, Hirokatsu Kataoka, Hiroaki Aizawa, Hidehiko Shishido, I. Kitahara
29 Jul 2022

Activation Functions: Dive into an optimal activation function
V. Bansal
FAtt
24 Feb 2022

Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations
G. Shamir, Dong Lin
HAI, OffRL
14 Feb 2022

Activation function design for deep networks: linearity and effective initialisation
Michael Murray, V. Abrol, Jared Tanner
ODL, LLMSV
17 May 2021

Compacting Deep Neural Networks for Internet of Things: Methods and Applications
Ke Zhang, Hanbo Ying, Hongning Dai, Lin Li, Yuanyuan Peng, Keyi Guo, Hongfang Yu
20 Mar 2021

Smooth activations and reproducibility in deep networks
G. Shamir, Dong Lin, Lorenzo Coviello
20 Oct 2020

TanhSoft -- a family of activation functions combining Tanh and Softplus
Koushik Biswas, Sandeep Kumar, Shilpak Banerjee, A. Pandey
08 Sep 2020