arXiv:2105.07741
Activation function design for deep networks: linearity and effective initialisation
17 May 2021
Michael Murray
V. Abrol
Jared Tanner
Papers citing "Activation function design for deep networks: linearity and effective initialisation" (6 papers):
On the Initialisation of Wide Low-Rank Feedforward Neural Networks
Thiziri Nait Saada, Jared Tanner (31 Jan 2023)

Expected Gradients of Maxout Networks and Consequences to Parameter Initialization
Hanna Tseran, Guido Montúfar (17 Jan 2023)

Characterizing the Spectrum of the NTK via a Power Series Expansion
Michael Murray, Hui Jin, Benjamin Bowman, Guido Montúfar (15 Nov 2022)

Wide and Deep Neural Networks Achieve Optimality for Classification
Adityanarayanan Radhakrishnan, M. Belkin, Caroline Uhler (29 Apr 2022)

Coordinate descent on the orthogonal group for recurrent neural network training
E. Massart, V. Abrol (30 Jul 2021)

TanhExp: A Smooth Activation Function with High Convergence Speed for Lightweight Neural Networks
Xinyu Liu, Xiaoguang Di (22 Mar 2020)