arXiv: 2002.04060
On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks
10 February 2020
Behnam Asadi, Hui Jiang
Papers citing "On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks" (3 / 3 papers shown):

1. Neural-g: A Deep Learning Framework for Mixing Density Estimation
   Shijie Wang, Saptarshi Chakraborty, Qian Qin, Ray Bai (BDL, 10 Jun 2024)

2. A Latent Space Theory for Emergent Abilities in Large Language Models
   Hui Jiang (LRM, 19 Apr 2023)

3. S++: A Fast and Deployable Secure-Computation Framework for Privacy-Preserving Neural Network Training
   Prashanthi Ramachandran, Shivam Agarwal, A. Mondal, Aastha Shah, Debayan Gupta (FedML, 28 Jan 2021)