On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks (arXiv:2002.04060)

10 February 2020
Behnam Asadi, Hui Jiang

Papers citing "On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks"

3 / 3 papers shown

Neural-g: A Deep Learning Framework for Mixing Density Estimation
Shijie Wang, Saptarshi Chakraborty, Qian Qin, Ray Bai
BDL · 35 · 0 · 0 · 10 Jun 2024

A Latent Space Theory for Emergent Abilities in Large Language Models
Hui Jiang
LRM · 25 · 35 · 0 · 19 Apr 2023

S++: A Fast and Deployable Secure-Computation Framework for Privacy-Preserving Neural Network Training
Prashanthi Ramachandran, Shivam Agarwal, A. Mondal, Aastha Shah, Debayan Gupta
FedML · 11 · 8 · 0 · 28 Jan 2021