ResearchTrend.AI

PLU: The Piecewise Linear Unit Activation Function
Andrei Nicolae
arXiv:1809.09534, 3 September 2018
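This citation page does not reproduce the activation function itself. As a rough illustration only, here is a minimal NumPy sketch of a piecewise linear unit: identity on a central interval, with a small nonzero slope outside it. The exact formula and the defaults (`alpha=0.1`, `c=1.0`) are assumptions recalled from the 2018 paper, not taken from this page.

```python
import numpy as np

def plu(x, alpha=0.1, c=1.0):
    """Piecewise linear unit sketch.

    Assumed form: max(alpha*(x + c) - c, min(alpha*(x - c) + c, x)).
    Identity on [-c, c]; slope `alpha` outside that interval, so the
    unit never fully saturates the way a hard tanh does.
    """
    x = np.asarray(x, dtype=float)
    return np.maximum(alpha * (x + c) - c,
                      np.minimum(alpha * (x - c) + c, x))

# Inside [-c, c] the output equals the input; outside, the output
# grows with slope alpha instead of flattening out.
print(plu([-3.0, -0.5, 0.0, 0.5, 3.0]))
```

With the assumed defaults, inputs in [-1, 1] pass through unchanged, while plu(3.0) gives 1.2 rather than clipping to 1.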

Papers citing "PLU: The Piecewise Linear Unit Activation Function" (12 papers)

Exploring the Relationship: Transformative Adaptive Activation Functions in Comparison to Other Activation Functions
Vladimír Kunc
14 Feb 2024

Optimizing Performance of Feedforward and Convolutional Neural Networks through Dynamic Activation Functions
Chinmay Rane, Kanishka Tyagi, M. Manry
10 Aug 2023

Optimal Activation Functions for the Random Features Regression Model
International Conference on Learning Representations (ICLR), 2022
Jianxin Wang, José Bento
31 May 2022

On the Omnipresence of Spurious Local Minima in Certain Neural Network Training Problems
Constructive Approximation (Constr. Approx.), 2022
C. Christof, Julia Kowalczyk
23 Feb 2022

Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark
S. Dubey, S. Singh, B. B. Chaudhuri
29 Sep 2021

Piecewise Linear Units Improve Deep Neural Networks
Jordan Inturrisi, Suiyang Khoo, Abbas Kouzani, Riccardo M. Pagliarella
02 Aug 2021

Objective Metrics to Evaluate Residual-Echo Suppression During Double-Talk
IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA), 2021
Amir Ivry, Israel Cohen, B. Berdugo
15 Jul 2021

Nonlinear Acoustic Echo Cancellation with Deep Learning
Interspeech, 2021
Amir Ivry, Israel Cohen, B. Berdugo
25 Jun 2021

Self-Supervised Nonlinear Transform-Based Tensor Nuclear Norm for Multi-Dimensional Image Recovery
IEEE Transactions on Image Processing (TIP), 2021
Yi-Si Luo, Xile Zhao, Tai-Xiang Jiang, Yi Chang, Michael K. Ng, Chao Li
29 May 2021

Comparison of Different Convolutional Neural Network Activation Functions and Methods for Building Ensembles
L. Nanni, Gianluca Maguolo, S. Brahnam, M. Paci
29 Mar 2021

Robust Motion In-betweening
ACM Transactions on Graphics (TOG), 2020
Félix G. Harvey, Mike Yurick, Derek Nowrouzezahrai, C. Pal
09 Feb 2021

When and How Can Deep Generative Models be Inverted?
Aviad Aberdam, Dror Simon, Michael Elad
28 Jun 2020