
QReLU and m-QReLU: Two novel quantum activation functions to aid medical diagnostics
arXiv:2010.08031, 15 October 2020
Luca Parisi, D. Neagu, R. Ma, F. Campean
MedIm

Papers citing "QReLU and m-QReLU: Two novel quantum activation functions to aid medical diagnostics"

8 citing papers shown
Vectorized Attention with Learnable Encoding for Quantum Transformer
Ziqing Guo, Ziwen Pan, Alex Khan, Jan Balewski
25 Aug 2025
A Universal Anti-Spoofing Approach for Contactless Fingerprint Biometric Systems
Banafsheh Adami, Sara Tehranipoor, Nasser M. Nasrabadi, Nima Karimian
AAML
23 Oct 2023
Parametric Leaky Tanh: A New Hybrid Activation Function for Deep Learning
S. Mastromichalakis
11 Aug 2023
A Quantum-Powered Photorealistic Rendering
Yuanfu Yang, Min Sun
AI4CE
07 Nov 2022
Knowledge Graph Embedding Methods for Entity Alignment: An Experimental Review
Data Mining and Knowledge Discovery (DMKD), 2022
N. Fanourakis, Vasilis Efthymiou, D. Kotzinos, V. Christophides
17 Mar 2022
M-ar-K-Fast Independent Component Analysis
Luca Parisi
17 Aug 2021
ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance
S. Mastromichalakis
11 Dec 2020
hyper-sinh: An Accurate and Reliable Function from Shallow to Deep Learning in TensorFlow and Keras
Luca Parisi, R. Ma, Narrendar RaviChandran, Matteo Lanzillotta
15 Nov 2020