ResearchTrend.AI

A Derivative-free Method for Quantum Perceptron Training in Multi-layered Neural Networks

23 September 2020
T. M. Khan
A. Robles-Kelly
ArXiv (abs) · PDF · HTML

Papers citing "A Derivative-free Method for Quantum Perceptron Training in Multi-layered Neural Networks"

3 / 3 papers shown
Neural Network Compression by Joint Sparsity Promotion and Redundancy Reduction
T. M. Khan, Syed S. Naqvi, A. Robles-Kelly, Erik H. W. Meijering
14 Oct 2022
A Leap among Quantum Computing and Quantum Neural Networks: A Survey
F. V. Massoli, Lucia Vadicamo, Giuseppe Amato, Fabrizio Falchi
06 Jul 2021
Quantum machine learning with differential privacy
William Watkins, Samuel Yen-Chi Chen, Shinjae Yoo
10 Mar 2021