Efficient Quantum Circuits for Machine Learning Activation Functions including Constant T-depth ReLU

9 April 2024 · arXiv:2404.06059
Wei Zi
Siyi Wang
Hyunji Kim
Xiaoming Sun
Anupam Chattopadhyay
P. Rebentrost
Abstract

In recent years, Quantum Machine Learning (QML) has increasingly captured the interest of researchers. Among the components in this domain, activation functions play a fundamental and indispensable role. Our research focuses on developing quantum circuits for activation functions that can be integrated into fault-tolerant quantum computing architectures, with an emphasis on minimizing T-depth. Specifically, we present novel implementations of the ReLU and leaky ReLU activation functions, achieving constant T-depths of 4 and 8, respectively. Leveraging quantum lookup tables, we extend our exploration to other activation functions such as the sigmoid. This approach lets us trade off precision against T-depth by adjusting the number of qubits, making our results adaptable to various application scenarios. This study represents a significant step towards enhancing the practicality and applicability of quantum machine learning.
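The abstract does not spell out the construction, but a common way to realize ReLU on a two's-complement register, consistent with a constant T-depth, is a copy of the data bits controlled by the sign qubit: negative inputs leave the output at zero. The Qiskit sketch below illustrates that idea only; the register names, the 4-bit width, and the sign-controlled-copy approach are illustrative assumptions, not the paper's exact circuit.

```python
# Sketch only: sign-bit-controlled copy as an out-of-place ReLU.
# Assumes a two's-complement input register; not the paper's exact construction.
from qiskit import QuantumCircuit, QuantumRegister

n = 4                                   # illustrative register width
x = QuantumRegister(n, "x")             # input; x[n-1] is the sign bit
out = QuantumRegister(n, "out")         # output register, prepared in |0...0>
qc = QuantumCircuit(x, out)

# Turn the sign bit into a "non-negative" flag.
qc.x(x[n - 1])
# Copy each data bit to the output only when the input is non-negative;
# for negative inputs the output stays |0>, i.e. ReLU(x) = 0.
# All Toffolis share the same control, so after fanning that control out
# with CNOTs (Clifford, hence T-free) they can be applied in parallel,
# giving a T-depth independent of n -- the property the paper optimizes.
for i in range(n - 1):
    qc.ccx(x[n - 1], x[i], out[i])
qc.x(x[n - 1])                          # restore the sign bit

print(qc.draw())
```

Leaky ReLU additionally requires the scaled branch for negative inputs (which the authors report at constant T-depth 8); the sketch above covers only the non-negative branch.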
