QSAN: A Near-term Achievable Quantum Self-Attention Network
arXiv:2207.07563 · 14 July 2022
Jinjing Shi, Ren-Xin Zhao, Wenxuan Wang, Shenmin Zhang, Xuelong Li
Papers citing "QSAN: A Near-term Achievable Quantum Self-Attention Network" (10 papers shown)
Integrating Quantum-Classical Attention in Patch Transformers for Enhanced Time Series Forecasting
Sanjay Chakraborty, Fredrik Heintz · 31 Mar 2025 · [AI4TS, AIFin]

Quantum Complex-Valued Self-Attention Model
Fu Chen, Qinglin Zhao, Li Feng, Longfei Tang, Yangbin Lin, Haitao Huang · 24 Mar 2025 · [MQ]

Hybrid quantum-classical convolutional neural network for phytoplankton classification
S. Shi, Zhimin Wang, R. Shang, Yanan Li, Jiaxin Li, Guoqiang Zhong, Yongjian Gu · 07 Mar 2023

Transformer Quality in Linear Time
Weizhe Hua, Zihang Dai, Hanxiao Liu, Quoc V. Le · 21 Feb 2022

The Dawn of Quantum Natural Language Processing
R. Sipio, Jia-Hong Huang, Samuel Yen-Chi Chen, Stefano Mangini, M. Worring · 13 Oct 2021

Gaussian Kernelized Self-Attention for Long Sequence Data and Its Application to CTC-based Speech Recognition
Yosuke Kashiwagi, E. Tsunoo, Shinji Watanabe · 18 Feb 2021 · [AI4TS]

Bottleneck Transformers for Visual Recognition
A. Srinivas, Tsung-Yi Lin, Niki Parmar, Jonathon Shlens, Pieter Abbeel, Ashish Vaswani · 27 Jan 2021 · [SLR]

SG-Net: Syntax Guided Transformer for Language Representation
Zhuosheng Zhang, Yuwei Wu, Junru Zhou, Sufeng Duan, Hai Zhao, Rui-cang Wang · 27 Dec 2020

Noise-Induced Barren Plateaus in Variational Quantum Algorithms
Samson Wang, Enrico Fontana, M. Cerezo, Kunal Sharma, A. Sone, L. Cincio, Patrick J. Coles · 28 Jul 2020

Efficient Content-Based Sparse Attention with Routing Transformers
Aurko Roy, M. Saffar, Ashish Vaswani, David Grangier · 12 Mar 2020 · [MoE]