
Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation

Abstract

Quantum neural networks (QNNs), harnessing superposition and entanglement, have shown potential to surpass classical methods in complex learning tasks but remain limited by hardware constraints and noisy conditions. In this work, we present a novel framework for transferring knowledge from classical convolutional neural networks (CNNs) to QNNs via knowledge distillation, thereby reducing the need for resource-intensive quantum training and error mitigation. We conduct extensive experiments using two parameterized quantum circuits (PQCs) with 4 and 8 qubits on the MNIST, Fashion-MNIST, and CIFAR-10 datasets. The approach demonstrates consistent accuracy improvements attributable to knowledge distilled from larger classical networks. Through ablation studies, we systematically compare state-of-the-art dimensionality reduction techniques (fully connected layers, center cropping, principal component analysis, and pooling) for compressing high-dimensional image data prior to quantum encoding. Our findings reveal that fully connected layers retain the most salient features for QNN inference, surpassing the other downsampling approaches. Additionally, we examine state-of-the-art data encoding methods (amplitude, angle, and qubit encoding) and identify amplitude encoding as the optimal strategy, yielding superior accuracy across all tested datasets and qubit configurations. Through computational analyses, we show that our distilled 4-qubit and 8-qubit QNNs achieve competitive performance while using significantly fewer parameters than their classical counterparts. Our results establish a promising paradigm for bridging classical deep learning and emerging quantum computing, paving the way for more powerful, resource-conscious models in quantum machine intelligence.
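
To make the pipeline concrete, the following is a minimal sketch of the classical-to-quantum distillation setup described in the abstract, assuming a PennyLane/PyTorch stack. The fully connected compression layer, the amplitude-encoding circuit with a StronglyEntanglingLayers ansatz, and the distillation temperature and layer sizes are illustrative assumptions for a 4-qubit student on MNIST, not the authors' exact architecture or hyperparameters.

# Sketch: knowledge distillation from a classical CNN teacher to a 4-qubit QNN student.
# Circuit ansatz, layer sizes, and temperature are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
import pennylane as qml

n_qubits = 4                                   # the paper also evaluates an 8-qubit PQC
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def qnn_circuit(features, weights):
    # Amplitude encoding: 2**n_qubits = 16 features loaded into the state vector
    qml.AmplitudeEmbedding(features, wires=range(n_qubits), normalize=True)
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class QNNStudent(nn.Module):
    def __init__(self, n_classes=10, n_layers=3):
        super().__init__()
        # Fully connected layer compresses the flattened image to 2**n_qubits features
        self.compress = nn.Linear(28 * 28, 2 ** n_qubits)
        shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
        self.q_weights = nn.Parameter(0.01 * torch.randn(shape))
        self.head = nn.Linear(n_qubits, n_classes)

    def forward(self, x):
        z = self.compress(x.flatten(1))
        q_out = torch.stack(
            [torch.stack(qnn_circuit(sample, self.q_weights)) for sample in z]
        ).float()
        return self.head(q_out)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets from the frozen classical teacher plus hard-label cross-entropy
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

In such a setup the teacher CNN is frozen; only the student's compression layer, circuit weights, and readout head are updated against distillation_loss, so the quantum model never has to be trained from scratch on hard labels alone.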

@article{hasan2025_2311.13810,
  title={Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation},
  author={Mohammad Junayed Hasan and M.R.C. Mahdy},
  journal={arXiv preprint arXiv:2311.13810},
  year={2025}
}