
A Numerical Gradient Inversion Attack in Variational Quantum Neural-Networks

17 April 2025
Georgios Papadopoulos
Shaltiel Eloul
Yash Satsangi
Jamie Heredge
Niraj Kumar
Chun-Fu Chen
Marco Pistoia
Abstract

The loss landscape of Variational Quantum Neural Networks (VQNNs) is characterized by local minima whose number grows exponentially with the number of qubits. This makes it more challenging to recover information from model gradients during training than in classical Neural Networks (NNs). In this paper we present a numerical scheme that successfully reconstructs real-world, practical input training data from the gradients of trainable VQNNs. Our scheme performs gradient inversion by combining gradient estimation via the finite difference method with adaptive low-pass filtering, and is further optimized with a Kalman filter to achieve efficient convergence. Our experiments show that the algorithm can invert even batch-trained data, provided the VQNN model is sufficiently over-parameterized.
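The core loop the abstract describes (match a candidate input's gradient to the observed gradient, with the matching loss itself differentiated by finite differences and the update smoothed by filtering) can be sketched roughly as below. This is a minimal illustration, not the authors' method: the model is a toy differentiable stand-in for a VQNN, and a simple exponential smoother stands in for the paper's adaptive low-pass/Kalman filtering. All names and constants are hypothetical.

```python
import numpy as np

def finite_difference_grad(loss_fn, x, eps=1e-4):
    """Central-difference estimate of d(loss)/dx, one coordinate at a time."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e.flat[i] = eps
        grad.flat[i] = (loss_fn(x + e) - loss_fn(x - e)) / (2 * eps)
    return grad

def invert_gradients(target_grad, grad_of, x0, lr=0.1, steps=200, alpha=0.5):
    """Recover an input whose model gradient matches target_grad.

    Minimizes ||grad_of(x) - target_grad||^2 by gradient descent; the
    descent direction is itself estimated by finite differences and then
    smoothed with an exponential filter (a crude stand-in for the
    paper's adaptive low-pass / Kalman filtering).
    """
    x = x0.copy()
    smoothed = np.zeros_like(x)
    match_loss = lambda z: np.sum((grad_of(z) - target_grad) ** 2)
    for _ in range(steps):
        g = finite_difference_grad(match_loss, x)
        smoothed = alpha * smoothed + (1 - alpha) * g
        x -= lr * smoothed
    return x

# Toy stand-in for the leaked gradient: for a model f(x) = sin(w . x)
# with fixed weights w, the gradient seen by the attacker is df/dw.
w = np.array([0.7, -0.3])
grad_of = lambda x: np.cos(w @ x) * x      # d f / d w for this toy model
x_true = np.array([0.9, 0.4])
target = grad_of(x_true)                   # "observed" training gradient
x_rec = invert_gradients(target, grad_of, x0=np.array([0.1, 0.1]))
```

In this toy setting the recovered `x_rec` produces a model gradient matching the observed one; in the actual VQNN setting each `grad_of` evaluation would require estimating quantum-circuit gradients, which is where the paper's filtering machinery becomes necessary.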

@article{papadopoulos2025_2504.12806,
  title={A Numerical Gradient Inversion Attack in Variational Quantum Neural-Networks},
  author={Georgios Papadopoulos and Shaltiel Eloul and Yash Satsangi and Jamie Heredge and Niraj Kumar and Chun-Fu Chen and Marco Pistoia},
  journal={arXiv preprint arXiv:2504.12806},
  year={2025}
}