Data Poisoning Attacks to Locally Differentially Private Range Query Protocols

5 March 2025
Ting-Wei Liao
Chih-Hsun Lin
Yu-Lin Tsai
Takao Murakami
Chia-Mu Yu
Jun Sakuma
Chun-Ying Huang
Hiroaki Kikuchi
Abstract

Local Differential Privacy (LDP) has been widely adopted to protect user privacy in decentralized data collection. However, recent studies have revealed that LDP protocols are vulnerable to data poisoning attacks, in which malicious users manipulate their reported data to distort aggregated results. In this work, we present the first study of data poisoning attacks targeting LDP range query protocols, covering both tree-based and grid-based approaches. We identify three key challenges in executing such attacks: crafting consistent and effective fake data, maintaining data consistency across levels or grids, and evading server-side detection. To address the first two challenges, we propose novel, provably optimal attack methods, a tree-based attack and a grid-based attack, designed to manipulate range query results with high effectiveness. Our key finding is that the common post-processing procedure in LDP range query protocols, Norm-Sub, can massively amplify the attacker's effectiveness. To address the third challenge, we study a potential countermeasure and propose an adaptive attack capable of evading it. We evaluate our methods through theoretical analysis and extensive experiments on synthetic and real-world datasets. Our results show that the proposed attacks can significantly inflate the estimates of arbitrary range queries by manipulating only a small fraction of users, giving each malicious user 5-10x the influence of a normal user on the estimation.
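For context on the Norm-Sub procedure the abstract highlights: as commonly described in the LDP literature, Norm-Sub restores consistency by shifting all noisy frequency estimates down by a constant, clipping negatives to zero, and iterating until the positive entries sum to the target total. The sketch below illustrates that general idea; it is an assumption-based illustration, not the specific variant analyzed or attacked in this paper.

```python
def norm_sub(estimates, total=1.0):
    """Norm-Sub consistency post-processing (generic sketch).

    Repeatedly subtracts a uniform offset from the positive entries and
    clips negatives to zero, until the surviving positive entries sum to
    `total`. Returns a non-negative list summing to `total` (or all zeros
    if no entry is positive).
    """
    est = list(estimates)
    while True:
        positive = [v for v in est if v > 0]
        if not positive:
            return [0.0] * len(est)
        # Offset that would make the current positive entries sum to `total`.
        delta = (sum(positive) - total) / len(positive)
        adjusted = [max(0.0, v - delta) if v > 0 else 0.0 for v in est]
        # Converged once no additional entries were clipped to zero.
        if sum(1 for v in adjusted if v > 0) == len(positive):
            return adjusted
        est = adjusted

# Noisy frequency estimates from an LDP aggregator may be negative or
# sum to more than 1; Norm-Sub repairs both properties at once.
print(norm_sub([0.5, 0.4, 0.3, -0.1]))
```

Because the clipping step zeroes out small or negative estimates and redistributes the correction over the remaining entries, an attacker who inflates a few target cells can, as the paper's key finding notes, have that inflation amplified rather than suppressed by the consistency step.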

@article{liao2025_2503.03454,
  title={Data Poisoning Attacks to Locally Differentially Private Range Query Protocols},
  author={Ting-Wei Liao and Chih-Hsun Lin and Yu-Lin Tsai and Takao Murakami and Chia-Mu Yu and Jun Sakuma and Chun-Ying Huang and Hiroaki Kikuchi},
  journal={arXiv preprint arXiv:2503.03454},
  year={2025}
}