Poisoning Attacks to Local Differential Privacy Protocols for Trajectory Data

6 March 2025
I-Jung Hsu
Chih-Hsun Lin
Chia-Mu Yu
Sy-Yen Kuo
Chun-Ying Huang
Abstract

Trajectory data, which tracks movements through geographic locations, is crucial for improving real-world applications. However, collecting such sensitive data raises considerable privacy concerns. Local differential privacy (LDP) offers a solution by allowing individuals to locally perturb their trajectory data before sharing it. Despite its privacy benefits, LDP protocols are vulnerable to data poisoning attacks, where attackers inject fake data to manipulate aggregated results. In this work, we make the first attempt to analyze vulnerabilities in several representative LDP trajectory protocols. We propose TraP, a heuristic algorithm for data Poisoning attacks that uses a prefix-suffix method to optimize fake Trajectory selection, significantly reducing computational complexity. Our experimental results demonstrate that our attack can substantially increase target pattern occurrences in the perturbed trajectory dataset with few fake users. This study underscores the urgent need for robust defenses and better protocol designs to safeguard LDP trajectory data against malicious manipulation.
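The abstract relies on two mechanisms: users locally perturbing their reports under LDP, and attackers injecting fake users to skew the aggregated result. The sketch below is not the paper's TraP algorithm or any of the trajectory protocols it evaluates; it is a minimal, hypothetical illustration using generalized randomized response (GRR) over a small, made-up location domain, showing how a few fake reports that always claim a target value can inflate its estimated count.

```python
import math
import random
from collections import Counter


def grr_perturb(true_value, domain, eps):
    """Generalized randomized response: keep the true value with
    probability p = e^eps / (e^eps + k - 1), otherwise report a
    uniformly random other value from the domain."""
    k = len(domain)
    p = math.exp(eps) / (math.exp(eps) + k - 1)
    if random.random() < p:
        return true_value
    return random.choice([v for v in domain if v != true_value])


def grr_estimate(reports, domain, eps):
    """Unbiased count estimates recovered from perturbed reports."""
    k = len(domain)
    n = len(reports)
    p = math.exp(eps) / (math.exp(eps) + k - 1)
    q = 1.0 / (math.exp(eps) + k - 1)
    obs = Counter(reports)
    return {v: (obs.get(v, 0) - n * q) / (p - q) for v in domain}


if __name__ == "__main__":
    random.seed(0)
    domain = ["A", "B", "C", "D"]   # hypothetical grid cells, not from the paper
    eps = 1.0
    genuine = ["A"] * 500 + ["B"] * 300 + ["C"] * 150 + ["D"] * 50

    honest = [grr_perturb(v, domain, eps) for v in genuine]
    print("honest estimate:  ", grr_estimate(honest, domain, eps))

    # Poisoning: 50 fake users ignore the protocol and always report "D",
    # the attacker's target, inflating its estimated frequency.
    poisoned = honest + ["D"] * 50
    print("poisoned estimate:", grr_estimate(poisoned, domain, eps))
```

The same intuition extends to trajectories, where the attacker targets a location pattern rather than a single value and must pick fake trajectories that maximize the pattern's estimated occurrences, which is the selection problem TraP's prefix-suffix heuristic addresses.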

@article{hsu2025_2503.07483,
  title={Poisoning Attacks to Local Differential Privacy Protocols for Trajectory Data},
  author={I-Jung Hsu and Chih-Hsun Lin and Chia-Mu Yu and Sy-Yen Kuo and Chun-Ying Huang},
  journal={arXiv preprint arXiv:2503.07483},
  year={2025}
}