Adversarial Backdoor Attack by Naturalistic Data Poisoning on Trajectory Prediction in Autonomous Driving
Mozhgan Pourkeshavarz, Mohammad Sabokrou, Amir Rasouli
27 June 2023 · arXiv:2306.15755 · AAML

Papers citing "Adversarial Backdoor Attack by Naturalistic Data Poisoning on Trajectory Prediction in Autonomous Driving" (2 of 2 papers shown)
Realistic Adversarial Attacks for Robustness Evaluation of Trajectory Prediction Models via Future State Perturbation
J. Schumann, Jeroen Hagenus, Frederik Baymler Mathiesen, Arkady Zgonnikov
AAML · 09 May 2025

What-If Motion Prediction for Autonomous Driving
Siddhesh Khandelwal, William Qi, Jagjeet Singh, Andrew Hartnett, Deva Ramanan
24 Aug 2020