TARS: Traffic-Aware Radar Scene Flow Estimation

13 March 2025
Jialong Wu
Marco Braun
Dominic Spata
Matthias Rottmann
Abstract

Scene flow provides crucial motion information for autonomous driving. Recent LiDAR scene flow models utilize the rigid-motion assumption at the instance level, assuming objects are rigid bodies. However, these instance-level methods are not suitable for sparse radar point clouds. In this work, we present a novel Traffic-Aware Radar Scene flow estimation method, named TARS, which utilizes the motion rigidity at the traffic level. To address the challenges in radar scene flow, we perform object detection and scene flow jointly and boost the latter. We incorporate the feature map from the object detector, trained with detection losses, to make radar scene flow aware of the environment and road users. Therefrom, we construct a Traffic Vector Field (TVF) in the feature space, enabling a holistic traffic-level scene understanding in our scene flow branch. When estimating the scene flow, we consider both point-level motion cues from point neighbors and traffic-level consistency of rigid motion within the space. TARS outperforms the state of the art on a proprietary dataset and the View-of-Delft dataset, improving the benchmarks by 23% and 15%, respectively.
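As a minimal sketch of the rigid-motion assumption the abstract builds on (this is an illustration, not the authors' implementation): every point on a rigid body undergoing rotation R and translation t has scene flow f(p) = Rp + t − p, so a single motion hypothesis constrains the flow of all points on that body.

```python
import numpy as np

def rigid_flow(points, yaw, translation):
    """Scene flow induced by a planar rigid motion (yaw about z, then shift).

    Under the rigid-motion assumption, f(p) = R @ p + t - p for every
    point p on the moving body; TARS applies such rigidity at the
    traffic level rather than per detected instance.
    """
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points @ R.T + translation - points

# A sparse "radar" point cloud on one vehicle (hypothetical coordinates):
pts = np.array([[10.0, 2.0, 0.5],
                [10.5, 2.2, 0.6],
                [11.0, 1.8, 0.4]])
flow = rigid_flow(pts, yaw=0.05, translation=np.array([1.0, 0.0, 0.0]))
print(flow.shape)  # (3, 3): one 3-D flow vector per radar point
```

With yaw = 0 the flow reduces to the pure translation for every point, which is the degenerate case the consistency term would enforce across a straight-moving vehicle.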

@article{wu2025_2503.10210,
  title={TARS: Traffic-Aware Radar Scene Flow Estimation},
  author={Jialong Wu and Marco Braun and Dominic Spata and Matthias Rottmann},
  journal={arXiv preprint arXiv:2503.10210},
  year={2025}
}