Multi-Object Tracking as Attention Mechanism

12 July 2023
Hiroshi Fukui
Taiki Miyagawa
Yusuke Morishita
    VOT
arXiv: 2307.05874
Abstract

We propose a conceptually simple and thus fast multi-object tracking (MOT) model that does not require any attached modules, such as the Kalman filter, Hungarian algorithm, transformer blocks, or graph networks. Conventional MOT models are built upon the multi-step modules listed above, and thus the computational cost is high. Our proposed end-to-end MOT model, TicrossNet, is composed of a base detector and a cross-attention module only. As a result, the overhead of tracking does not increase significantly even when the number of instances (N_t) increases. We show that TicrossNet runs in real time; specifically, it achieves 32.6 FPS on MOT17 and 31.0 FPS on MOT20 (Tesla V100), the latter of which includes more than 100 instances per frame. We also demonstrate that TicrossNet is robust to N_t; thus, it does not have to change the size of the base detector depending on N_t, as is often done by other models for real-time processing.

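The abstract describes tracking as a single cross-attention step between a base detector's instance embeddings in consecutive frames, replacing Kalman filtering and Hungarian matching. The sketch below illustrates that idea only; the function name, feature dimensions, and the use of plain scaled dot-product attention with a softmax are assumptions for illustration, not the authors' TicrossNet implementation.

```python
# Illustrative sketch (not the paper's code): associating detections across
# two frames with one cross-attention step, in the spirit of the abstract.
import torch
import torch.nn.functional as F

def cross_attention_association(prev_feats, curr_feats):
    """
    prev_feats: (N_prev, D) embeddings of instances tracked in frame t-1
    curr_feats: (N_curr, D) embeddings of detections in frame t
    Returns a (N_curr, N_prev) soft assignment matrix: each current
    detection attends over previous instances, so the cost grows with
    N_prev * N_curr instead of requiring iterative matching.
    """
    d = prev_feats.size(-1)
    # Scaled dot-product similarity between current queries and previous keys
    logits = curr_feats @ prev_feats.t() / d ** 0.5
    # Softmax over previous instances yields soft track assignments
    return F.softmax(logits, dim=-1)

# Usage (hypothetical shapes): greedy ID assignment from the soft matrix
prev = torch.randn(5, 128)   # 5 tracked instances from frame t-1
curr = torch.randn(6, 128)   # 6 detections in frame t
assign = cross_attention_association(prev, curr)
track_ids = assign.argmax(dim=-1)  # provisional previous-track index per detection
```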