ResearchTrend.AI

AllTracker: Efficient Dense Point Tracking at High Resolution

8 June 2025
Adam W. Harley
Yang You
Xinglong Sun
Yang Zheng
Nikhil Raghuraman
Yunqi Gu
Sheldon Liang
Wen-Hsuan Chu
Achal Dave
P. Tokmakov
Suya You
Rares Andrei Ambrus
Katerina Fragkiadaki
Leonidas Guibas
arXiv (abs) · PDF · HTML
Main: 7 pages · 8 figures · Bibliography: 3 pages · 11 tables · Appendix: 4 pages
Abstract

We introduce AllTracker: a model that estimates long-range point tracks by way of estimating the flow field between a query frame and every other frame of a video. Unlike existing point tracking methods, our approach delivers high-resolution and dense (all-pixel) correspondence fields, which can be visualized as flow maps. Unlike existing optical flow methods, our approach corresponds one frame to hundreds of subsequent frames, rather than just the next frame. We develop a new architecture for this task, blending techniques from existing work in optical flow and point tracking: the model performs iterative inference on low-resolution grids of correspondence estimates, propagating information spatially via 2D convolution layers, and propagating information temporally via pixel-aligned attention layers. The model is fast and parameter-efficient (16 million parameters), and delivers state-of-the-art point tracking accuracy at high resolution (i.e., tracking 768x1024 pixels, on a 40G GPU). A benefit of our design is that we can train on a wider set of datasets, and we find that doing so is crucial for top performance. We provide an extensive ablation study on our architecture details and training recipe, making it clear which details matter most. Our code and model weights are available at this https URL.
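The abstract describes an iterative update pattern: low-resolution grids of correspondence estimates are refined by alternating pixel-aligned attention (mixing information across frames at each spatial location) with 2D convolutions (mixing information spatially within each frame). Below is a minimal NumPy sketch of that pattern only, assuming toy stand-ins throughout: the unlearned averaging "conv", the identity query/key/value projections, and the `0.1 * h[..., :2]` update head are all illustrative placeholders, not AllTracker's actual layers.

```python
import numpy as np

def temporal_attention(feats):
    # feats: (T, H, W, C). Pixel-aligned attention: for each spatial
    # location (h, w), attend across the T frames, so information is
    # propagated temporally without mixing spatial positions.
    T, H, W, C = feats.shape
    q = k = v = feats  # real models use learned projections (assumption here)
    scores = np.einsum('thwc,shwc->hwts', q, k) / np.sqrt(C)  # (H, W, T, T)
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return np.einsum('hwts,shwc->thwc', attn, v)

def spatial_mix(feats, kernel_size=3):
    # Toy stand-in for learned 2D convolutions: a 3x3 spatial average
    # applied per frame, propagating information across the grid.
    T, H, W, C = feats.shape
    pad = kernel_size // 2
    padded = np.pad(feats, ((0, 0), (pad, pad), (pad, pad), (0, 0)), mode='edge')
    out = np.zeros_like(feats)
    for dy in range(kernel_size):
        for dx in range(kernel_size):
            out += padded[:, dy:dy + H, dx:dx + W, :]
    return out / (kernel_size * kernel_size)

def iterative_update(flow, feats, n_iters=4):
    # flow: (T, H, W, 2) low-resolution correspondence estimates.
    # Each iteration mixes information temporally, then spatially, then
    # emits a residual flow update (a toy, untrained head here).
    for _ in range(n_iters):
        h = temporal_attention(feats)
        h = spatial_mix(h)
        flow = flow + 0.1 * h[..., :2]  # hypothetical update head
        feats = h
    return flow
```

The key structural point the sketch keeps is the factorization: attention operates only along the time axis at each pixel, convolution only along the spatial axes within each frame, and the two are alternated across refinement iterations.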

@article{harley2025_2506.07310,
  title={AllTracker: Efficient Dense Point Tracking at High Resolution},
  author={Adam W. Harley and Yang You and Xinglong Sun and Yang Zheng and Nikhil Raghuraman and Yunqi Gu and Sheldon Liang and Wen-Hsuan Chu and Achal Dave and Pavel Tokmakov and Suya You and Rares Ambrus and Katerina Fragkiadaki and Leonidas J. Guibas},
  journal={arXiv preprint arXiv:2506.07310},
  year={2025}
}