ProgRoCC: A Progressive Approach to Rough Crowd Counting

18 April 2025
Shengqin Jiang
Linfei Li
Haokui Zhang
Qingshan Liu
Amin Beheshti
Jian Yang
Anton van den Hengel
Quan Z. Sheng
Yuankai Qi
Abstract

As the number of individuals in a crowd grows, enumeration-based techniques become increasingly infeasible and their estimates increasingly unreliable. We propose instead an estimation-based version of the problem, which we label Rough Crowd Counting, that delivers better accuracy on the basis of training data that is easier to acquire. Rough crowd counting requires only rough annotations of the number of targets in an image, instead of the more traditional, and far more expensive, per-target annotations. We propose an approach to the rough crowd counting problem based on CLIP, termed ProgRoCC. Specifically, we introduce a progressive estimation learning strategy that determines the object count through a coarse-to-fine approach. This approach delivers answers quickly and outperforms the state-of-the-art in semi- and weakly-supervised crowd counting. In addition, we design a vision-language matching adapter that optimizes key-value pairs by mining effective matches between the two modalities to refine the visual features, thereby improving the final performance. Extensive experimental results on three widely adopted crowd counting datasets demonstrate the effectiveness of our method.
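
To make the two ideas in the abstract concrete, below is a minimal, hypothetical PyTorch sketch: a cross-modal adapter that refines visual tokens by attending to text (prompt) embeddings as keys and values, and a coarse-to-fine count head that first picks the number of digits and then fills in each digit. The module names, dimensions, digit-wise decoding scheme, and the random tensors standing in for CLIP embeddings are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


class VisionLanguageAdapter(nn.Module):
    """Refines visual tokens by attending to text (prompt) embeddings."""

    def __init__(self, dim: int = 512, heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, visual_tokens, text_tokens):
        # Visual tokens query the text tokens (keys/values); residual update.
        refined, _ = self.attn(visual_tokens, text_tokens, text_tokens)
        return self.norm(visual_tokens + refined)


class ProgressiveCountHead(nn.Module):
    """Estimates a count coarse-to-fine: number of digits first, then each digit."""

    def __init__(self, dim: int = 512, max_digits: int = 4):
        super().__init__()
        self.max_digits = max_digits
        self.magnitude = nn.Linear(dim, max_digits)     # coarse: how many digits
        self.digits = nn.Linear(dim, max_digits * 10)   # fine: one 10-way choice per digit

    def forward(self, pooled):
        n_digits = self.magnitude(pooled).argmax(dim=-1) + 1  # coarse estimate, shape (B,)
        digit_ids = self.digits(pooled).view(-1, self.max_digits, 10).argmax(dim=-1)
        counts = []
        for b in range(pooled.size(0)):
            d = digit_ids[b, : int(n_digits[b])]
            counts.append(int("".join(str(int(x)) for x in d)))
        return torch.tensor(counts)


# Random stand-ins for CLIP-style image patch and prompt token embeddings.
visual = torch.randn(2, 49, 512)   # e.g. 7x7 patch tokens per image
text = torch.randn(2, 16, 512)     # prompt token embeddings
adapter = VisionLanguageAdapter()
head = ProgressiveCountHead()
pooled = adapter(visual, text).mean(dim=1)
print(head(pooled))                # rough per-image count estimates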

@article{jiang2025_2504.13405,
  title={ProgRoCC: A Progressive Approach to Rough Crowd Counting},
  author={Shengqin Jiang and Linfei Li and Haokui Zhang and Qingshan Liu and Amin Beheshti and Jian Yang and Anton van den Hengel and Quan Z. Sheng and Yuankai Qi},
  journal={arXiv preprint arXiv:2504.13405},
  year={2025}
}