If At First You Don't Succeed: Test Time Re-ranking for Zero-shot, Cross-domain Retrieval

30 March 2023
Finlay G. C. Hudson
William A. P. Smith
    ViT
Abstract

In this paper, we introduce a novel method for zero-shot, cross-domain image retrieval. Our key contribution is a test-time Iterative Cluster-free Re-ranking process that leverages gallery-gallery feature information to establish semantic links between query and gallery images. This enables the retrieval of relevant images even when they do not exhibit similar visual features but share underlying semantic concepts. It can be combined with any pre-existing cross-domain feature extraction backbone to improve retrieval performance. When combined with a carefully chosen Vision Transformer backbone and a combination of zero-shot retrieval losses, our approach yields state-of-the-art results on the Sketchy, TU-Berlin and QuickDraw sketch-based retrieval benchmarks. We show that our re-ranking also improves performance with other backbones and outperforms other re-ranking methods applied with our backbone. Importantly, unlike many previous methods, none of the components in our approach are engineered specifically towards the sketch-based image retrieval task; it can be generally applied to any cross-domain, zero-shot retrieval task. We therefore also present new results on zero-shot cartoon-to-photo and art-to-product retrieval using the Office-Home dataset. Project page: this http URL, code available at: this http URL
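To make the idea concrete, here is a minimal, hypothetical sketch of test-time re-ranking that uses gallery-gallery structure. This is not the paper's exact Iterative Cluster-free Re-ranking algorithm; the update rule and the parameters `k`, `iters`, and `alpha` are illustrative assumptions, showing only the general mechanism of iteratively refining a query's ranking using its current nearest gallery neighbours.

```python
import numpy as np

def iterative_rerank(query, gallery, k=3, iters=2, alpha=0.5):
    """Illustrative test-time re-ranking sketch (NOT the paper's exact
    algorithm). The query representation is iteratively mixed with the
    mean feature of its current top-k gallery neighbours, so that
    gallery-gallery similarity can surface semantically related items
    that the raw query-gallery similarity misses.

    query:   (d,) feature vector
    gallery: (n, d) feature matrix
    Returns gallery indices ranked best-first.
    """
    q = query / np.linalg.norm(query)
    G = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    for _ in range(iters):
        sims = G @ q                      # cosine similarity to each gallery item
        topk = np.argsort(-sims)[:k]      # current nearest gallery neighbours
        # Pull the query towards the neighbourhood mean, then re-normalise.
        q = (1 - alpha) * q + alpha * G[topk].mean(axis=0)
        q /= np.linalg.norm(q)
    return np.argsort(-(G @ q))           # final ranking, best first

# Usage with random stand-in features:
rng = np.random.default_rng(0)
ranking = iterative_rerank(rng.normal(size=8), rng.normal(size=(20, 8)))
print(ranking[:5])
```

Because the update only touches the query side at test time, a refinement like this can sit on top of any frozen feature-extraction backbone, which is the property the abstract emphasises.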

@article{hudson2025_2303.17703,
  title={If At First You Don't Succeed: Test Time Re-ranking for Zero-shot, Cross-domain Retrieval},
  author={Finlay G. C. Hudson and William A. P. Smith},
  journal={arXiv preprint arXiv:2303.17703},
  year={2025}
}