InteractRank: Personalized Web-Scale Search Pre-Ranking with Cross Interaction Features

9 April 2025
Sujay Khandagale
Bhawna Juneja
Prabhat Agarwal
Aditya Subramanian
Jaewon Yang
Yuting Wang
Abstract

Modern search systems use a multi-stage architecture to deliver personalized results efficiently. Key stages include retrieval, pre-ranking, full ranking, and blending, which progressively refine billions of items down to the final top selections. The pre-ranking stage, responsible for scoring and filtering hundreds of thousands of items down to a few thousand, typically relies on two tower models due to their computational efficiency, despite their limited ability to capture complex query-item interactions. While query-item cross interaction features are paramount for full ranking, integrating them into pre-ranking models presents efficiency-related challenges. In this paper, we introduce InteractRank, a novel two tower pre-ranking model with robust cross interaction features used at Pinterest. By incorporating historical user engagement-based query-item interactions into the scoring function alongside the two tower dot product, InteractRank significantly boosts pre-ranking performance with minimal latency and computation costs. In real-world A/B experiments at Pinterest, InteractRank improves the online engagement metric by 6.5% over a BM25 baseline and by 3.7% over a vanilla two tower baseline. We also highlight other components of InteractRank, like real-time user-sequence modeling, and analyze their contributions through offline ablation studies. The code for InteractRank is available at this https URL.
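
The scoring idea described in the abstract, combining a two tower dot product with a term derived from historical query-item engagement, can be illustrated with a minimal sketch. This is not the authors' implementation (which is linked above); the layer sizes, feature names, and the additive combination below are illustrative assumptions only.

```python
# Minimal sketch (assumptions, not the authors' code) of a two tower
# pre-ranking score augmented with an engagement-based cross-interaction term.
import torch
import torch.nn as nn


class TwoTowerWithCrossInteraction(nn.Module):
    def __init__(self, query_dim: int, item_dim: int,
                 cross_feat_dim: int, embed_dim: int = 64):
        super().__init__()
        # Independent towers keep serving cheap: item embeddings can be
        # precomputed offline, and only the query tower runs per request.
        self.query_tower = nn.Sequential(
            nn.Linear(query_dim, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, embed_dim))
        self.item_tower = nn.Sequential(
            nn.Linear(item_dim, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, embed_dim))
        # Lightweight head over precomputed cross features, e.g. historical
        # engagement counts for this query-item pair (hypothetical features).
        self.cross_head = nn.Linear(cross_feat_dim, 1)

    def forward(self, query_feats, item_feats, cross_feats):
        q = self.query_tower(query_feats)           # (B, D)
        v = self.item_tower(item_feats)             # (B, D)
        two_tower_score = (q * v).sum(dim=-1)       # dot product, (B,)
        cross_score = self.cross_head(cross_feats).squeeze(-1)  # (B,)
        # Final pre-ranking score: dot product plus cross-interaction term.
        return two_tower_score + cross_score


# Example usage with random tensors standing in for real features.
model = TwoTowerWithCrossInteraction(query_dim=32, item_dim=48, cross_feat_dim=8)
scores = model(torch.randn(4, 32), torch.randn(4, 48), torch.randn(4, 8))
print(scores.shape)  # torch.Size([4])
```

The key design point, as the abstract notes, is that the cross-interaction features are derived from logged historical engagement rather than computed by a heavy cross-attention model, so the extra scoring cost on top of the dot product stays small.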

@article{khandagale2025_2504.06609,
  title={InteractRank: Personalized Web-Scale Search Pre-Ranking with Cross Interaction Features},
  author={Sujay Khandagale and Bhawna Juneja and Prabhat Agarwal and Aditya Subramanian and Jaewon Yang and Yuting Wang},
  journal={arXiv preprint arXiv:2504.06609},
  year={2025}
}