HIT Model: A Hierarchical Interaction-Enhanced Two-Tower Model for Pre-Ranking Systems

7 pages (main) + 1 page bibliography, 7 figures, 4 tables
Abstract

Online display advertising platforms rely on pre-ranking systems to efficiently filter and prioritize candidate ads from large corpora, balancing relevance to users with strict computational constraints. The prevailing two-tower architecture, though highly efficient thanks to its decoupled design and pre-caching, lacks cross-tower interaction and relies on coarse similarity metrics, undermining its capacity to model complex user-ad relationships. In this study, we propose the Hierarchical Interaction-Enhanced Two-Tower (HIT) model, a new architecture that augments the two-tower paradigm with two key components: generators that pre-generate holistic vectors incorporating coarse-grained user-ad interactions through a dual-generator framework with a cosine-similarity-based generation loss as the training objective, and multi-head representers that project embeddings into multiple latent subspaces to capture fine-grained, multi-faceted user interests and multi-dimensional ad attributes. This design enhances modeling effectiveness without compromising inference efficiency. Extensive experiments on public datasets and large-scale online A/B testing on Tencent's advertising platform demonstrate that HIT significantly outperforms several baselines in relevance metrics, yielding a 1.66% increase in Gross Merchandise Volume and a 1.55% improvement in Return on Investment, while maintaining serving latency comparable to vanilla two-tower models. The HIT model has been successfully deployed in Tencent's online display advertising system, serving billions of impressions daily. The code is available at this https URL.
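To make the multi-head representer idea concrete, here is a minimal NumPy sketch, not the authors' implementation: each tower's embedding is projected into several latent subspaces by per-head linear maps, and the user-ad score is an aggregate (here, a max) of head-pairwise cosine similarities. All dimensions, weight shapes, and the max aggregation are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine(a, b):
    # cosine similarity between two vectors
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

# Hypothetical dimensions (not specified in the abstract)
d, heads, d_head = 16, 4, 8

# Multi-head representer: one linear projection per latent subspace.
# In practice these would be learned; here they are random stand-ins.
W_user = rng.normal(size=(heads, d_head, d))
W_ad = rng.normal(size=(heads, d_head, d))

def represent(x, W):
    # Project a tower embedding into `heads` subspaces -> (heads, d_head)
    return np.stack([Wh @ x for Wh in W])

user_emb = rng.normal(size=d)  # stand-in for the user-tower output
ad_emb = rng.normal(size=d)    # stand-in for the ad-tower output

U = represent(user_emb, W_user)
A = represent(ad_emb, W_ad)

# Fine-grained matching: cosine over all head pairs, aggregated here by max
pairwise = np.array([[cosine(u, a) for a in A] for u in U])
score = pairwise.max()
print(pairwise.shape, score)
```

Because the projections remain inside each tower, ad-side representations can still be precomputed and cached; only the cheap head-pairwise cosine step runs at serving time, which is why this style of design preserves two-tower inference efficiency.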

@article{yang2025_2505.19849,
  title={HIT Model: A Hierarchical Interaction-Enhanced Two-Tower Model for Pre-Ranking Systems},
  author={Haoqiang Yang and Congde Yuan and Kun Bai and Mengzhuo Guo and Wei Yang and Chao Zhou},
  journal={arXiv preprint arXiv:2505.19849},
  year={2025}
}