The Evolution of Embedding Table Optimization and Multi-Epoch Training in Pinterest Ads Conversion

8 May 2025
Andrew Qiu
Shubham Barhate
Hin Wai Lui
Runze Su
Rafael Rios Müller
Kungang Li
Ling Leng
Han Sun
Shayan Ehsani
Zhifang Liu
Abstract

Deep learning for conversion prediction has found widespread application in online advertising. These models have become more complex as they are trained to jointly predict multiple objectives such as click, add-to-cart, checkout, and other conversion types. Additionally, the capacity and performance of these models can often be increased with the use of embedding tables that encode high-cardinality categorical features such as advertiser, user, campaign, and product identifiers (IDs). These embedding tables can be pre-trained, but they can also be learned end-to-end jointly with the model to directly optimize the model objectives. Training these large tables is challenging due to gradient sparsity, the high cardinality of the categorical features, the non-uniform distribution of IDs, and very high label sparsity. These issues make training prone to both slow convergence and overfitting after the first epoch. Previous works addressed the multi-epoch overfitting issue using stronger feature hashing to reduce cardinality, filtering of low-frequency IDs, regularization of the embedding tables, re-initialization of the embedding tables after each epoch, and similar techniques. Some of these techniques reduce overfitting at the expense of model performance if applied too aggressively. In this paper, we share key learnings from the development of embedding table optimization and multi-epoch training in Pinterest Ads Conversion models. We showcase how our Sparse Optimizer speeds up convergence, and how multi-epoch overfitting varies in severity between different objectives in a multi-task model depending on label sparsity. We propose a new approach to multi-epoch overfitting, a frequency-adaptive learning rate on the embedding tables, and compare it to embedding re-initialization. We evaluate both methods offline using an industrial large-scale production dataset.
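
The frequency-adaptive learning rate described in the abstract can be pictured as scaling each embedding row's step size by how often its ID has been seen, so that frequently occurring IDs take smaller steps and are less prone to multi-epoch overfitting. The sketch below is a minimal, illustrative PyTorch example and not the paper's implementation: the FreqAdaptiveEmbedding class, the 1/sqrt(1 + count) schedule, and the manual sparse_step update are assumptions made purely for illustration.

import torch
import torch.nn as nn


class FreqAdaptiveEmbedding(nn.Module):
    """Embedding table whose rows are updated with a step size that shrinks
    as the corresponding ID is seen more often (illustrative 1/sqrt schedule)."""

    def __init__(self, num_ids: int, dim: int, base_lr: float = 0.05):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_ids, dim) * 0.01)
        self.base_lr = base_lr
        # Running count of how many batches each ID has appeared in.
        self.register_buffer("counts", torch.zeros(num_ids))

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        return self.weight[ids]

    @torch.no_grad()
    def sparse_step(self, ids: torch.Tensor) -> None:
        # Update only the rows touched in this batch, each with its own
        # frequency-adaptive learning rate: base_lr / sqrt(1 + count).
        rows = ids.flatten().unique()
        self.counts[rows] += 1.0
        row_lr = self.base_lr / torch.sqrt(1.0 + self.counts[rows])
        self.weight[rows] -= row_lr.unsqueeze(1) * self.weight.grad[rows]


# Toy usage: frequent IDs receive progressively smaller updates across epochs.
emb = FreqAdaptiveEmbedding(num_ids=1000, dim=16)
ids = torch.randint(0, 1000, (32,))
loss = emb(ids).sum()   # stand-in for the conversion-prediction loss
loss.backward()
emb.sparse_step(ids)
emb.weight.grad = None  # clear the gradient before the next batch

In a production system this per-row scaling would typically live inside a sparse optimizer rather than the embedding module itself; the paper evaluates this style of frequency-adaptive scheduling against re-initializing the embedding tables between epochs.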

@article{qiu2025_2505.05605,
  title={The Evolution of Embedding Table Optimization and Multi-Epoch Training in Pinterest Ads Conversion},
  author={Andrew Qiu and Shubham Barhate and Hin Wai Lui and Runze Su and Rafael Rios Müller and Kungang Li and Ling Leng and Han Sun and Shayan Ehsani and Zhifang Liu},
  journal={arXiv preprint arXiv:2505.05605},
  year={2025}
}