ResearchTrend.AI

arXiv:2209.01143
Future Gradient Descent for Adapting the Temporal Shifting Data Distribution in Online Recommendation Systems

2 September 2022
Mao Ye
Ruichen Jiang
Haoxiang Wang
Dhruv Choudhary
Xiaocong Du
Bhargav Bhushanam
Aryan Mokhtari
A. Kejariwal
Qiang Liu
    TTA
    OOD
    AI4TS
Abstract

One of the key challenges in learning an online recommendation model is temporal domain shift, which causes a mismatch between the training and testing data distributions and hence a temporal domain generalization error. To overcome this, we propose to learn a meta future gradient generator that forecasts the gradient information of the future data distribution for training, so that the recommendation model can be trained as if we were able to look ahead at the future of its deployment. Compared with Batch Update, a widely used paradigm, our theory suggests that the proposed algorithm achieves a smaller temporal domain generalization error, measured by a gradient-variation term in a local regret. We demonstrate the empirical advantage by comparing against a range of representative baselines.
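The core idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the linear model, the drifting data stream, and the `FutureGradientGenerator` class below are all hypothetical stand-ins. The sketch shows the general pattern: instead of stepping on the gradient of the current batch, a meta generator maps that gradient to a forecast of the gradient under the (shifted) future distribution, and the model steps on the forecast.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4


def loss_grad(w, X, y):
    """Gradient of mean squared error for a linear model y ~ X @ w."""
    return 2.0 * X.T @ (X @ w - y) / len(y)


class FutureGradientGenerator:
    """Illustrative meta generator: a linear map g_future ~ A @ g_now,
    fitted online by regressing each observed gradient on the previous one."""

    def __init__(self, dim, lr=0.01):
        self.A = np.eye(dim)
        self.lr = lr

    def forecast(self, g_now):
        return self.A @ g_now

    def update(self, g_prev, g_now):
        # One SGD step on || A @ g_prev - g_now ||^2.
        err = self.A @ g_prev - g_now
        self.A -= self.lr * np.outer(err, g_prev)


# Temporally shifting stream: the true weights drift a little each round.
w_true = rng.normal(size=dim)
drift = 0.05 * rng.normal(size=dim)


def batch(t, n=64):
    X = rng.normal(size=(n, dim))
    y = X @ (w_true + t * drift)
    return X, y


w = np.zeros(dim)
gen = FutureGradientGenerator(dim)
prev_grad = None
for t in range(50):
    X, y = batch(t)
    g_now = loss_grad(w, X, y)
    if prev_grad is not None:
        # Today's gradient is yesterday's "future": fit the generator on it.
        gen.update(prev_grad, g_now)
    # Step on the forecasted future gradient rather than the current one.
    w -= 0.1 * gen.forecast(g_now)
    prev_grad = g_now
```

In the Batch Update paradigm the model would step directly on `g_now`; here the generator's forecast lets the update anticipate where the distribution is heading, which is the intuition behind the smaller gradient-variation term in the paper's local-regret bound.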
