Accelerating Recommender Model Training by Dynamically Skipping Stale Embeddings

22 March 2024
Authors: Yassaman Ebrahimzadeh Maboud, Muhammad Adnan, Divyat Mahajan, Prashant J. Nair
Community: AI4TS
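The listing gives only the paper's title, so as a rough illustration of the idea the title names, the sketch below shows one way "skipping stale embeddings" could look in a PyTorch training loop: embedding rows whose recent updates are negligible are treated as stale and frozen for a few steps. The staleness rule, thresholds, and all names here are assumptions made for illustration; this is not the paper's actual algorithm.

```python
# Minimal, hypothetical sketch: freeze ("skip") embedding rows whose recent
# updates are negligible, so their gradient/optimizer work can be avoided.
# The staleness criterion and constants below are illustrative assumptions.
import torch
import torch.nn as nn

NUM_ROWS, DIM = 1000, 16
STALE_THRESHOLD = 1e-4   # assumed: update norm below this marks a row "stale"
SKIP_STEPS = 10          # assumed: how long a stale row stays frozen

emb = nn.Embedding(NUM_ROWS, DIM, sparse=True)
opt = torch.optim.SGD(emb.parameters(), lr=0.1)
skip_until = torch.zeros(NUM_ROWS, dtype=torch.long)  # step until which each row is skipped

def train_step(step, ids, targets):
    """One toy step: look up rows, compute a loss, and update only non-stale rows."""
    active = ids[skip_until[ids] <= step]          # drop rows currently marked stale
    if active.numel() == 0:
        return
    before = emb.weight[active].detach().clone()   # snapshot to measure the update size

    opt.zero_grad()
    out = emb(active).sum(dim=1)                   # toy prediction from the embedding rows
    loss = torch.nn.functional.mse_loss(out, targets[: active.numel()])
    loss.backward()
    opt.step()

    # Rows whose update was negligible are marked stale for the next SKIP_STEPS steps.
    delta = (emb.weight[active].detach() - before).norm(dim=1)
    stale_rows = active[delta < STALE_THRESHOLD]
    skip_until[stale_rows] = step + SKIP_STEPS

for step in range(100):
    ids = torch.randint(0, NUM_ROWS, (32,))
    targets = torch.randn(32)
    train_step(step, ids, targets)
```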

Papers citing "Accelerating Recommender Model Training by Dynamically Skipping Stale Embeddings"

2 papers shown:

  1. "Distributed Hierarchical GPU Parameter Server for Massive Scale Deep Learning Ads Systems" (12 Mar 2020). Weijie Zhao, Deping Xie, Ronglai Jia, Yulei Qian, Rui Ding, Mingming Sun, P. Li. Tag: MoE.
  2. "RecNMP: Accelerating Personalized Recommendation with Near-Memory Processing" (30 Dec 2019). Liu Ke, Udit Gupta, Carole-Jean Wu, B. Cho, Mark Hempstead, ..., Dheevatsa Mudigere, Maxim Naumov, Martin D. Schatz, M. Smelyanskiy, Xiaodong Wang.