arXiv:2204.06240 (Cited By)
CowClip: Reducing CTR Prediction Model Training Time from 12 hours to 10 minutes on 1 GPU

13 April 2022
Zangwei Zheng, Peng Xu, Xuan Zou, Da Tang, Zhen Li, Chenguang Xi, Peng Wu, Leqi Zou, Yijie Zhu, Ming-yue Chen, Xiangzhuo Ding, Fuzhao Xue, Ziheng Qing, Youlong Cheng, Yang You

Papers citing "CowClip: Reducing CTR Prediction Model Training Time from 12 hours to 10 minutes on 1 GPU"

4 / 4 papers shown
1. High-Performance Large-Scale Image Recognition Without Normalization
   Andrew Brock, Soham De, Samuel L. Smith, Karen Simonyan
   VLM · 11 Feb 2021

2. Distributed Hierarchical GPU Parameter Server for Massive Scale Deep Learning Ads Systems
   Weijie Zhao, Deping Xie, Ronglai Jia, Yulei Qian, Rui Ding, Mingming Sun, P. Li
   MoE · 12 Mar 2020

3. DeepFM: An End-to-End Wide & Deep Learning Framework for CTR Prediction
   Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li, Xiuqiang He, Zhenhua Dong
   12 Apr 2018

4. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
   ODL · 15 Sep 2016