arXiv: 2111.05897
Persia: An Open, Hybrid System Scaling Deep Learning-based Recommenders up to 100 Trillion Parameters
10 November 2021
Xiangru Lian, Binhang Yuan, Xuefeng Zhu, Yulong Wang, Yongjun He, Honghuan Wu, Lei Sun, H. Lyu, Chengjun Liu, Xing Dong, Yiqiao Liao, Mingnan Luo, Congfei Zhang, Jingru Xie, Haonan Li, Lei Chen, Renjie Huang, Jianying Lin, Chengchun Shu, Xue-Bo Qiu, Zhishan Liu, Dongying Kong, Lei Yuan, Hai-bo Yu, Sen Yang, Ce Zhang, Ji Liu
Papers citing "Persia: An Open, Hybrid System Scaling Deep Learning-based Recommenders up to 100 Trillion Parameters" (6 of 6 papers shown)
- "Models Are Codes: Towards Measuring Malicious Code Poisoning Attacks on Pre-trained Model Hubs" — Jian Zhao, Shenao Wang, Yanjie Zhao, Xinyi Hou, Kailong Wang, Peiming Gao, Yuanchao Zhang, Chen Wei, Haoyu Wang (14 Sep 2024)
- "Fine-Grained Embedding Dimension Optimization During Training for Recommender Systems" — Qinyi Luo, Penghan Wang, Wei Zhang, Fan Lai, Jiachen Mao, ..., Jun Song, Wei-Yu Tsai, Shuai Yang, Yuxi Hu, Xuehai Qian (09 Jan 2024)
- "CowClip: Reducing CTR Prediction Model Training Time from 12 hours to 10 minutes on 1 GPU" [VLM] — Zangwei Zheng, Peng Xu, Xuan Zou, Da Tang, Zhen Li, ..., Xiangzhuo Ding, Fuzhao Xue, Ziheng Qing, Youlong Cheng, Yang You (13 Apr 2022)
- "Deep Learning Training in Facebook Data Centers: Design of Scale-up and Scale-out Systems" [GNN] — Maxim Naumov, John Kim, Dheevatsa Mudigere, Srinivas Sridharan, Xiaodong Wang, ..., Krishnakumar Nair, Isabel Gao, Bor-Yiing Su, Jiyan Yang, M. Smelyanskiy (20 Mar 2020)
- "Distributed Hierarchical GPU Parameter Server for Massive Scale Deep Learning Ads Systems" [MoE] — Weijie Zhao, Deping Xie, Ronglai Jia, Yulei Qian, Rui Ding, Mingming Sun, P. Li (12 Mar 2020)
- "Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism" [MoE] — M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro (17 Sep 2019)