Large-scale Knowledge Distillation with Elastic Heterogeneous Computing Resources
arXiv: 2207.06667

14 July 2022
Ji Liu, Daxiang Dong, Xi Wang, An Qin, Xingjian Li, P. Valduriez, Dejing Dou, Dianhai Yu

Papers citing "Large-scale Knowledge Distillation with Elastic Heterogeneous Computing Resources"

2 papers
1. Distributed and Deep Vertical Federated Learning with Big Data
   Ji Liu, Xuehai Zhou, L. Mo, Shilei Ji, Yuan Liao, Z. Li, Qinhua Gu, Dejing Dou
   FedML
08 Mar 2023
2. Large scale distributed neural network training through online distillation
   Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
   FedML
09 Apr 2018