A Selective Review on Statistical Methods for Massive Data Computation: Distributed Computing, Subsampling, and Minibatch Techniques

arXiv:2403.11163 · 17 March 2024
Xuetong Li
Yuan Gao
Hong Chang
Danyang Huang
Yingying Ma
Rui Pan
Haobo Qi
Feifei Wang
Shuyuan Wu
Ke Xu
Jing Zhou
Xuening Zhu
Yingqiu Zhu
Hansheng Wang

Papers citing "A Selective Review on Statistical Methods for Massive Data Computation: Distributed Computing, Subsampling, and Minibatch Techniques"

2 of 2 papers shown
Network Gradient Descent Algorithm for Decentralized Federated Learning
Shuyuan Wu, Danyang Huang, Hansheng Wang · FedML · 06 May 2022
Optimal subsampling for quantile regression in big data
Haiying Wang, Yanyuan Ma · 28 Jan 2020