Towards Federated Learning Under Resource Constraints via Layer-wise Training and Depth Dropout

11 September 2023
Pengfei Guo, Warren Morningstar, Raviteja Vemulapalli, K. Singhal, Vishal M. Patel, Philip Mansfield
FedML
ArXiv (abs) · PDF · HTML

Papers citing "Towards Federated Learning Under Resource Constraints via Layer-wise Training and Depth Dropout"

2 / 2 papers shown
Resource-Efficient Federated Multimodal Learning via Layer-wise and Progressive Training
Ye Lin Tun, Chu Myaet Thwal, Minh N. H. Nguyen, Choong Seon Hong
177 · 3 · 0
22 Jul 2024

LW-FedSSL: Resource-efficient Layer-wise Federated Self-supervised Learning
Ye Lin Tun, Chu Myaet Thwal, Le Quang Huy, Minh N. H. Nguyen, Choong Seon Hong
FedML
340 · 2 · 0
22 Jan 2024