
© 2026 ResearchTrend.AI, All rights reserved.

Lottery Hypothesis based Unsupervised Pre-training for Model Compression in Federated Learning

IEEE Vehicular Technology Conference (VTC), 2020
21 April 2020
arXiv: 2004.09817

Sohei Itahara
Takayuki Nishio
M. Morikura
Koji Yamamoto

Papers citing "Lottery Hypothesis based Unsupervised Pre-training for Model Compression in Federated Learning"

5 / 5 papers shown
A Survey of Lottery Ticket Hypothesis
Bohan Liu
Zijie Zhang
Peixiong He
Zhensen Wang
Yang Xiao
Ruimeng Ye
Yang Zhou
Wei-Shinn Ku
Bo Hui
07 Mar 2024
FedSSA: Semantic Similarity-based Aggregation for Efficient Model-Heterogeneous Personalized Federated Learning
International Joint Conference on Artificial Intelligence (IJCAI), 2023
Liping Yi
Han Yu
Zhuan Shi
Gang Wang
Xiaoguang Liu
Lizhen Cui
Xiaoxiao Li
14 Dec 2023
Efficient Federated Learning with Enhanced Privacy via Lottery Ticket Pruning in Edge Computing
IEEE Transactions on Mobile Computing (IEEE TMC), 2023
Yi Shi
Kang Wei
Li Shen
Jun Li
Xueqian Wang
Bo Yuan
Song Guo
02 May 2023
AutoFL: Enabling Heterogeneity-Aware Energy Efficient Federated Learning
IEEE/ACM International Symposium on Microarchitecture (MICRO), 2021
Young Geun Kim
Carole-Jean Wu
16 Jul 2021
Distillation-Based Semi-Supervised Federated Learning for Communication-Efficient Collaborative Training with Non-IID Private Data
Sohei Itahara
Takayuki Nishio
Yusuke Koda
M. Morikura
Koji Yamamoto
14 Aug 2020