ResearchTrend.AI
arXiv: 2401.00622
Federated Class-Incremental Learning with New-Class Augmented Self-Distillation

1 January 2024
Zhiyuan Wu, Tianliu He, Sheng Sun, Yuwei Wang, Min Liu, Bo Gao, Xue Jiang
Tags: CLL, FedML

Papers citing "Federated Class-Incremental Learning with New-Class Augmented Self-Distillation"

3 / 3 papers shown
Knowledge Distillation in Federated Edge Learning: A Survey
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao
Tags: FedML · 14 Jan 2023
FedML: A Research Library and Benchmark for Federated Machine Learning
Chaoyang He, Songze Li, Jinhyun So, Xiao Zeng, Mi Zhang, ..., Yang Liu, Ramesh Raskar, Qiang Yang, M. Annavaram, Salman Avestimehr
Tags: FedML · 27 Jul 2020
Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
Tags: FedML · 09 Apr 2018