
Spirit Distillation: Precise Real-time Semantic Segmentation of Road Scenes with Insufficient Data

25 March 2021 · arXiv: 2103.13733
Authors: Zhiyuan Wu, Yu-Gang Jiang, Chupeng Cui, Zongmin Yang, Xinhui Xue, Hong Qi

Papers citing "Spirit Distillation: Precise Real-time Semantic Segmentation of Road Scenes with Insufficient Data"

4 papers shown
Knowledge Distillation in Federated Edge Learning: A Survey
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao
Community: FedML · 14 Jan 2023
FedICT: Federated Multi-task Distillation for Multi-access Edge Computing
IEEE Transactions on Parallel and Distributed Systems (TPDS), 2023
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Xue Jiang, Bo Gao
01 Jan 2023
Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation
ACM Transactions on Intelligent Systems and Technology (ACM TIST), 2022
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Junbo Zhang, Zeju Li, Qing Liu
Community: FedML · 14 Apr 2022
TransKD: Transformer Knowledge Distillation for Efficient Semantic Segmentation
R. Liu, Kailun Yang, Alina Roitberg, Kailai Li, Kunyu Peng, Huayao Liu, Yaonan Wang, Rainer Stiefelhagen
Community: ViT · 27 Feb 2022