Not All Knowledge Is Created Equal: Mutual Distillation of Confident Knowledge

2 June 2021
Ziyun Li, Xinshao Wang, Diane Hu, N. Robertson, David Clifton, Christoph Meinel, Haojin Yang

Papers citing "Not All Knowledge Is Created Equal: Mutual Distillation of Confident Knowledge"

2 / 2 papers shown

Emergent Specialization: Rare Token Neurons in Language Models
Jing Liu, Haozheng Wang, Yueheng Li
MILM, LRM · 19 May 2025

Data Selection for Efficient Model Update in Federated Learning
Hongrui Shi, Valentin Radu
FedML · 05 Nov 2021