HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression

16 October 2021
Chenhe Dong, Yaliang Li, Ying Shen, Minghui Qiu
VLM
ArXiv (abs) · PDF · HTML · GitHub (4★)

Papers citing "HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression"

3 / 3 papers shown
Learning Task-Agnostic Representations through Multi-Teacher Distillation
Philippe Formont, Maxime Darrin, Banafsheh Karimian, Jackie Chi Kit Cheung, Eric Granger, Ismail Ben Ayed, Mohammadhadi Shateri, Pablo Piantanida
155 · 0 · 0
21 Oct 2025
Direct Distillation between Different Domains
European Conference on Computer Vision (ECCV), 2024
Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Qiufeng Wang, Chen Gong, Masashi Sugiyama
315 · 6 · 0
12 Jan 2024
Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
Shicheng Tan, Weng Lam Tam, Yuanchun Wang, Wenwen Gong, Shuo Zhao, Peng Zhang, Jie Tang
VLM
140 · 1 · 0
11 Jun 2023