An Empirical Study of Knowledge Distillation for Code Understanding Tasks

arXiv:2508.15423

21 August 2025
Ruiqi Wang, Zezhou Yang, Cuiyun Gao, Xin Xia, Qing Liao
arXiv (abs) · PDF · HTML · GitHub (1★)

Papers citing "An Empirical Study of Knowledge Distillation for Code Understanding Tasks" (1 of 1 shown):

  • A Metamorphic Testing Perspective on Knowledge Distillation for Language Models of Code: Does the Student Deeply Mimic the Teacher?
    Md. Abdul Awal, Mrigank Rochan, Chanchal K. Roy
    07 Nov 2025