Does Knowledge Distillation Matter for Large Language Model based Bundle Generation?

24 April 2025
Kaidong Feng
Zhu Sun
Jie Yang
Hui Fang
Xinghua Qu
W. Liu
ArXiv · PDF · HTML

Papers citing "Does Knowledge Distillation Matter for Large Language Model based Bundle Generation?"

No citing papers found.