Boosting LLM Translation Skills without General Ability Loss via Rationale Distillation

17 October 2024
Junhong Wu, Yang Zhao, Yangyifan Xu, Bing Liu, Chengqing Zong

Papers citing "Boosting LLM Translation Skills without General Ability Loss via Rationale Distillation"

Enhancing Knowledge Distillation of Large Language Models through Efficient Multi-Modal Distribution Alignment
Tianyu Peng, Jiajun Zhang
19 Sep 2024