
Evolutionary Contrastive Distillation for Language Model Alignment

Conference on Empirical Methods in Natural Language Processing (EMNLP), 2024
10 October 2024
Julian Katz-Samuels, Zheng Li, Hyokun Yun, Priyanka Nigam, Yi Xu, Vaclav Petricek, Bing Yin, Trishul Chilimbi
Topics: ALM, SyDa
Links: arXiv (abs), PDF, HTML, GitHub (131,519★)

Papers citing "Evolutionary Contrastive Distillation for Language Model Alignment"

1 / 1 papers shown
WizardCoder: Empowering Code Large Language Models with Evol-Instruct
International Conference on Learning Representations (ICLR), 2023
Ziyang Luo, Can Xu, Lu Wang, Qingfeng Sun, Xiubo Geng, Wenxiang Hu, Chongyang Tao, Jing Ma, Qingwei Lin, Daxin Jiang
Topics: ELM, SyDa, ALM
14 Jun 2023