arXiv:2412.15280
Context-DPO: Aligning Language Models for Context-Faithfulness

18 December 2024
Baolong Bi
Shaohan Huang
Y. Wang
Tianchi Yang
Zihan Zhang
Haizhen Huang
Lingrui Mei
Junfeng Fang
Z. Li
Furu Wei
Weiwei Deng
Feng Sun
Qi Zhang
Shenghua Liu

Papers citing "Context-DPO: Aligning Language Models for Context-Faithfulness"

ConSens: Assessing context grounding in open-book question answering
Ivan Vankov
Matyo Ivanov
Adriana Correia
Victor Botev
30 Apr 2025