RLTHF: Targeted Human Feedback for LLM Alignment

24 February 2025
Yifei Xu
Tusher Chakraborty
Emre Kıcıman
Bibek Aryal
Eduardo Rodrigues
Srinagesh Sharma
Roberto Estevão
M. A. D. L. Balaguer
Jessica Wolk
Rafael Padilha
Leonardo Nunes
Shobana Balakrishnan
Songwu Lu
Ranveer Chandra
Links: ArXiv · PDF · HTML

Papers citing "RLTHF: Targeted Human Feedback for LLM Alignment"

No citing papers found.