ABC Align: Large Language Model Alignment for Safety & Accuracy
arXiv: 2408.00307
1 August 2024
Gareth Seneque, Lap-Hang Ho, Peter W. Glynn, Yinyu Ye, Jeffrey Molendijk
Papers citing "ABC Align: Large Language Model Alignment for Safety & Accuracy"
Position: We Need Responsible, Application-Driven (RAD) AI Research
Sarah Hartman, Cheng Soon Ong, Julia Powles, Petra Kuhnert
07 May 2025