The Unintended Trade-off of AI Alignment: Balancing Hallucination Mitigation and Safety in LLMs
9 October 2025
Omar Mahmoud
Ali Khalil
B. L. Semage
Thommen George Karimpanal
Santu Rana
    HILM

Papers citing "The Unintended Trade-off of AI Alignment: Balancing Hallucination Mitigation and Safety in LLMs"

No citing papers found.