arXiv: 2507.22171
Enhancing Jailbreak Attacks on LLMs via Persona Prompts

28 July 2025
Zheng Zhang
Peilin Zhao
Deheng Ye
Hao Wang
    AAML
Links: arXiv (abs) · PDF · HTML · GitHub (5★)

Papers citing "Enhancing Jailbreak Attacks on LLMs via Persona Prompts"

No citing papers found.