Personalized Text Generation with Contrastive Activation Steering

Abstract

Personalized text generation aims to infer users' writing style preferences from their historical texts and generate outputs that faithfully reflect these stylistic characteristics. Existing solutions primarily adopt two paradigms: retrieval-augmented generation (RAG) and parameter-efficient fine-tuning (PEFT). While these approaches have advanced the field, they suffer from two critical limitations: (1) the entanglement of content semantics and stylistic patterns in historical texts impedes accurate modeling of user-specific writing preferences; and (2) scalability challenges arising from both the inference latency incurred by RAG's retrieval operations and PEFT's requirement to store parameters for a per-user model. To overcome these limitations, we propose StyleVector, a training-free framework that disentangles and represents personalized writing style as a vector in an LLM's activation space, enabling style-steered generation during inference without requiring costly retrieval or per-user parameter storage. Comprehensive experiments demonstrate that our framework achieves a significant 8% relative improvement in personalized generation while reducing storage requirements by 1700 times compared with PEFT methods.
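The core idea, as the abstract describes it, is to extract a style direction from activations and add it back during inference. The sketch below is a minimal, hypothetical illustration using a contrastive mean-difference construction (style vector = mean activation on a user's texts minus mean activation on style-neutral texts, then added to hidden states with a scaling factor); the paper's exact extraction and steering recipe may differ.

```python
import numpy as np

def style_vector(user_acts: np.ndarray, neutral_acts: np.ndarray) -> np.ndarray:
    """Contrastive style vector: difference of mean hidden activations
    between a user's texts and style-neutral texts.
    Shapes: (n_user, d) and (n_neutral, d) -> (d,)."""
    return user_acts.mean(axis=0) - neutral_acts.mean(axis=0)

def steer(hidden: np.ndarray, v: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    """Shift a hidden state along the style direction at inference time.
    In a real LLM this would be applied inside a forward hook at a chosen layer."""
    return hidden + alpha * v

# Toy activations standing in for a transformer layer's hidden states.
rng = np.random.default_rng(0)
user_acts = rng.normal(loc=0.5, scale=1.0, size=(8, 4))     # user's historical texts
neutral_acts = rng.normal(loc=0.0, scale=1.0, size=(8, 4))  # style-neutral texts

v = style_vector(user_acts, neutral_acts)      # one (d,)-vector per user: cheap to store
steered = steer(np.zeros(4), v, alpha=2.0)     # nudges generation toward the user's style
```

Storing a single d-dimensional vector per user (rather than adapter weights or a retrieval index) is what yields the storage and latency savings the abstract claims.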

@article{zhang2025_2503.05213,
  title={Personalized Text Generation with Contrastive Activation Steering},
  author={Jinghao Zhang and Yuting Liu and Wenjie Wang and Qiang Liu and Shu Wu and Liang Wang and Tat-Seng Chua},
  journal={arXiv preprint arXiv:2503.05213},
  year={2025}
}