RHanDS: Refining Malformed Hands for Generated Images with Decoupled Structure and Style Guidance

Although diffusion models can generate high-quality human images, their applications are limited by their instability in generating structurally correct hands. In this paper, we introduce RHanDS, a conditional diffusion-based framework designed to refine malformed hands using decoupled structure and style guidance. The hand mesh reconstructed from the malformed hand offers structure guidance for correcting the hand's structure, while the malformed hand itself provides style guidance for preserving the hand's style. To alleviate the mutual interference between style and structure guidance, we introduce a two-stage training strategy and build a series of multi-style hand datasets. In the first stage, we train on paired hand images to ensure stylistic consistency in hand refining. In the second stage, we train on hand images generated from human meshes, enabling the model to gain control over hand structure. Experimental results demonstrate that RHanDS can effectively refine hand structure while preserving consistency in hand style.
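The decoupled-guidance idea in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the embedding names, the weighted combination, and the stage-dependent data selection are all illustrative assumptions about how separate structure and style signals and a two-stage curriculum might be wired together.

```python
# Illustrative sketch only (not the RHanDS implementation): two guidance
# signals are kept as separate embeddings and combined with independent
# weights, so structure correction and style preservation can be balanced.
from dataclasses import dataclass


@dataclass
class Guidance:
    structure: list[float]  # hypothetical embedding from the reconstructed hand mesh
    style: list[float]      # hypothetical embedding from the malformed hand crop


def combine_guidance(g: Guidance,
                     w_structure: float = 1.0,
                     w_style: float = 1.0) -> list[float]:
    """Element-wise weighted sum of the two guidance embeddings."""
    return [w_structure * a + w_style * b
            for a, b in zip(g.structure, g.style)]


def training_batch(stage: int, paired_hands, mesh_rendered_hands):
    """Select data per the two-stage strategy described in the abstract:
    stage 1 uses paired hand images (style consistency),
    stage 2 uses hand images generated from human meshes (structure control)."""
    return paired_hands if stage == 1 else mesh_rendered_hands
```

For example, `combine_guidance(Guidance([1.0, 0.0], [0.0, 1.0]), 1.0, 2.0)` returns `[1.0, 2.0]`, showing that the two signals contribute independently.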
@article{wang2025_2404.13984,
  title={RHanDS: Refining Malformed Hands for Generated Images with Decoupled Structure and Style Guidance},
  author={Chengrui Wang and Pengfei Liu and Min Zhou and Ming Zeng and Xubin Li and Tiezheng Ge and Bo Zheng},
  journal={arXiv preprint arXiv:2404.13984},
  year={2025}
}