DyTact: Capturing Dynamic Contacts in Hand-Object Manipulation

Reconstructing dynamic hand-object contacts is essential for realistic manipulation in AI character animation, XR, and robotics, yet it remains challenging due to heavy occlusions, complex surface details, and limitations in existing capture techniques. In this paper, we introduce DyTact, a markerless, non-intrusive method for accurately capturing dynamic contacts in hand-object manipulation. Our approach leverages a dynamic, articulated representation based on 2D Gaussian surfels to model complex manipulations. By binding these surfels to MANO meshes, DyTact harnesses the inductive bias of template models to stabilize and accelerate optimization. A refinement module addresses time-dependent high-frequency deformations, while a contact-guided adaptive sampling strategy selectively increases surfel density in contact regions to handle heavy occlusion. Extensive experiments demonstrate that DyTact not only achieves state-of-the-art dynamic contact estimation accuracy but also significantly improves novel view synthesis quality, all while operating with fast optimization and efficient memory usage. Project Page: this https URL.
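To make the contact-guided adaptive sampling idea concrete, here is a minimal sketch of one plausible realization: surfels whose distance to the object surface falls below a contact threshold are duplicated with a small positional jitter, increasing density exactly where occlusion is heaviest. All names, thresholds, and the duplication scheme below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def adaptive_densify(surfels, contact_dist, threshold=0.005, factor=2):
    """Duplicate surfels lying within `threshold` (meters) of the object
    surface, jittering the copies so densification fills contact regions.
    Hypothetical sketch; the paper's actual criterion may differ."""
    mask = contact_dist < threshold                 # surfels in (near-)contact
    dense = np.repeat(surfels[mask], factor - 1, axis=0)
    dense = dense + np.random.normal(scale=1e-3, size=dense.shape)  # jitter
    return np.concatenate([surfels, dense], axis=0)

# toy example: 4 surfel centers, two of them within 5 mm of the object
surfels = np.zeros((4, 3))
contact_dist = np.array([0.001, 0.02, 0.004, 0.1])
out = adaptive_densify(surfels, contact_dist)
print(out.shape)  # (6, 3): the two contact surfels were duplicated once each
```

In a full pipeline, `contact_dist` would come from a signed-distance query against the object geometry, and the duplicated surfels would inherit and then re-optimize the parent's Gaussian parameters.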
@article{cong2025_2506.03103,
  title={DyTact: Capturing Dynamic Contacts in Hand-Object Manipulation},
  author={Xiaoyan Cong and Angela Xing and Chandradeep Pokhariya and Rao Fu and Srinath Sridhar},
  journal={arXiv preprint arXiv:2506.03103},
  year={2025}
}