Bring Your Own Grasp Generator: Leveraging Robot Grasp Generation for Prosthetic Grasping
One of the most important research challenges in upper-limb prosthetics is enhancing user-prosthesis communication so that it closely resembles the experience of a natural limb. As prosthetic devices become more complex, users often struggle to control the additional degrees of freedom (DoFs). In this context, leveraging shared-autonomy principles can significantly improve the usability of these systems. In this paper, we present a novel eye-in-hand prosthetic grasping system that follows these principles. Our system initiates the approach-to-grasp action based on the user's command and automatically configures the DoFs of the prosthetic hand. First, it reconstructs the 3D geometry of the target object without the need for a depth camera. Then, it tracks the hand motion during the approach-to-grasp action and finally selects a candidate grasp configuration according to the user's intentions. We deploy our system on the Hannes prosthetic hand and test it on able-bodied subjects and amputees to validate its effectiveness. We compare it with a multi-DoF prosthetic control baseline and find that our method enables faster grasps while simplifying the user experience. Code and demo videos are available online at this https URL.
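As a rough illustration of the final step described above (selecting a grasp according to the user's intention inferred from the tracked approach motion), here is a minimal Python sketch. The function names, the grasp-pose axis convention, and the cosine-similarity heuristic are assumptions made for illustration only; they are not the paper's actual method or API.

# Minimal sketch (not the authors' code): pick, among candidate grasp
# poses, the one whose approach axis best aligns with the hand's
# tracked approach direction. All names and conventions are hypothetical.

import numpy as np


def approach_direction(hand_positions: np.ndarray) -> np.ndarray:
    """Unit vector of the hand's net motion over the approach.

    hand_positions: T x 3 array of tracked hand positions.
    """
    motion = hand_positions[-1] - hand_positions[0]
    return motion / np.linalg.norm(motion)


def select_grasp(candidate_grasps: np.ndarray,
                 hand_positions: np.ndarray) -> int:
    """Return the index of the candidate 4x4 grasp pose whose approach
    axis (assumed here to be the negative z-column of the rotation)
    is most aligned with the user's approach direction."""
    user_dir = approach_direction(hand_positions)
    # Axis convention is an assumption for this sketch.
    approach_axes = -candidate_grasps[:, :3, 2]
    scores = approach_axes @ user_dir  # cosine similarity per candidate
    return int(np.argmax(scores))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three candidate grasp poses with random orientations.
    grasps = np.stack([np.eye(4) for _ in range(3)])
    for g in grasps:
        q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
        g[:3, :3] = q
    # A straight-line approach trajectory toward the object.
    traj = np.linspace([0.3, 0.0, 0.4], [0.05, 0.0, 0.1], num=20)
    print("selected grasp:", select_grasp(grasps, traj))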
@article{stracquadanio2025_2503.00466,
  title={Bring Your Own Grasp Generator: Leveraging Robot Grasp Generation for Prosthetic Grasping},
  author={Giuseppe Stracquadanio and Federico Vasile and Elisa Maiettini and Nicolò Boccardo and Lorenzo Natale},
  journal={arXiv preprint arXiv:2503.00466},
  year={2025}
}