Point Cloud-based Grasping for Soft Hand Exoskeleton

4 April 2025
Chen Hu
Enrica Tricomi
Eojin Rho
Daekyum Kim
Lorenzo Masia
Shan Luo
Letizia Gionfrida
Abstract

Grasping is a fundamental skill for interacting with and manipulating objects in the environment. However, this ability can be challenging for individuals with hand impairments. Soft hand exoskeletons designed to assist grasping can enhance or restore essential hand functions, yet controlling these soft exoskeletons to support users effectively remains difficult due to the complexity of understanding the environment. This study presents a vision-based predictive control framework that leverages contextual awareness from depth perception to predict the grasping target and determine the next control state for activation. Unlike data-driven approaches that require extensive labelled datasets and struggle with generalizability, our method is grounded in geometric modelling, enabling robust adaptation across diverse grasping scenarios. The Grasping Ability Score (GAS) was used to evaluate performance, with our system achieving a state-of-the-art GAS of 91% across 15 objects and healthy participants, demonstrating its effectiveness across different object types. The proposed approach maintained reconstruction success for unseen objects, underscoring its enhanced generalizability compared to learning-based models.
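
The abstract describes the control pipeline only at a high level, so the following is a minimal, hypothetical sketch of what a purely geometric (non-learned) grasp-target check on a depth-derived point cloud might look like. The function name, frame convention, thresholds, and the width-based open/close rule are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def predict_grasp_state(points, table_height=0.0, close_threshold=0.09):
    """Toy geometric grasp-target estimate on an (N, 3) point cloud.

    Assumes the cloud is expressed in a frame whose z-axis points up from
    the support surface; every name and threshold here is an illustrative
    assumption, not the authors' pipeline.
    """
    # 1. Remove the support plane: keep points clearly above the table.
    above = points[points[:, 2] > table_height + 0.01]
    if above.shape[0] < 50:
        return "open", None  # nothing graspable in view

    # 2. Treat the points nearest the camera axis as the grasping target
    #    (a crude stand-in for the paper's target-prediction step).
    dists = np.linalg.norm(above[:, :2], axis=1)
    target = above[dists <= np.percentile(dists, 30)]

    # 3. Estimate the object's graspable width from the target's
    #    horizontal extent.
    extent = target[:, :2].max(axis=0) - target[:, :2].min(axis=0)
    width = float(extent.min())

    # 4. Map the geometric estimate to an exoskeleton control state:
    #    command closure only if the object fits inside the hand aperture.
    state = "close" if width < close_threshold else "open"
    return state, width

# Example: a roughly 6 cm-wide blob of points sitting above the table
# should trigger the "close" state.
rng = np.random.default_rng(0)
cloud = rng.uniform([-0.03, -0.03, 0.05], [0.03, 0.03, 0.12], size=(500, 3))
print(predict_grasp_state(cloud))  # -> ('close', ~0.06)
```

Because a decision rule of this kind is purely geometric, it needs no labelled training data, which is the property the abstract credits for the method's generalizability to unseen objects.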

@article{hu2025_2504.03369,
  title={Point Cloud-based Grasping for Soft Hand Exoskeleton},
  author={Chen Hu and Enrica Tricomi and Eojin Rho and Daekyum Kim and Lorenzo Masia and Shan Luo and Letizia Gionfrida},
  journal={arXiv preprint arXiv:2504.03369},
  year={2025}
}