Sim2Real Transfer for Vision-Based Grasp Verification

5 May 2025
Pau Amargant
Peter Hönig
Markus Vincze
Abstract

The verification of successful grasps is a crucial aspect of robot manipulation, particularly when handling deformable objects. Traditional methods relying on force and tactile sensors often struggle with deformable and non-rigid objects. In this work, we present a vision-based approach to grasp verification that determines whether the robotic gripper has successfully grasped an object. Our method employs a two-stage architecture: first, a YOLO-based object detection model detects and localizes the robot's gripper, and then a ResNet-based classifier determines the presence of an object. To address the limitations of real-world data capture, we introduce HSR-GraspSynth, a synthetic dataset designed to simulate diverse grasping scenarios. Furthermore, we explore the use of Visual Question Answering capabilities as a zero-shot baseline against which we compare our model. Experimental results demonstrate that our approach achieves high accuracy in real-world environments, with potential for integration into grasping pipelines. Code and datasets are publicly available at this https URL.
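
As a rough illustration of the two-stage design described in the abstract, the sketch below chains an off-the-shelf YOLO detector with a torchvision ResNet classifier. The weight files, the class indexing, and the crop-then-classify flow are assumptions made for illustration; the authors' actual implementation is in their released code.

# A minimal sketch of the two-stage pipeline (assumptions: ultralytics YOLO,
# torchvision ResNet-18, and hypothetical weight files; not the authors' code).
from ultralytics import YOLO
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Stage 1: YOLO-based detector that localizes the robot's gripper.
detector = YOLO("gripper_detector.pt")  # hypothetical fine-tuned weights

# Stage 2: ResNet-based binary classifier (object present vs. empty gripper).
classifier = models.resnet18(weights=None)
classifier.fc = nn.Linear(classifier.fc.in_features, 2)
classifier.load_state_dict(torch.load("grasp_classifier.pt"))  # hypothetical weights
classifier.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def verify_grasp(image_path: str) -> bool:
    """Return True if the classifier sees an object in the detected gripper."""
    image = Image.open(image_path).convert("RGB")
    result = detector(image)[0]
    if len(result.boxes) == 0:
        return False  # gripper not visible in the frame
    # Crop the highest-confidence gripper detection.
    best = result.boxes.conf.argmax()
    x1, y1, x2, y2 = result.boxes.xyxy[best].tolist()
    crop = image.crop((x1, y1, x2, y2))
    with torch.no_grad():
        logits = classifier(preprocess(crop).unsqueeze(0))
    return logits.argmax(dim=1).item() == 1  # assumed convention: class 1 = grasped

In a grasping pipeline, a check like this would run once the gripper closes, so that a failed grasp can trigger a retry before the manipulation task continues.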

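The zero-shot Visual Question Answering baseline mentioned in the abstract can be approximated with an off-the-shelf VQA model; the model choice and the question phrasing below are assumptions, since the abstract does not specify them.

# Hedged sketch of a zero-shot VQA baseline (model and prompt are assumptions).
from transformers import pipeline
from PIL import Image

vqa = pipeline("visual-question-answering",
               model="dandelin/vilt-b32-finetuned-vqa")

def vqa_grasp_check(image_path: str) -> bool:
    """Ask a VQA model whether the gripper is holding an object."""
    image = Image.open(image_path).convert("RGB")
    answers = vqa(image=image, question="Is the robot gripper holding an object?")
    return answers[0]["answer"].strip().lower() == "yes"
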
@article{amargant2025_2505.03046,
  title={Sim2Real Transfer for Vision-Based Grasp Verification},
  author={Pau Amargant and Peter Hönig and Markus Vincze},
  journal={arXiv preprint arXiv:2505.03046},
  year={2025}
}