ResearchTrend.AI
TetraGrip: Sensor-Driven Multi-Suction Reactive Object Manipulation in Cluttered Scenes

13 March 2025
Paolo Torrado
Joshua Levin
Markus Grotz
Joshua R. Smith
Abstract

Warehouse robotic systems equipped with vacuum grippers must reliably grasp a diverse range of objects from densely packed shelves. However, these environments present significant challenges, including occlusions, diverse object orientations, stacked and obstructed items, and surfaces that are difficult to suction. We introduce TetraGrip, a novel vacuum-based grasping strategy featuring four suction cups mounted on linear actuators. Each actuator is equipped with an optical time-of-flight (ToF) proximity sensor, enabling reactive grasping. We evaluate TetraGrip in a warehouse-style setting, demonstrating its ability to manipulate objects in stacked and obstructed configurations. Our results show that our RL-based policy improves picking success in stacked-object scenarios by 22.86% compared to a single-suction gripper. Additionally, we demonstrate that TetraGrip can successfully grasp objects in scenarios where a single-suction gripper fails due to physical limitations, specifically in two cases: (1) picking an object occluded by another object and (2) retrieving an object in a complex scenario. These findings highlight the advantages of multi-actuated, suction-based grasping in unstructured warehouse environments. The project website is available at: this https URL.
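To illustrate the kind of reactive behavior the abstract describes, here is a minimal sketch of per-cup suction selection driven by four ToF proximity readings. The sensor threshold, actuator stroke, and function names are illustrative assumptions, not the authors' implementation or the policy learned by RL.

```python
# Hypothetical sketch of TetraGrip-style reactive cup selection.
# CONTACT_MM and MAX_STROKE_MM are assumed values, not from the paper.

CONTACT_MM = 5.0       # assumed ToF distance at which a cup can seal
MAX_STROKE_MM = 100.0  # assumed linear-actuator travel

def select_cups(tof_readings_mm):
    """Return indices of suction cups whose ToF sensor reports a
    surface reachable within the actuator's stroke."""
    engaged = []
    for i, dist in enumerate(tof_readings_mm):
        # A cup is usable if extending its actuator can bring it
        # within sealing distance of the detected surface.
        if dist <= MAX_STROKE_MM + CONTACT_MM:
            engaged.append(i)
    return engaged

def stroke_commands(tof_readings_mm):
    """Per-cup extension commands (mm): extend toward the sensed
    surface, clamped to the stroke; unreachable cups stay retracted."""
    cmds = []
    for dist in tof_readings_mm:
        if dist <= MAX_STROKE_MM + CONTACT_MM:
            cmds.append(min(max(dist - CONTACT_MM, 0.0), MAX_STROKE_MM))
        else:
            cmds.append(0.0)
    return cmds

# Example: an occluded scene where cup 2 sees no nearby surface.
readings = [42.0, 38.5, 250.0, 55.0]
print(select_cups(readings))     # -> [0, 1, 3]
print(stroke_commands(readings)) # -> [37.0, 33.5, 0.0, 50.0]
```

Engaging only the cups that can actually seal is what lets a multi-suction gripper pick partially occluded items where a single centered cup has no valid contact point.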

@article{torrado2025_2503.08978,
  title={TetraGrip: Sensor-Driven Multi-Suction Reactive Object Manipulation in Cluttered Scenes},
  author={Paolo Torrado and Joshua Levin and Markus Grotz and Joshua Smith},
  journal={arXiv preprint arXiv:2503.08978},
  year={2025}
}