This paper introduces RoTipBot, a novel robotic system for handling thin, flexible objects. Unlike previous works, which are limited to singulating such objects using suction cups or soft grippers, RoTipBot can count multiple layers and grasp them simultaneously in a single grasp closure. Specifically, we first develop a vision-based tactile sensor named RoTip that can rotate and sense contact information around its tip. Equipped with two RoTip sensors, RoTipBot rolls and feeds multiple layers of thin, flexible objects into the centre between its fingers, enabling effective grasping. Moreover, we design a tactile-based grasping strategy that uses RoTip's sensing ability to ensure both fingers maintain secure contact with the object while accurately counting the number of fed layers. Extensive experiments demonstrate the efficacy of the RoTip sensor and the RoTipBot approach. The results show that RoTipBot not only achieves a higher success rate but also grasps and counts multiple layers simultaneously -- capabilities not possible with previous methods. Furthermore, RoTipBot operates up to three times faster than state-of-the-art methods. The success of RoTipBot paves the way for future research in object manipulation using mobilised tactile sensors. All the materials used in this paper are available at this https URL.
@article{jiang2025_2406.09332,
  title={RoTipBot: Robotic Handling of Thin and Flexible Objects using Rotatable Tactile Sensors},
  author={Jiaqi Jiang and Xuyang Zhang and Daniel Fernandes Gomes and Thanh-Toan Do and Shan Luo},
  journal={arXiv preprint arXiv:2406.09332},
  year={2025}
}