Learning Object Compliance via Young's Modulus from Single Grasps using Camera-Based Tactile Sensors

Compliance is a useful parametrization of tactile information that humans often utilize in manipulation tasks. It can be used to inform low-level contact-rich actions or to characterize objects at a high level. In robotic manipulation, existing approaches to estimating compliance have struggled to generalize across both object shape and material. Using camera-based tactile sensors, proprioception, and force measurements, we present a novel approach to estimating object compliance as Young's modulus (E) from parallel grasps. We evaluate our method on a novel dataset of 285 common objects, spanning a wide array of shapes and materials with Young's moduli ranging from 5.0 kPa to 250 GPa. Combining analytical and data-driven approaches, we develop a hybrid system using a multi-tower neural network to analyze a sequence of tactile images from grasping. This system estimates the Young's modulus of unseen objects to within an order of magnitude at 74.2% accuracy across our dataset, an improvement over purely analytical and purely data-driven baselines, which achieve 28.9% and 65.0% accuracy respectively. Importantly, this estimation system performs consistently irrespective of object geometry and demonstrates increased robustness across material types.
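To make the analytical side of such a hybrid system concrete, a common closed-form model for relating grasp force and indentation to stiffness is Hertzian contact. The sketch below is an illustration of that general idea, not the paper's actual baseline: the function names, the spherical-contact assumption, the fixed Poisson's ratio, and the rigid-sensor simplification are all assumptions introduced here.

```python
import math

def hertz_force(E_star, R, d):
    # Hertzian contact for a sphere of radius R (m) pressed to depth d (m):
    # F = (4/3) * E* * sqrt(R) * d^(3/2), with E* the effective contact modulus (Pa).
    return (4.0 / 3.0) * E_star * math.sqrt(R) * d ** 1.5

def estimate_modulus(F, R, d, nu=0.45):
    # Invert the Hertz model to recover the effective modulus from a measured
    # grasp force F (N), contact radius R (m), and indentation depth d (m).
    E_star = 3.0 * F / (4.0 * math.sqrt(R) * d ** 1.5)
    # Assuming the sensor is rigid relative to the object, 1/E* = (1 - nu^2)/E,
    # so the object's Young's modulus is:
    return (1.0 - nu ** 2) * E_star
```

A quick self-consistency check: generating a force with `hertz_force` for a known modulus and inverting it with `estimate_modulus` recovers the original value, which is the basic sanity test any such analytical estimator should pass.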
@article{burgess2025_2406.15304,
  title={Learning Object Compliance via Young's Modulus from Single Grasps using Camera-Based Tactile Sensors},
  author={Michael Burgess and Jialiang Zhao and Laurence Willemet},
  journal={arXiv preprint arXiv:2406.15304},
  year={2025}
}