VibeCheck: Using Active Acoustic Tactile Sensing for Contact-Rich Manipulation

22 April 2025
Kaidi Zhang, Do-Gon Kim, Eric T. Chang, Hua-Hsuan Liang, Zhanpeng He, Kathryn Lampo, Philippe Wu, Ioannis Kymissis, Matei Ciocarlie
Abstract

The acoustic response of an object can reveal a great deal about its global state, for example, its material properties or the extrinsic contacts it is making with the world. In this work, we build an active acoustic sensing gripper equipped with two piezoelectric fingers: one for generating signals, the other for receiving them. By sending an acoustic vibration from one finger to the other through an object, we gain insight into the object's acoustic properties and contact state. We use this system to classify objects, estimate grasping position, estimate the poses of internal structures, and classify the types of extrinsic contacts an object is making with the environment. Using our contact type classification model, we tackle a standard long-horizon manipulation problem: peg insertion. We use a simple simulated transition model based on the performance of our sensor to train an imitation learning policy that is robust to imperfect predictions from the classifier. Finally, we demonstrate the policy on a UR5 robot with active acoustic sensing as the only feedback.
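To make the training recipe described above concrete, below is a minimal sketch (not the authors' code) of the general idea: roll out a toy peg-insertion transition model, corrupt the true contact type through an assumed classifier confusion matrix, and behavior-clone a scripted expert so that the learned policy conditions only on the noisy labels. The state names, actions, confusion matrix, and expert here are all illustrative assumptions.

```python
# Minimal sketch (not the paper's code): behavior cloning on a toy peg-insertion
# task where the policy only sees a noisy contact-type label, mimicking an
# imperfect acoustic contact classifier. State names, actions, the confusion
# matrix, and the scripted expert are illustrative assumptions.
import numpy as np
from collections import Counter, defaultdict

rng = np.random.default_rng(0)

STATES = ["free_space", "surface_contact", "hole_edge", "inserted"]  # assumed contact types
EXPERT = {"free_space": "move_down", "surface_contact": "slide",
          "hole_edge": "insert", "inserted": "stop"}                 # scripted expert

# Assumed classifier confusion matrix: P(observed label | true contact type).
CONFUSION = np.array([
    [0.85, 0.10, 0.05, 0.00],
    [0.10, 0.75, 0.10, 0.05],
    [0.05, 0.15, 0.70, 0.10],
    [0.00, 0.05, 0.10, 0.85],
])

def step(true_idx: int, action: str) -> int:
    """Toy transition model: the correct action advances toward insertion."""
    if action == EXPERT[STATES[true_idx]] and true_idx < 3:
        return true_idx + 1
    return true_idx  # a wrong action leaves the contact state unchanged

def observe(true_idx: int) -> int:
    """Sample a (possibly wrong) classifier label from the confusion matrix."""
    return int(rng.choice(4, p=CONFUSION[true_idx]))

# Collect (recent noisy labels, expert action) pairs from simulated rollouts.
demos = []
for _ in range(2000):
    s, history = 0, []
    for _ in range(12):
        history.append(observe(s))
        a = EXPERT[STATES[s]]                       # expert acts on the true state
        demos.append((tuple(history[-3:]), a))      # policy input: last 3 labels only
        s = step(s, a)

# Tabular behavior cloning: majority expert action per observation history.
counts = defaultdict(Counter)
for key, a in demos:
    counts[key][a] += 1
policy = {k: c.most_common(1)[0][0] for k, c in counts.items()}

# Roll out the learned policy with noisy labels as the only feedback.
successes = 0
for _ in range(500):
    s, history = 0, []
    for _ in range(12):
        history.append(observe(s))
        a = policy.get(tuple(history[-3:]), "move_down")
        s = step(s, a)
    successes += (STATES[s] == "inserted")
print(f"Insertion success rate with noisy contact labels: {successes / 500:.2f}")
```

The design choice mirrored here is that classifier noise is injected during data collection, so the cloned policy learns to act sensibly even when individual contact-type predictions are wrong.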

View on arXiv: https://arxiv.org/abs/2504.15535
@article{zhang2025_2504.15535,
  title={VibeCheck: Using Active Acoustic Tactile Sensing for Contact-Rich Manipulation},
  author={Kaidi Zhang and Do-Gon Kim and Eric T. Chang and Hua-Hsuan Liang and Zhanpeng He and Kathryn Lampo and Philippe Wu and Ioannis Kymissis and Matei Ciocarlie},
  journal={arXiv preprint arXiv:2504.15535},
  year={2025}
}