OpenTouch: Bringing Full-Hand Touch to Real-World Interaction

18 December 2025
Yuxin Ray Song
Jinzhou Li
Rao Fu
Devin Murphy
Kaichen Zhou
Rishi Shiv
Yaqi Li
Haoyu Xiong
Crystal Elaine Owens
Yilun Du
Yiyue Luo
Xianyi Cheng
Antonio Torralba
Wojciech Matusik
Paul Pu Liang
arXiv: 2512.16842 (abs / PDF / HTML)
Main: 8 pages · 11 figures · 7 tables · Bibliography: 4 pages · Appendix: 11 pages
Abstract

The human hand is our primary interface to the physical world, yet egocentric perception rarely knows when, where, or how forcefully it makes contact. Robust wearable tactile sensors are scarce, and no existing in-the-wild datasets align first-person video with full-hand touch. To bridge the gap between visual perception and physical interaction, we present OpenTouch, the first in-the-wild egocentric full-hand tactile dataset, containing 5.1 hours of synchronized video-touch-pose data and 2,900 curated clips with detailed text annotations. Using OpenTouch, we introduce retrieval and classification benchmarks that probe how touch grounds perception and action. We show that tactile signals provide a compact yet powerful cue for grasp understanding, strengthen cross-modal alignment, and can be reliably retrieved from in-the-wild video queries. By releasing this annotated vision-touch-pose dataset and benchmark, we aim to advance multimodal egocentric perception, embodied learning, and contact-rich robotic manipulation.
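The cross-modal retrieval benchmark described above can be pictured as nearest-neighbor search in a shared embedding space: a video query is encoded, and the closest tactile clips are returned by cosine similarity. The sketch below is a minimal illustration under that assumption, not the authors' implementation; the encoders are stand-ins returning random vectors, and all names, dimensions, and functions are hypothetical.

```python
# Minimal sketch of video -> touch retrieval via cosine similarity in a
# shared embedding space. NOT the OpenTouch implementation: the "encoders"
# below are stand-ins that return random vectors, and all names are
# hypothetical.
import numpy as np

rng = np.random.default_rng(0)
EMB_DIM = 256          # assumed embedding dimensionality
N_TOUCH_CLIPS = 2900   # number of curated clips reported in the abstract

def encode_video(clip_id: int) -> np.ndarray:
    """Stand-in for an egocentric video encoder."""
    return rng.standard_normal(EMB_DIM)

def encode_touch(clip_id: int) -> np.ndarray:
    """Stand-in for a full-hand tactile encoder."""
    return rng.standard_normal(EMB_DIM)

def l2_normalize(x: np.ndarray) -> np.ndarray:
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-8)

# Build a gallery of touch embeddings, one per curated clip.
touch_gallery = l2_normalize(
    np.stack([encode_touch(i) for i in range(N_TOUCH_CLIPS)])
)

def retrieve_touch(video_clip_id: int, k: int = 5) -> np.ndarray:
    """Return indices of the k touch clips most similar to a video query."""
    query = l2_normalize(encode_video(video_clip_id))
    scores = touch_gallery @ query          # cosine similarity of unit vectors
    return np.argsort(-scores)[:k]

print(retrieve_touch(video_clip_id=0))
```

In practice the retrieval benchmark would score such rankings with standard metrics (e.g., recall@k) against ground-truth video-touch pairings; the random embeddings here only demonstrate the search mechanics.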
