Sensor-Invariant Tactile Representation

27 February 2025
Harsh Gupta, Yuchen Mo, Shengmiao Jin, Wenzhen Yuan
Abstract

High-resolution tactile sensors have become critical for embodied perception and robotic manipulation. However, a key challenge in the field is the lack of transferability between sensors due to design and manufacturing variations, which result in significant differences in tactile signals. This limitation hinders the ability to transfer models or knowledge learned from one sensor to another. To address this, we introduce a novel method for extracting Sensor-Invariant Tactile Representations (SITR), enabling zero-shot transfer across optical tactile sensors. Our approach utilizes a transformer-based architecture trained on a diverse dataset of simulated sensor designs, allowing it to generalize to new sensors in the real world with minimal calibration. Experimental results demonstrate the method's effectiveness across various tactile sensing applications, facilitating data and model transferability for future advancements in the field.
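
The abstract describes a transformer-based encoder that conditions on a small set of per-sensor calibration images so that its output representation transfers across optical tactile sensors. The PyTorch sketch below illustrates that general idea only: the class names (SITREncoder, PatchEmbed), the token layout, and all hyperparameters are illustrative assumptions, not the authors' released architecture.

import torch
import torch.nn as nn

class PatchEmbed(nn.Module):
    # Split an image into non-overlapping patches, project each to d_model.
    def __init__(self, patch=16, in_ch=3, d_model=256):
        super().__init__()
        self.proj = nn.Conv2d(in_ch, d_model, kernel_size=patch, stride=patch)

    def forward(self, x):                        # x: (B, C, H, W)
        x = self.proj(x)                         # (B, D, H/p, W/p)
        return x.flatten(2).transpose(1, 2)      # (B, N, D)

class SITREncoder(nn.Module):
    # Hypothetical encoder: a transformer attends jointly over patch tokens
    # from the tactile signal and from a few calibration images, so the
    # output embedding can factor out sensor-specific appearance.
    def __init__(self, d_model=256, n_heads=8, n_layers=6):
        super().__init__()
        self.embed = PatchEmbed(d_model=d_model)
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))
        # Learned positional embedding; assumes 224x224 inputs, 16px patches.
        self.pos = nn.Parameter(torch.zeros(1, 196, d_model))
        # Learned type embeddings mark signal tokens vs. calibration tokens.
        self.type_embed = nn.Parameter(torch.zeros(2, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, tactile, calib):
        # tactile: (B, 3, H, W); calib: (B, K, 3, H, W) per-sensor references
        B = tactile.shape[0]
        sig = self.embed(tactile) + self.pos + self.type_embed[0]
        cal = self.embed(calib.flatten(0, 1)) + self.pos
        cal = cal.reshape(B, -1, sig.shape[-1]) + self.type_embed[1]
        tokens = torch.cat([self.cls.expand(B, -1, -1), sig, cal], dim=1)
        return self.encoder(tokens)[:, 0]        # sensor-invariant embedding

enc = SITREncoder()
emb = enc(torch.randn(2, 3, 224, 224), torch.randn(2, 4, 3, 224, 224))
print(emb.shape)                                 # torch.Size([2, 256])

Training such an encoder across many simulated sensor designs, as the abstract describes, is what would push the embedding to ignore per-sensor appearance; the calibration tokens are what let it adapt to an unseen sensor with minimal calibration.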

View on arXiv: https://arxiv.org/abs/2502.19638
@article{gupta2025_2502.19638,
  title={Sensor-Invariant Tactile Representation},
  author={Harsh Gupta and Yuchen Mo and Shengmiao Jin and Wenzhen Yuan},
  journal={arXiv preprint arXiv:2502.19638},
  year={2025}
}