
A Comparative Study of Human Activity Recognition: Motion, Tactile, and Multi-Modal Approaches

Abstract

Human activity recognition (HAR) is essential for effective Human-Robot Collaboration (HRC), enabling robots to interpret and respond to human actions. This study evaluates the ability of a vision-based tactile sensor to classify 15 activities and compares its performance to that of an IMU-based data glove. Additionally, we propose a multi-modal framework combining tactile and motion data to leverage their complementary strengths. We examine three approaches: motion-based classification (MBC) using IMU data, tactile-based classification (TBC) with single or dual video streams, and multi-modal classification (MMC) integrating both. Offline validation on segmented datasets assessed each configuration's accuracy under controlled conditions, while online validation on continuous action sequences evaluated recognition during execution. Results show that the multi-modal approach consistently outperformed single-modality methods, highlighting the potential of integrating tactile and motion sensing to enhance HAR systems for collaborative robotics.
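The abstract does not detail the network architecture, so the following is only a minimal sketch of one plausible way to combine the two modalities: a late-fusion classifier that concatenates a motion feature from the IMU sequence with a tactile feature from the sensor's video stream before predicting one of the 15 activities. All module names, feature sizes, and input shapes below are illustrative assumptions, not the authors' implementation.

# Hypothetical late-fusion sketch (PyTorch); shapes and layer sizes are assumptions.
import torch
import torch.nn as nn

class IMUEncoder(nn.Module):
    """Encodes an IMU sequence of shape (B, T, C_imu) into a feature vector with a GRU."""
    def __init__(self, imu_channels=6, hidden=64):
        super().__init__()
        self.gru = nn.GRU(imu_channels, hidden, batch_first=True)

    def forward(self, x):
        _, h = self.gru(x)           # h: (1, B, hidden), last hidden state
        return h.squeeze(0)          # (B, hidden)

class TactileEncoder(nn.Module):
    """Encodes a tactile video clip (B, T, 3, H, W) by averaging per-frame CNN features."""
    def __init__(self, feat=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat),
        )

    def forward(self, x):
        b, t = x.shape[:2]
        f = self.cnn(x.flatten(0, 1))        # (B*T, feat) per-frame features
        return f.view(b, t, -1).mean(dim=1)  # (B, feat) temporal average

class MultiModalClassifier(nn.Module):
    """Concatenates motion and tactile features and classifies 15 activities."""
    def __init__(self, n_classes=15):
        super().__init__()
        self.imu_enc = IMUEncoder()
        self.tac_enc = TactileEncoder()
        self.head = nn.Linear(64 + 64, n_classes)

    def forward(self, imu_seq, tactile_clip):
        fused = torch.cat([self.imu_enc(imu_seq), self.tac_enc(tactile_clip)], dim=-1)
        return self.head(fused)

# Dummy usage: 2 samples, 50 IMU steps with 6 channels, 8 tactile frames of 64x64.
model = MultiModalClassifier()
logits = model(torch.randn(2, 50, 6), torch.randn(2, 8, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 15])

A single-modality baseline (MBC or TBC) would use just one of the two encoders with its own classification head; the same concatenation step could also accommodate a second tactile video stream for the dual-stream TBC configuration.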

@article{belcamino2025_2505.08657,
  title={A Comparative Study of Human Activity Recognition: Motion, Tactile, and Multi-Modal Approaches},
  author={Valerio Belcamino and Nhat Minh Dinh Le and Quan Khanh Luu and Alessandro Carfì and Van Anh Ho and Fulvio Mastrogiovanni},
  journal={arXiv preprint arXiv:2505.08657},
  year={2025}
}