SensorLLM: Aligning Large Language Models with Motion Sensors for Human Activity Recognition
Zechen Li, Shohreh Deldari, Linyao Chen, Hao Xue, Flora D. Salim
arXiv:2410.10624 · 14 October 2024
Papers citing "SensorLLM: Aligning Large Language Models with Motion Sensors for Human Activity Recognition" (5 papers)
EgoCHARM: Resource-Efficient Hierarchical Activity Recognition using an Egocentric IMU Sensor
Akhil Padmanabha, Saravanan Govindarajan, Hwanmun Kim, Sergio Ortiz, Rahul Rajan, Doruk Senkal, Sneha Kadetotad
24 Apr 2025
RadarLLM: Empowering Large Language Models to Understand Human Motion from Millimeter-wave Point Cloud Sequence
Zengyuan Lai, Jiarui Yang, Songpengcheng Xia, Lizhou Lin, Lan Sun, Renwen Wang, J. Liu, Qi Wu, Ling Pei
14 Apr 2025
COMODO: Cross-Modal Video-to-IMU Distillation for Efficient Egocentric Human Activity Recognition
Baiyu Chen, Wilson Wongso, Zechen Li, Yonchanok Khaokaew, Hao Xue, Flora D. Salim
10 Mar 2025
Mojito: LLM-Aided Motion Instructor with Jitter-Reduced Inertial Tokens
Ziwei Shan, Yaoyu He, Chengfeng Zhao, Jiashen Du, Jingyan Zhang, Qixuan Zhang, Jingyi Yu, Lan Xu
22 Feb 2025
EgoHand: Ego-centric Hand Pose Estimation and Gesture Recognition with Head-mounted Millimeter-wave Radar and IMUs
Yizhe Lv, Tingting Zhang, Yunpeng Song, H. Ding, Jinsong Han, Fei-Yue Wang
23 Jan 2025