NatSGD: A Dataset with Speech, Gestures, and Demonstrations for Robot Learning in Natural Human-Robot Interaction
arXiv: 2403.02274 · 4 March 2024
Snehesh Shrestha, Yantian Zha, Saketh Banagiri, Ge Gao, Yiannis Aloimonos, Cornelia Fermuller
Papers citing "NatSGD: A Dataset with Speech, Gestures, and Demonstrations for Robot Learning in Natural Human-Robot Interaction" (7 papers)
ReLI: A Language-Agnostic Approach to Human-Robot Interaction
Linus Nwankwo, Bjoern Ellensohn, Ozan Özdenizci, Elmar Rueckert
LM&Ro · 03 May 2025
doScenes: An Autonomous Driving Dataset with Natural Language Instruction for Human Interaction and Vision-Language Navigation
Parthib Roy, Srinivasa Perisetla, Shashank Shriram, Harsha Krishnaswamy, Aryan Keskar, Ross Greer
VGen · 08 Dec 2024
Exploring 3D Human Pose Estimation and Forecasting from the Robot's Perspective: The HARPER Dataset
Andrea Avogaro, Andrea Toaiari, Federico Cunico, Xiangmin Xu, Haralambos Dafas, Alessandro Vinciarelli, Emma Li, Marco Cristani
21 Mar 2024
A Landmark-Aware Visual Navigation Dataset
Faith Johnson, Bryan Bo Cao, Kristin J. Dana, Shubham Jain, Ashwin Ashok
3DV · 22 Feb 2024
TEACh: Task-driven Embodied Agents that Chat
Aishwarya Padmakumar, Jesse Thomason, Ayush Shrivastava, P. Lange, Anjali Narayan-Chen, Spandana Gella, Robinson Piramithu, Gökhan Tür, Dilek Z. Hakkani-Tür
LM&Ro · 01 Oct 2021
Multimodal analysis of the predictability of hand-gesture properties
Taras Kucherenko, Rajmund Nagy, Michael Neff, Hedvig Kjellström, G. Henter
12 Aug 2021
Language-Conditioned Imitation Learning for Robot Manipulation Tasks
Simon Stepputtis, Joseph Campbell, Mariano Phielipp, Stefan Lee, Chitta Baral, H. B. Amor
LM&Ro · 22 Oct 2020