ResearchTrend.AI

Learning Multi-Modal Whole-Body Control for Real-World Humanoid Robots

30 July 2024
Pranay Dugar
Aayam Shrestha
Fangzhou Yu
Bart Jaap van Marum
Alan Fern
Abstract

The foundational capabilities of humanoid robots should include robust standing, walking, and mimicry of whole- and partial-body motions. This work introduces the Masked Humanoid Controller (MHC), which supports all of these capabilities by tracking target trajectories over selected subsets of humanoid state variables while ensuring balance and robustness against disturbances. The MHC is trained in simulation using a carefully designed curriculum that imitates partially masked motions from a library of behaviors spanning standing, walking, optimized reference trajectories, re-targeted video clips, and human motion capture data. It also allows joystick-based control to be combined with partial-body motion mimicry. We showcase simulation experiments validating the MHC's ability to execute a wide variety of behaviors from partially specified target motions. Moreover, we demonstrate sim-to-real transfer on the real-world Digit V3 humanoid robot. To our knowledge, this is the first instance of a learned controller that can realize whole-body control of a real-world humanoid for such diverse multi-modal targets.
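The core idea of tracking targets over a selected subset of state variables can be illustrated with a small sketch. The function below is a hypothetical masked tracking reward, not the paper's actual training objective: it penalizes deviation from the target only on unmasked dimensions, leaving masked dimensions (e.g. the legs during upper-body mimicry) free for the controller to use for balance. The function name, `scale` parameter, and exponential reward shape are assumptions for illustration.

```python
import numpy as np

def masked_tracking_reward(state, target, mask, scale=5.0):
    """Reward for tracking only the unmasked subset of state variables.

    state, target: arrays of humanoid state variables (e.g. joint angles).
    mask: boolean array; True marks dimensions that must track the target,
          False marks dimensions left unconstrained.
    """
    err = np.where(mask, state - target, 0.0)      # ignore masked dimensions
    mse = np.sum(err ** 2) / max(mask.sum(), 1)    # mean squared error over tracked dims
    return float(np.exp(-scale * mse))             # reward in (0, 1], 1.0 = perfect tracking

# Example: mimic only the first 3 of 6 state variables; the rest are free.
s = np.array([0.1, 0.2, 0.3, 9.0, 9.0, 9.0])
t = np.array([0.1, 0.2, 0.3, 0.0, 0.0, 0.0])
m = np.array([True, True, True, False, False, False])
r = masked_tracking_reward(s, t, m)  # large mismatch on masked dims is not penalized
```

Varying the mask per time step is what lets a single controller serve multi-modal targets: a fully unmasked trajectory specifies whole-body mimicry, while masking the lower body reduces the task to upper-body imitation plus learned balance.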

@article{dugar2025_2408.07295,
  title={Learning Multi-Modal Whole-Body Control for Real-World Humanoid Robots},
  author={Pranay Dugar and Aayam Shrestha and Fangzhou Yu and Bart van Marum and Alan Fern},
  journal={arXiv preprint arXiv:2408.07295},
  year={2025}
}