ResearchTrend.AI
MUSE: A Real-Time Multi-Sensor State Estimator for Quadruped Robots

15 March 2025
Ylenia Nisticò
J. C. V. Soares
Lorenzo Amatucci
Geoff Fink
Claudio Semini
Abstract

This paper introduces MUSE (MUlti-sensor State Estimator), a novel state estimator designed to improve the accuracy and real-time performance of state estimation for quadruped robot navigation. The proposed state estimator builds upon our previous work presented in [1]. It integrates data from a range of onboard sensors, including IMUs, encoders, cameras, and LiDARs, to deliver a comprehensive and reliable estimate of the robot's pose and motion, even in slippery scenarios. We tested MUSE on a Unitree Aliengo robot, successfully closing the locomotion control loop in challenging conditions, including slippery and uneven terrain. Benchmarking against Pronto [2] and VILENS [3] showed reductions in translational error of 67.6% and 26.7%, respectively. In addition, MUSE outperformed DLIO [4], a LiDAR-inertial odometry system, in both rotational error and update frequency, while the proprioceptive version of MUSE (P-MUSE) outperformed TSIF [5], with a 45.9% reduction in absolute trajectory error (ATE).
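The abstract describes fusing proprioceptive predictions (IMU, encoders) with exteroceptive corrections (cameras, LiDAR) into a single pose estimate. The paper's actual filter formulation is not reproduced here; the following is only a generic, scalar-state Kalman-filter sketch of that predict/correct pattern, with all function and variable names being illustrative assumptions.

```python
# Illustrative linear Kalman filter fusing a motion prediction (e.g. an
# IMU-integrated displacement) with an exteroceptive measurement (e.g. a
# LiDAR-odometry position). Scalar state for clarity; this is a generic
# sketch, not MUSE's actual multi-sensor formulation.

def kf_step(x, P, u, z, Q, R):
    """One predict/update cycle.
    x, P : prior state estimate and its variance
    u    : predicted state increment (e.g. integrated IMU motion)
    z    : direct measurement of the state (e.g. odometry position)
    Q, R : process and measurement noise variances
    """
    # Predict: propagate state and grow uncertainty.
    x_pred = x + u
    P_pred = P + Q
    # Update: weigh the measurement by the Kalman gain.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Example: a noisy odometry fix pulls the prediction toward the measurement
# while shrinking the variance.
x, P = kf_step(x=0.0, P=1.0, u=1.0, z=0.9, Q=0.01, R=0.1)
print(round(x, 3), round(P, 3))  # → 0.909 0.091
```

In a full estimator this scalar recursion generalizes to a vector state (position, orientation, velocity) with matrix covariances, and each sensor contributes its own update step at its own rate.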

@article{nisticò2025_2503.12101,
  title={MUSE: A Real-Time Multi-Sensor State Estimator for Quadruped Robots},
  author={Ylenia Nisticò and João Carlos Virgolino Soares and Lorenzo Amatucci and Geoff Fink and Claudio Semini},
  journal={arXiv preprint arXiv:2503.12101},
  year={2025}
}