Emergent Active Perception and Dexterity of Simulated Humanoids from Visual Reinforcement Learning

Abstract

Human behavior is fundamentally shaped by visual perception -- our ability to interact with the world depends on actively gathering relevant information and adapting our movements accordingly. Behaviors like searching for objects, reaching, and hand-eye coordination naturally emerge from the structure of our sensory system. Inspired by these principles, we introduce Perceptive Dexterous Control (PDC), a framework for vision-driven dexterous whole-body control with simulated humanoids. PDC operates solely on egocentric vision for task specification, enabling object search, target placement, and skill selection through visual cues, without relying on privileged state information (e.g., 3D object positions and geometries). This perception-as-interface paradigm enables learning a single policy to perform multiple household tasks, including reaching, grasping, placing, and articulated object manipulation. We also show that training from scratch with reinforcement learning can produce emergent behaviors such as active search. These results demonstrate how vision-driven control and complex tasks induce human-like behaviors and can serve as the key ingredients in closing the perception-action loop for animation, robotics, and embodied AI.
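To make the perception-as-interface idea concrete, below is a minimal sketch of what an egocentric-vision control policy of this kind could look like: a convolutional encoder over the humanoid's first-person RGB frame is fused with proprioception and mapped to a Gaussian distribution over whole-body joint actions, so tasks are specified entirely through what the policy sees rather than through privileged object state. This is not the authors' implementation; the class name, layer sizes, and observation/action dimensions are illustrative assumptions.

# Minimal sketch (illustrative, not the paper's architecture): an egocentric-vision
# policy mapping RGB frames plus proprioception to whole-body joint actions.
import torch
import torch.nn as nn

class EgocentricPolicy(nn.Module):
    def __init__(self, img_size=64, proprio_dim=69, action_dim=51):
        super().__init__()
        # Convolutional encoder for the egocentric RGB frame.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, 3, stride=1), nn.ReLU(),
            nn.Flatten(),
        )
        with torch.no_grad():
            feat_dim = self.encoder(torch.zeros(1, 3, img_size, img_size)).shape[1]
        # Fuse visual features with proprioception (joint angles, velocities, etc.).
        self.trunk = nn.Sequential(
            nn.Linear(feat_dim + proprio_dim, 512), nn.ReLU(),
            nn.Linear(512, 256), nn.ReLU(),
        )
        self.mu = nn.Linear(256, action_dim)                   # mean joint targets
        self.log_std = nn.Parameter(torch.zeros(action_dim))   # learned exploration noise

    def forward(self, rgb, proprio):
        z = self.encoder(rgb)                                  # (B, feat_dim)
        h = self.trunk(torch.cat([z, proprio], dim=-1))
        return self.mu(h), self.log_std.exp()                  # Gaussian action head

Consistent with the abstract's "training from scratch with reinforcement learning," such a policy could be optimized with a standard on-policy algorithm (e.g., PPO) against task rewards; the specific algorithm, reward design, and observation details are assumptions here rather than statements from the paper.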

@article{luo2025_2505.12278,
  title={Emergent Active Perception and Dexterity of Simulated Humanoids from Visual Reinforcement Learning},
  author={Zhengyi Luo and Chen Tessler and Toru Lin and Ye Yuan and Tairan He and Wenli Xiao and Yunrong Guo and Gal Chechik and Kris Kitani and Linxi Fan and Yuke Zhu},
  journal={arXiv preprint arXiv:2505.12278},
  year={2025}
}