Learning Generalizable Language-Conditioned Cloth Manipulation from Long Demonstrations

6 March 2025
Hanyi Zhao
Jinxuan Zhu
Zihao Yan
Yichen Li
Yuhong Deng
Xueqian Wang
Abstract

Multi-step cloth manipulation is a challenging problem for robots due to the high-dimensional state space and complex dynamics of cloth. Despite significant recent advances in end-to-end imitation learning of multi-step cloth manipulation skills, these methods fail to generalize to unseen tasks. Our key insight for tackling generalizable multi-step cloth manipulation is decomposition. We propose a novel pipeline that autonomously learns basic skills from long demonstrations and composes the learned skills to generalize to unseen tasks. Specifically, our method first discovers and learns basic skills from an existing long-demonstration benchmark using the commonsense knowledge of a large language model (LLM). Then, leveraging a high-level LLM-based task planner, these basic skills are composed to complete unseen tasks. Experimental results demonstrate that our method outperforms baseline methods in learning multi-step cloth manipulation skills on both seen and unseen tasks.
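The two-stage pipeline the abstract describes can be sketched in miniature. The skill segmentation, the skill library, and the planner below are illustrative stand-ins (the LLM calls are stubbed with toy functions), not the authors' implementation:

```python
# Hypothetical sketch of the pipeline from the abstract:
# (1) segment a long demonstration into named basic skills,
# (2) compose learned skills into a plan for an unseen instruction.

from dataclasses import dataclass


@dataclass
class Skill:
    name: str   # e.g. "flatten", "fold", "drag"
    start: int  # first frame of the segment in the demonstration
    end: int    # last frame of the segment


def segment_demonstration(actions, llm_label):
    """Split a long demonstration into contiguous skill segments.

    `actions` is a per-frame list of low-level action labels;
    `llm_label` stands in for an LLM call that names the skill
    a segment performs.
    """
    skills, start = [], 0
    for i in range(1, len(actions) + 1):
        if i == len(actions) or actions[i] != actions[start]:
            skills.append(Skill(llm_label(actions[start]), start, i - 1))
            start = i
    return skills


def plan_task(instruction, skill_library, llm_plan):
    """Compose learned skills into a plan for an unseen instruction.

    `llm_plan` stands in for a high-level LLM planner that maps the
    instruction to an ordered list of skill names; the plan is checked
    against the library of learned skills.
    """
    plan = llm_plan(instruction)
    assert all(name in skill_library for name in plan), "unknown skill"
    return plan


# Toy stand-ins for the LLM calls:
label = {"a": "flatten", "b": "fold", "c": "drag"}.get
demo = ["a", "a", "b", "b", "b", "c"]
library = {s.name for s in segment_demonstration(demo, label)}
plan = plan_task("fold the shirt in half",
                 library,
                 lambda _: ["flatten", "fold"])
print(plan)  # ['flatten', 'fold']
```

In the paper's setting the segmenter and planner would be prompted LLMs and each skill would wrap a learned low-level policy; this sketch only shows the control flow of decomposition followed by composition.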

@article{zhao2025_2503.04557,
  title={Learning Generalizable Language-Conditioned Cloth Manipulation from Long Demonstrations},
  author={Hanyi Zhao and Jinxuan Zhu and Zihao Yan and Yichen Li and Yuhong Deng and Xueqian Wang},
  journal={arXiv preprint arXiv:2503.04557},
  year={2025}
}