
Learning Visual Feedback Control for Dynamic Cloth Folding

10 September 2021
Julius Hietala, David Blanco Mulero, G. Alcan, Ville Kyrki
arXiv:2109.04771
Abstract

Robotic manipulation of cloth is a challenging task due to the high dimensionality of the configuration space and the complexity of the dynamics, which are affected by various material properties. The effect of the complex dynamics is even more pronounced in dynamic folding, for example, when a square piece of fabric is folded in two by a single manipulator. To account for the complexity and uncertainties, feedback on the cloth state, e.g. from vision, is typically needed. However, constructing visual feedback policies for dynamic cloth folding remains an open problem. In this paper, we present a solution that learns policies in simulation using Reinforcement Learning (RL) and transfers the learned policies directly to the real world. In addition, to learn a single policy that manipulates multiple materials, we randomize the material properties in simulation. We evaluate the contributions of visual feedback and material randomization in real-world experiments. The experimental results demonstrate that the proposed solution can successfully fold different fabric types using dynamic manipulation in the real world. Code, data, and videos are available at https://sites.google.com/view/dynamic-cloth-folding
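
The material-randomization idea can be illustrated with a short sketch. The snippet below is not the authors' code (that is linked above); the parameter names, sampling ranges, and the set_cloth_parameters call are illustrative assumptions about a generic cloth simulator, used only to show how fabric properties could be re-sampled at every training episode.

# Illustrative sketch of material-property randomization for sim-to-real
# RL training (NOT the authors' implementation; see the project page above).
# Parameter names and ranges are assumptions about a generic cloth simulator.
import random
from dataclasses import dataclass

@dataclass
class ClothMaterial:
    stiffness: float  # resistance to stretching/bending
    damping: float    # velocity-dependent energy dissipation
    friction: float   # cloth-table and cloth-cloth friction

def sample_cloth_material(rng: random.Random) -> ClothMaterial:
    """Sample one fabric from assumed uniform ranges."""
    return ClothMaterial(
        stiffness=rng.uniform(0.1, 5.0),
        damping=rng.uniform(0.01, 1.0),
        friction=rng.uniform(0.1, 1.0),
    )

def reset_episode(sim, rng: random.Random) -> None:
    """Apply freshly randomized material properties before each rollout.

    `sim.set_cloth_parameters` is a placeholder, not a real API: each
    simulator exposes its own interface for cloth material parameters.
    """
    m = sample_cloth_material(rng)
    sim.set_cloth_parameters(stiffness=m.stiffness,
                             damping=m.damping,
                             friction=m.friction)

if __name__ == "__main__":
    rng = random.Random(0)
    for episode in range(3):
        print(f"episode {episode}: {sample_cloth_material(rng)}")

Because a single policy is trained across many such sampled fabrics, it cannot overfit to one material and is more likely to transfer to real cloths whose properties are unknown, which is the role material randomization plays in the paper's sim-to-real pipeline.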
