Reorienting objects using extrinsic supporting items on a working platform is a meaningful yet challenging manipulation task, given the elaborate geometry of the objects and the limits of the robot's feasible motions. In this work, we propose a pipeline that predicts objects' stable placements afforded by supporting items from RGB-D camera perception, comprising a generation stage, a refinement stage, and a classification stage. We then construct manipulation graphs that encode shared grasp configurations for transforming objects between stable placements. Based on these graphs, the robot reorients objects through sequential pick-and-place operations. Experiments show that our approach is effective and efficient. Simulation experiments demonstrate that the pipeline generalizes to novel objects in random start poses on the working platform, generating diverse placements with high accuracy. Moreover, the manipulation graphs are conducive to providing collision-free motions for the robot to reorient objects. We also employ a robot in real-world experiments to perform sequential pick-and-place operations, indicating that our method can transfer objects' placement poses in real scenes.
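The manipulation-graph idea above can be illustrated with a minimal sketch: nodes are stable placements, and an edge connects two placements whenever they afford at least one shared grasp, so a single pick-and-place can move the object between them; a shortest reorientation sequence is then a breadth-first search over this graph. The placement names and the `grasps_for` callback here are hypothetical, not part of the paper's implementation.

```python
from collections import deque

def build_manipulation_graph(placements, grasps_for):
    """Connect placements that share at least one feasible grasp.

    grasps_for(p) -> set of grasp ids feasible at placement p (assumed
    interface; the paper's grasp configurations would play this role).
    """
    graph = {p: [] for p in placements}
    for a in placements:
        for b in placements:
            # A shared grasp lets one pick-and-place transform a into b.
            if a != b and grasps_for(a) & grasps_for(b):
                graph[a].append(b)
    return graph

def reorientation_sequence(graph, start, goal):
    """BFS for the shortest pick-and-place sequence from start to goal."""
    queue, visited = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # goal placement unreachable from start

# Toy usage: P0 and P2 share no grasp, so the object must pass through P1.
grasps = {"P0": {1, 2}, "P1": {2, 3}, "P2": {3, 4}}
g = build_manipulation_graph(list(grasps), lambda p: grasps[p])
print(reorientation_sequence(g, "P0", "P2"))  # ['P0', 'P1', 'P2']
```

In the paper's full pipeline, each edge would additionally be checked for collision-free robot motion before being kept in the graph.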