Annual Conference on Towards Autonomous Robotic Systems

GarmNet: Improving Global with Local Perception for Robotic Laundry Folding



Abstract

Developing autonomous assistants to help with domestic tasks is a vital topic in robotics research. Among these tasks, garment folding remains far from solved, mainly because of the large number of configurations a crumpled piece of clothing may exhibit. Prior research has either estimated the pose of the garment as a whole or detected grasping landmarks separately. Such approaches, however, constrain a robot's ability to perceive the state of the garment by limiting the learned representation to a single task. In this paper, we propose GarmNet, a novel end-to-end deep learning model that simultaneously localizes the garment and detects landmarks for grasping. Garment localization provides the global information needed to recognize the garment's category, whereas landmark detection facilitates subsequent grasping actions. We train and evaluate the proposed GarmNet model on the CloPeMa Garment dataset, which contains 3,330 images of different garment types in different poses. The experiments show that including landmark detection (GarmNet-B) substantially improves garment localization, reducing the error rate by 24.7%. Solutions such as ours are important for robotics applications, as they scale to many classes while remaining memory- and processing-efficient.
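The multi-task design described above — one shared perception backbone feeding a global garment-localization head and a local landmark-detection head — can be illustrated with a minimal sketch. This is a hypothetical PyTorch illustration, not the authors' architecture: the layer sizes, head shapes, and class/landmark counts are assumptions chosen for clarity.

```python
import torch
import torch.nn as nn

class GarmNetSketch(nn.Module):
    """Hypothetical sketch of a shared-backbone multi-task network:
    a global head for garment category + bounding box, and a local
    head regressing (x, y) coordinates of grasping landmarks."""

    def __init__(self, num_classes: int = 10, num_landmarks: int = 8):
        super().__init__()
        # Shared convolutional backbone (stand-in for any CNN encoder)
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Global head: class logits plus a (x, y, w, h) box
        self.loc_head = nn.Linear(64, num_classes + 4)
        # Local head: 2 coordinates per grasping landmark
        self.landmark_head = nn.Linear(64, num_landmarks * 2)

    def forward(self, x: torch.Tensor):
        feat = self.backbone(x)
        return self.loc_head(feat), self.landmark_head(feat)

model = GarmNetSketch()
loc_out, lm_out = model(torch.randn(1, 3, 224, 224))
print(loc_out.shape, lm_out.shape)
```

Training both heads jointly (e.g. summing a classification loss on `loc_out` with a regression loss on `lm_out`) is what lets the landmark task regularize the localization task, which is the effect the abstract reports for GarmNet-B.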
