EGNet: Efficient Robotic Grasp Detection Network

Sheng Yu, Di Hua Zhai*, Yuanqing Xia

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)

Abstract

In this article, a novel grasp detection network, the efficient grasp detection network (EGNet), is proposed to address the challenge of grasping in stacked scenes; it completes the tasks of object detection, grasp detection, and manipulation relationship detection. For object detection, EGNet adopts the design of EfficientDet, with some hyperparameters modified so that the robot can detect and classify objects. For grasp detection, a novel grasp detection module is proposed, which takes the feature map from the bidirectional feature pyramid network (BiFPN) as input and outputs the grasp position and its quality score. For manipulation relationship analysis, the network takes the BiFPN feature map together with the object detection and grasp detection results, and outputs the best grasp position and the appropriate manipulation relationship. EGNet is trained and tested on the visual manipulation relationship dataset and the Cornell dataset, achieving detection accuracies of 87.1% and 98.9%, respectively. Finally, EGNet is also evaluated in a practical setting through grasp experiments on a Baxter robot. The experiments are performed in cluttered and stacked scenes, with success rates of 93.6% and 69.6%, respectively.
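The abstract describes a grasp detection module that consumes a BiFPN feature map and predicts a grasp position together with a quality score. The sketch below is a minimal, illustrative PyTorch head in that spirit; the layer sizes, channel counts, and the 5-parameter grasp encoding (x, y, width, height, angle) are assumptions for demonstration and are not taken from the paper's implementation.

```python
# Minimal sketch of a grasp-detection head that takes a BiFPN feature map and
# outputs per-location grasp parameters plus a quality score.
# All sizes and the grasp parameterization are illustrative assumptions.
import torch
import torch.nn as nn


class GraspHead(nn.Module):
    """Predicts grasp parameters and a quality score from a BiFPN feature map."""

    def __init__(self, in_channels: int = 64, hidden: int = 128):
        super().__init__()
        self.shared = nn.Sequential(
            nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # 5 regression channels: x, y, width, height, rotation angle (assumed encoding).
        self.grasp_reg = nn.Conv2d(hidden, 5, kernel_size=1)
        # 1 channel: grasp quality score squashed to [0, 1].
        self.quality = nn.Conv2d(hidden, 1, kernel_size=1)

    def forward(self, bifpn_feat: torch.Tensor):
        x = self.shared(bifpn_feat)
        return self.grasp_reg(x), torch.sigmoid(self.quality(x))


if __name__ == "__main__":
    # Dummy BiFPN feature map: batch of 2, 64 channels, 40x40 spatial grid.
    feat = torch.randn(2, 64, 40, 40)
    grasps, scores = GraspHead()(feat)
    print(grasps.shape, scores.shape)  # (2, 5, 40, 40) and (2, 1, 40, 40)
```

In practice, the highest-scoring location in the quality map would be selected and its regressed parameters decoded into a grasp rectangle; here that selection step is omitted for brevity.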

Original language: English
Pages (from-to): 4058-4067
Number of pages: 10
Journal: IEEE Transactions on Industrial Electronics
Volume: 70
Issue number: 4
DOI
Publication status: Published - 1 Apr 2023
