EGNet: Efficient Robotic Grasp Detection Network

Sheng Yu, Di Hua Zhai*, Yuanqing Xia

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

In this article, a novel grasp detection network, the efficient grasp detection network (EGNet), is proposed to address the challenge of grasping in stacked scenes, completing the tasks of object detection, grasp detection, and manipulation relationship detection. For object detection, EGNet adopts ideas from EfficientDet, with some hyperparameters modified to help the robot detect and classify objects. For grasp detection, a novel grasp detection module is proposed, which takes the feature map from the bidirectional feature pyramid network (BiFPN) as input and outputs grasp positions together with their quality scores. For manipulation relationship analysis, the network takes the BiFPN feature map along with the object detection and grasp detection results, and outputs the best grasp position and the appropriate manipulation relationship. EGNet is trained and tested on the visual manipulation relationship dataset and the Cornell dataset, achieving detection accuracies of 87.1% and 98.9%, respectively. Finally, EGNet is validated in a physical grasping experiment on a Baxter robot. The experiment is performed in cluttered and stacked scenes, yielding success rates of 93.6% and 69.6%, respectively.
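The grasp detection module described in the abstract maps a BiFPN feature map to per-location grasp parameters and a quality score, then selects the highest-scoring grasp. The following is a minimal NumPy sketch of that idea only; the shapes, the linear heads, and all function names (`grasp_head`, `best_grasp`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def grasp_head(feature_map, w_params, w_quality):
    """Hypothetical grasp head (illustrative, not the paper's architecture).

    feature_map: (H, W, C) BiFPN-style feature map.
    w_params:    (C, 2) weights mapping features to (angle, width) per cell.
    w_quality:   (C,)  weights mapping features to a scalar quality logit.
    Returns per-cell grasp parameters and sigmoid quality scores.
    """
    params = feature_map @ w_params                          # (H, W, 2)
    quality = 1.0 / (1.0 + np.exp(-(feature_map @ w_quality)))  # (H, W)
    return params, quality

def best_grasp(params, quality):
    """Pick the grid cell with the highest quality score as the grasp."""
    y, x = np.unravel_index(np.argmax(quality), quality.shape)
    angle, width = params[y, x]
    return (x, y, angle, width), quality[y, x]

# Toy usage with random features standing in for BiFPN output.
rng = np.random.default_rng(0)
fm = rng.standard_normal((8, 8, 16))
w_p = rng.standard_normal((16, 2))
w_q = rng.standard_normal(16)
params, quality = grasp_head(fm, w_p, w_q)
grasp, score = best_grasp(params, quality)
```

In the actual network the heads would be learned convolutions and the selected grasp is further filtered by the manipulation relationship module before execution; this sketch only shows the "position plus quality score, pick the best" selection step.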

Original language: English
Pages (from-to): 4058-4067
Number of pages: 10
Journal: IEEE Transactions on Industrial Electronics
Volume: 70
Issue number: 4
DOIs
Publication status: Published - 1 Apr 2023

Keywords

  • Grasping detection
  • manipulation relationship detection
  • object detection
  • robot
