A Fine-Grained Attention Model for High Accuracy Operational Robot Guidance

Yinghao Chu, Daquan Feng*, Zuozhu Liu, Lei Zhang, Zizhou Zhao, Zhenzhong Wang, Zhiyong Feng, Xiang Gen Xia

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Deep-learning-enhanced Internet of Things (IoT) is advancing the transformation toward smart manufacturing. Intelligent robot guidance is one of the most promising deep learning + IoT applications in the manufacturing industry. However, low cost, efficient computing, and extremely high localization accuracy are mandatory requirements for vision-based robot guidance, particularly in operational factories. Therefore, in this work, a low-cost edge-computing-based IoT system is developed around an innovative fine-grained attention model (FGAM). FGAM integrates a deep-learning-based attention model that detects the region of interest (ROI) with an optimized conventional computer vision model that performs fine-grained localization concentrated on that ROI. Trained with only 100 images collected from a real production line, the proposed FGAM outperforms multiple benchmark models when validated on operational data. The FGAM-based edge computing system has been deployed on a welding robot in a real-world factory for mass production. Over the assembly of about 6000 products, the deployed system achieved an average overall processing and transmission time of 200 ms and an overall localization accuracy of 99.998%.
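The coarse-to-fine pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the attention stage is stubbed with a brightest-pixel heuristic (the paper uses a trained deep model), and all function names and the window size are illustrative assumptions.

```python
# Hypothetical sketch of a two-stage coarse-to-fine guidance pipeline:
# stage 1 proposes a coarse region of interest (ROI), stage 2 runs a
# conventional fine-grained localizer only inside that ROI.

def coarse_roi(image, half=2):
    """Stub for the attention stage: pick the brightest pixel and return
    a (row0, row1, col0, col1) window around it, clamped to the image."""
    h, w = len(image), len(image[0])
    br, bc = max(((r, c) for r in range(h) for c in range(w)),
                 key=lambda rc: image[rc[0]][rc[1]])
    return (max(0, br - half), min(h, br + half + 1),
            max(0, bc - half), min(w, bc + half + 1))

def fine_localize(image, roi):
    """Stub for the conventional-CV stage: intensity-weighted centroid
    computed only over the ROI, giving a sub-pixel position."""
    r0, r1, c0, c1 = roi
    total = sum_r = sum_c = 0.0
    for r in range(r0, r1):
        for c in range(c0, c1):
            v = image[r][c]
            total += v
            sum_r += v * r
            sum_c += v * c
    return sum_r / total, sum_c / total

def guide(image):
    """Full pipeline: coarse ROI detection, then fine localization."""
    return fine_localize(image, coarse_roi(image))
```

The point of the split is that the expensive fine-grained computation touches only the small ROI rather than the whole frame, which is what makes high accuracy affordable on an edge device.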

Original language: English
Pages (from-to): 1066-1081
Number of pages: 16
Journal: IEEE Internet of Things Journal
Volume: 10
Issue number: 2
DOI: 10.1109/JIOT.2022.3206388
Publication status: Published - 15 Jan 2023
Externally published: Yes


Cite this

Chu, Y., Feng, D., Liu, Z., Zhang, L., Zhao, Z., Wang, Z., Feng, Z., & Xia, X. G. (2023). A Fine-Grained Attention Model for High Accuracy Operational Robot Guidance. IEEE Internet of Things Journal, 10(2), 1066-1081. https://doi.org/10.1109/JIOT.2022.3206388