Calibration and implementation of a novel omnidirectional vision system for robot perception

Chang Li, Qing Shi, Chunbao Wang, Qiang Huang, Toshio Fukuda

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Motivated by the demand for detecting the surroundings of a humanoid robot, we developed an omnidirectional vision system for robot perception (OVROP) with 5 degrees of freedom (DOFs). OVROP consists of three main components: hardware, control architecture, and image processing. Equipped with compatible hardware and software interfaces, OVROP can be applied to a variety of robots. Using a black-box algorithm (BBA) and cylindrical coordinate transformation (CCT), OVROP provides undistorted omnidirectional perception of static surroundings. For omnidirectional tracking of moving objects, a subdivision method is used, and experimental results demonstrate its high efficiency. We have also confirmed that OVROP can perform 3D reconstruction of a desired target within 50 ms. Thus OVROP can provide detailed, full-range information about the surrounding environment in real time for robot perception.
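The cylindrical coordinate transformation mentioned in the abstract typically unwraps the ring-shaped image captured by an omnidirectional camera into a rectangular panorama. The paper does not give its exact formulation, so the following is a minimal illustrative sketch, assuming a circularly symmetric mirror image with a known center and inner/outer radii, and nearest-neighbour sampling; the function name and parameters are hypothetical.

```python
import numpy as np

def unwrap_cylindrical(omni, center, r_inner, r_outer, out_w=360, out_h=100):
    """Unwrap a ring-shaped omnidirectional image into a cylindrical panorama.

    omni    : 2-D numpy array (grayscale omnidirectional image)
    center  : (cx, cy) pixel coordinates of the mirror center (assumed known)
    r_inner : radius of the inner edge of the usable ring
    r_outer : radius of the outer edge of the usable ring
    Returns an (out_h, out_w) panorama sampled with nearest neighbours.
    """
    cx, cy = center
    pano = np.zeros((out_h, out_w), dtype=omni.dtype)
    for v in range(out_h):
        # each panorama row corresponds to one radius in the ring
        r = r_inner + (r_outer - r_inner) * v / max(out_h - 1, 1)
        for u in range(out_w):
            # each panorama column corresponds to one azimuth angle
            theta = 2.0 * np.pi * u / out_w
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            if 0 <= y < omni.shape[0] and 0 <= x < omni.shape[1]:
                pano[v, u] = omni[y, x]
    return pano
```

A real implementation would use bilinear interpolation and a calibrated mirror model (the role of the calibration step in the paper's title), but the polar-to-rectangular mapping above is the core of the transformation.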

Original language: English
Title of host publication: 2016 IEEE International Conference on Robotics and Biomimetics, ROBIO 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 589-594
Number of pages: 6
ISBN (Electronic): 9781509043644
DOIs
Publication status: Published - 2016
Event: 2016 IEEE International Conference on Robotics and Biomimetics, ROBIO 2016 - Qingdao, China
Duration: 3 Dec 2016 – 7 Dec 2016

Publication series

Name: 2016 IEEE International Conference on Robotics and Biomimetics, ROBIO 2016

Conference

Conference: 2016 IEEE International Conference on Robotics and Biomimetics, ROBIO 2016
Country/Territory: China
City: Qingdao
Period: 3/12/16 – 7/12/16
