TY - GEN
T1 - Multi-Robot Collaborative Reasoning for Unique Person Recognition in Complex Environments
AU - Yang, Chule
AU - Yue, Yufeng
AU - Wen, Mingxing
AU - Wang, Yuanzhe
AU - Deng, Baosong
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/12/13
Y1 - 2020/12/13
N2 - The discovery of unique or suspicious people is essential for the active surveillance performed by security or patrol robots, and multi-robot collaboration with dynamic reasoning can further enhance their adaptability in large-scale environments. This paper proposes a hierarchical probabilistic reasoning framework for a multi-robot system to actively identify a unique person with distinct motion patterns in large-scale and dynamic environments. Linear and angular velocities are treated as typical motion patterns and are extracted by using heterogeneous sensors to detect and track people. First, single-robot reasoning is performed, in which each robot judges the uniqueness of people by comparing their motion patterns based on local observations. Meanwhile, multi-robot reasoning is also performed by fusing the perceptual information from the individual robots to form a global observation and then making another judgment based on it. Finally, each robot decides which result to adopt by comparing the beliefs of the local and global judgments. Experimental results show that the method is feasible in various environments.
AB - The discovery of unique or suspicious people is essential for the active surveillance performed by security or patrol robots, and multi-robot collaboration with dynamic reasoning can further enhance their adaptability in large-scale environments. This paper proposes a hierarchical probabilistic reasoning framework for a multi-robot system to actively identify a unique person with distinct motion patterns in large-scale and dynamic environments. Linear and angular velocities are treated as typical motion patterns and are extracted by using heterogeneous sensors to detect and track people. First, single-robot reasoning is performed, in which each robot judges the uniqueness of people by comparing their motion patterns based on local observations. Meanwhile, multi-robot reasoning is also performed by fusing the perceptual information from the individual robots to form a global observation and then making another judgment based on it. Finally, each robot decides which result to adopt by comparing the beliefs of the local and global judgments. Experimental results show that the method is feasible in various environments.
UR - http://www.scopus.com/inward/record.url?scp=85100103006&partnerID=8YFLogxK
U2 - 10.1109/ICARCV50220.2020.9305425
DO - 10.1109/ICARCV50220.2020.9305425
M3 - Conference contribution
AN - SCOPUS:85100103006
T3 - 16th IEEE International Conference on Control, Automation, Robotics and Vision, ICARCV 2020
SP - 369
EP - 374
BT - 16th IEEE International Conference on Control, Automation, Robotics and Vision, ICARCV 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 16th IEEE International Conference on Control, Automation, Robotics and Vision, ICARCV 2020
Y2 - 13 December 2020 through 15 December 2020
ER -