TY - JOUR
T1 - LCDL
T2 - Toward Dynamic Localization for Autonomous Landing of Unmanned Aerial Vehicle Based on LiDAR-Camera Fusion
AU - Xu, Yongkang
AU - Chen, Zhihua
AU - Deng, Chencheng
AU - Wang, Shoukun
AU - Wang, Junzheng
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - The detection, localization, and tracking of unmanned aerial vehicles (UAVs) are pivotal for ensuring reliable decision-making and intelligent control in scenarios involving heterogeneous agent cooperation. This article presents a dynamic localization framework based on asynchronous LiDAR-camera fusion, which provides absolute attitude and position observations of a UAV and achieves robust localization in outdoor environments. First, a fast search architecture based on depth clustering is presented to transform point clouds into range images and to extract targets from the range images using any two neighboring points. In addition, a neural network framework is introduced for UAV recognition, in which feature maps are fed into a region proposal network to obtain optimal proposals for object classification and bounding-box regression. Furthermore, a dual-servo turntable integrated with multiple sensors is designed to dynamically track the UAV's coordinates, keeping the vehicle centered within the detection area at all times. Finally, a heterogeneous agent system is employed to evaluate the UAV localization performance in real-world conditions. The results indicate that asynchronous LiDAR-camera fusion can run entirely on embedded devices and operate effectively in heterogeneous agent systems.
AB - The detection, localization, and tracking of unmanned aerial vehicles (UAVs) are pivotal for ensuring reliable decision-making and intelligent control in scenarios involving heterogeneous agent cooperation. This article presents a dynamic localization framework based on asynchronous LiDAR-camera fusion, which provides absolute attitude and position observations of a UAV and achieves robust localization in outdoor environments. First, a fast search architecture based on depth clustering is presented to transform point clouds into range images and to extract targets from the range images using any two neighboring points. In addition, a neural network framework is introduced for UAV recognition, in which feature maps are fed into a region proposal network to obtain optimal proposals for object classification and bounding-box regression. Furthermore, a dual-servo turntable integrated with multiple sensors is designed to dynamically track the UAV's coordinates, keeping the vehicle centered within the detection area at all times. Finally, a heterogeneous agent system is employed to evaluate the UAV localization performance in real-world conditions. The results indicate that asynchronous LiDAR-camera fusion can run entirely on embedded devices and operate effectively in heterogeneous agent systems.
KW - Asynchronous sensor
KW - dynamic localization
KW - information fusion
KW - unmanned aerial vehicles (UAVs)
UR - http://www.scopus.com/inward/record.url?scp=85199082590&partnerID=8YFLogxK
U2 - 10.1109/JSEN.2024.3424218
DO - 10.1109/JSEN.2024.3424218
M3 - Article
AN - SCOPUS:85199082590
SN - 1530-437X
VL - 24
SP - 26407
EP - 26415
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
IS - 16
ER -