A Depth Estimation Method for Ground Moving Platforms via Detecting Region of Interest

Yifeng Xu, Yuanqing Xia*, Rui Hu, Wenjun Zhao, Jun Liao, Wei Gao

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Depth estimation is an essential part of decentralized coordinated control of multiple moving platforms, and many studies on depth reconstruction use machine learning methods to obtain depth information directly. However, the resulting target depth values carry high uncertainty, which leads to errors. This paper proposes a depth estimation algorithm for ground moving platforms that can quickly estimate the relative positions of their neighbors. The algorithm consists of two parts: a detection part, which uses a deep convolutional neural network to extract the region of interest (ROI), and a depth recovery part, which estimates the depth values of the points obtained from the feature extractor and processes only the features inside the ROI. The resulting 3D points are then fed into a depth optimizer to remove outliers. Finally, experimental results are presented to verify the effectiveness of the proposed depth estimation algorithm.
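The abstract does not give implementation details, but the two-part pipeline it describes (CNN-based ROI detection, depth recovery restricted to the ROI, then outlier removal) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: `detect_roi` stands in for the deep convolutional detector with a trivial intensity-based bounding box, depth is recovered with the standard stereo relation Z = f·B/d, and the "depth optimizer" is approximated by a median-absolute-deviation outlier filter.

```python
import numpy as np

def detect_roi(image):
    # Placeholder for the CNN detector described in the paper:
    # here we simply take the bounding box of above-mean pixels.
    ys, xs = np.nonzero(image > image.mean())
    return xs.min(), ys.min(), xs.max(), ys.max()  # (x0, y0, x1, y1)

def depth_from_disparity(disparities, focal_px, baseline_m):
    # Standard stereo depth relation: Z = f * B / d,
    # applied only to feature points inside the ROI.
    return focal_px * baseline_m / disparities

def remove_outliers(depths, k=3.0):
    # Reject points whose depth deviates more than k median absolute
    # deviations (MAD) from the median depth of the target.
    med = np.median(depths)
    mad = np.median(np.abs(depths - med)) + 1e-9  # avoid division by zero
    return depths[np.abs(depths - med) / mad <= k]

if __name__ == "__main__":
    # Synthetic example: three consistent disparities and one spurious match.
    disparities = np.array([10.0, 10.0, 10.0, 1.0])
    depths = depth_from_disparity(disparities, focal_px=700.0, baseline_m=0.1)
    print(remove_outliers(depths))  # spurious far point removed
```

The focal length, baseline, and MAD threshold above are arbitrary example values; in the actual system they would come from camera calibration and tuning.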

Original language: English
Title of host publication: Proceeding - 2021 China Automation Congress, CAC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3537-3542
Number of pages: 6
ISBN (Electronic): 9781665426473
DOIs
Publication status: Published - 2021
Event: 2021 China Automation Congress, CAC 2021 - Beijing, China
Duration: 22 Oct 2021 - 24 Oct 2021

Publication series

Name: Proceeding - 2021 China Automation Congress, CAC 2021

Conference

Conference: 2021 China Automation Congress, CAC 2021
Country/Territory: China
City: Beijing
Period: 22/10/21 - 24/10/21

Keywords

  • deep convolutional neural network
  • depth estimation
  • region of interest

