Autonomous navigation algorithm for precision landing based on computer vision

Yang Tian*, Pingyuan Cui, Hutao Cui

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

In this paper we propose a vision-based algorithm for a deep-space exploration spacecraft to estimate its relative position and attitude on board during the descent phase. The algorithm has two components: a relative motion recovery step, which provides a partial estimate of the motion state by tracking features through the monocular image sequence, and a landmark-recognition-based step, which supplies the scale of the relative motion and the absolute position of the spacecraft. Results on synthetic images show that the proposed algorithm estimates the state with satisfactory accuracy.
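The abstract notes that monocular motion recovery alone leaves the translation known only up to scale, and that recognizing a mapped landmark (e.g. a crater of known absolute position) fixes that scale. A minimal sketch of this idea, with all names and the single-landmark setup being illustrative assumptions rather than the paper's actual formulation:

```python
import math

def vector_norm(v):
    """Euclidean norm of a 3-vector given as a sequence."""
    return math.sqrt(sum(x * x for x in v))

def resolve_monocular_scale(landmark_cam_unscaled, spacecraft_abs_pos, landmark_map_pos):
    """Recover the metric scale of an up-to-scale monocular reconstruction
    from one recognized landmark with a known absolute (map) position.

    landmark_cam_unscaled : landmark position in the camera frame from the
        monocular motion recovery (arbitrary units, unknown scale).
    spacecraft_abs_pos, landmark_map_pos : absolute positions (metres);
        their separation is the true camera-to-landmark range.
    Returns the factor that converts reconstruction units to metres.
    """
    true_range = vector_norm(
        [m - s for m, s in zip(landmark_map_pos, spacecraft_abs_pos)]
    )
    estimated_range = vector_norm(landmark_cam_unscaled)
    return true_range / estimated_range

# Example: the reconstruction places the landmark 3 units away,
# while the map says the true range is 30 m, so the scale is 10 m/unit.
scale = resolve_monocular_scale([1.0, 2.0, 2.0], [0.0, 0.0, 0.0], [10.0, 20.0, 20.0])
```

In practice a landing navigator would fuse many such landmark observations recursively (the keywords list "recursive estimation"), but a single known range is already enough to pin down the scale ambiguity.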

Original language: English
Title of host publication: International Symposium on Photoelectronic Detection and Imaging 2009 - Advances in Imaging Detectors and Applications
DOIs
Publication status: Published - 2009
Externally published: Yes
Event: International Symposium on Photoelectronic Detection and Imaging 2009: Advances in Imaging Detectors and Applications - Beijing, China
Duration: 17 Jun 2009 – 19 Jun 2009

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 7384
ISSN (Print): 0277-786X

Conference

Conference: International Symposium on Photoelectronic Detection and Imaging 2009: Advances in Imaging Detectors and Applications
Country/Territory: China
City: Beijing
Period: 17/06/09 – 19/06/09

Keywords

  • Autonomous navigation
  • Computer vision
  • Crater recognition
  • Landing
  • Recursive estimation
