Generalized Maximum Likelihood Estimation for Perspective-n-Point Problem

Tian Zhan, Chunfeng Xu, Cheng Zhang*, Ke Zhu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The Perspective-n-Point (PnP) problem has been widely studied in the literature and applied in various vision-based pose estimation scenarios. However, most existing methods ignore the anisotropic uncertainty of observations, as demonstrated on several real-world datasets in this letter. This oversight may lead to suboptimal and inaccurate estimation, particularly in the presence of noisy observations. To this end, we propose a generalized maximum likelihood PnP solver, named GMLPnP, that minimizes the determinant criterion by iterating the generalized least squares procedure to estimate the pose and uncertainty simultaneously. Further, the proposed method is decoupled from the camera model. Results of synthetic and real experiments show that our method achieves better accuracy in common pose estimation scenarios: GMLPnP improves rotation/translation accuracy by 4.7%/2.0% on the TUM-RGBD dataset and by 18.6%/18.4% on the KITTI-360 dataset compared to the best baseline. It is also more accurate under very noisy observations in a vision-based UAV localization task, outperforming the best baseline by 34.4% in translation estimation accuracy.
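To illustrate the idea of iterating a generalized least squares (GLS) step while re-estimating the observation covariance, a minimal sketch is given below. This is not the authors' GMLPnP implementation; the pose parameterization, the covariance update from residuals, and the determinant-based convergence test are assumptions made purely for illustration.

```python
# Illustrative iterated GLS scheme for PnP with unknown anisotropic 2D noise
# covariance. NOT the authors' GMLPnP code; all details below are assumptions.
import numpy as np
from scipy.spatial.transform import Rotation
from scipy.optimize import least_squares

def project(pose, pts_3d, K):
    """Pinhole projection of world points under pose = (rotation vector, translation)."""
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    t = pose[3:]
    pc = pts_3d @ R.T + t                 # points in the camera frame
    uv = pc @ K.T
    return uv[:, :2] / uv[:, 2:3]         # perspective division

def igls_pnp(pts_3d, pts_2d, K, pose0, n_iter=10, tol=1e-8):
    """Alternate (1) GLS pose refinement under the current covariance and
    (2) a covariance update from the residuals, until det(Sigma) stabilizes."""
    pose = pose0.copy()
    Sigma = np.eye(2)                     # start from an isotropic covariance
    for _ in range(n_iter):
        L = np.linalg.cholesky(np.linalg.inv(Sigma))   # whitening factor

        def residuals(p):
            r = pts_2d - project(p, pts_3d, K)         # raw 2D residuals
            return (r @ L).ravel()                     # whitened residuals

        pose = least_squares(residuals, pose, method="lm").x
        r = pts_2d - project(pose, pts_3d, K)
        Sigma_new = r.T @ r / len(r)                   # sample covariance of residuals
        if abs(np.linalg.det(Sigma_new) - np.linalg.det(Sigma)) < tol:
            Sigma = Sigma_new
            break
        Sigma = Sigma_new
    return pose, Sigma
```

In this sketch the determinant of the residual covariance plays the role of the criterion being driven down across iterations, and the alternation between a whitened (GLS) pose solve and a covariance re-estimate mirrors, at a high level, the simultaneous pose and uncertainty estimation described in the abstract.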

Original language: English
Pages (from-to): 1752-1759
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 10
Issue number: 2
DOIs
Publication status: Published - 2025

Keywords

  • Localization
  • probability and statistical methods
  • vision-based navigation
