Joint infrared target recognition and segmentation using a shape manifold-aware level set

Liangjiang Yu, Guoliang Fan*, Jiulu Gong, Joseph P. Havlicek

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)

Abstract

We propose new techniques for joint recognition, segmentation and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape-constrained generative model provides a multi-class, multi-view shape prior, and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables appear explicitly in the objective function, recognition, segmentation and pose estimation emerge naturally as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed on the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. In particular, GB-PSO outperforms other recent ATR algorithms that require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation).
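The abstract's core optimization step is particle swarm search over a multi-modal energy landscape. The following is a minimal, generic sketch of a global-best PSO minimizer, not the paper's GB-PSO or its CVIM energy function; the box bounds, inertia weight `w`, and acceleration constants `c1`/`c2` are illustrative assumptions, and the objective here is a simple stand-in for the level set energy.

```python
import math
import random

def pso(f, dim, bounds, n_particles=30, iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over a box via generic global-best PSO.

    Illustrative sketch only; hyperparameters are assumptions,
    not values from the paper.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    # Random initial positions, zero initial velocities.
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Position update, clipped to the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective (sphere function); the paper instead minimizes a
# CVIM-constrained level set energy over view/identity variables.
def sphere(x):
    return sum(xi * xi for xi in x)
```

In the paper's setting, each particle would encode the view and identity manifold coordinates (plus pose parameters), and `f` would evaluate the shape-constrained level set energy, so the swarm's best position yields recognition, segmentation and pose jointly.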

Original language: English
Pages (from-to): 10118-10145
Number of pages: 28
Journal: Sensors
Volume: 15
Issue number: 5
DOIs
Publication status: Published - 2015

Keywords

  • Infrared ATR
  • Level set
  • Particle swarm optimization
  • Shape modeling

