Novel generalized divergence measures and multimodal medical image registration

Yong Gang Shi*, Mou Yan Zou

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

The connections among Shannon mutual information, Kullback-Leibler divergence, and the Shannon inequality are investigated. Based on these connections and on inequality theory, a new concept of generalized divergence is proposed and formally defined. From it, a class of novel similarity measures for multimodal image registration is derived, including the arithmetic-geometric mean divergence, the Cauchy-Schwarz divergence, and the Minkowski generalized divergence. The novel measures are applied to rigid registration of positron emission tomography (PET) and magnetic resonance (MR) image pairs, and their performance is compared with mutual information in terms of computation time, sensitivity to noise, and the effect of image window size. The test results indicate that some of the novel similarity functions yield better performance, demonstrating that the proposed methods are efficient and effective.
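To make the comparison concrete, the sketch below shows how such measures are typically evaluated from a joint intensity histogram: mutual information is the Kullback-Leibler divergence between the joint distribution and the product of its marginals, and the arithmetic-geometric mean divergence can be applied to the same pair of distributions. Note this is an illustrative reconstruction under generic definitions, not the paper's exact formulation; the histogram bin count and the AGM divergence form (Taneja's sum of m·log(m/g) with m the arithmetic and g the geometric mean) are assumptions.

```python
import numpy as np

def joint_pmf(a, b, bins=32):
    """Normalized joint intensity histogram of two same-shaped images."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    return h / h.sum()

def mutual_information(p_ab):
    """Shannon MI: KL divergence of the joint pmf from the product of marginals."""
    p_a = p_ab.sum(axis=1, keepdims=True)   # marginal of image A
    p_b = p_ab.sum(axis=0, keepdims=True)   # marginal of image B
    q = p_a @ p_b                           # product-of-marginals pmf
    mask = p_ab > 0                         # 0*log(0) terms contribute nothing
    return float((p_ab[mask] * np.log(p_ab[mask] / q[mask])).sum())

def agm_divergence(p_ab):
    """Arithmetic-geometric mean divergence between the joint pmf and the
    product of marginals (generic Taneja-style form, assumed here):
    sum over bins of m*log(m/g), m = (p+q)/2, g = sqrt(p*q)."""
    p_a = p_ab.sum(axis=1, keepdims=True)
    q = p_a @ p_ab.sum(axis=0, keepdims=True)
    m = 0.5 * (p_ab + q)                    # arithmetic mean of the two pmfs
    g = np.sqrt(p_ab * q)                   # geometric mean of the two pmfs
    mask = g > 0
    return float((m[mask] * np.log(m[mask] / g[mask])).sum())
```

In a registration loop, one image is transformed, the joint pmf is recomputed, and the chosen measure is maximized over the rigid-transform parameters; both functions above return 0 when the two images are statistically independent and grow as their intensities become more predictable from one another.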

Original language: English
Pages (from-to): 156-161+184
Journal: Beijing Ligong Daxue Xuebao/Transaction of Beijing Institute of Technology
Volume: 26
Issue number: 2
Publication status: Published - Feb 2006

Keywords

  • Generalized divergence measures
  • Inequality theory
  • Kullback-Leibler divergence
  • Multimodality image registration
  • Registration measure
