Abstract
The connections between Shannon mutual information, Kullback-Leibler divergence and the Shannon inequality are investigated. Based on these connections and inequality theory, a new concept of generalized divergence is proposed and a corresponding definition is given. From this, a class of novel similarity measures for multimodal image registration is derived, including the arithmetic-geometric mean divergence, the Cauchy-Schwarz divergence and the Minkowski generalized divergence. The novel measures are applied to rigid registration of positron emission tomography (PET) and magnetic resonance (MR) image pairs. Their performance is compared with that of mutual information in terms of computation time, sensitivity to noise and the effect of image window size. The test results indicate that some of the novel similarity functions yield better performance, demonstrating that the proposed methods are efficient and effective.
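The measures discussed above are built from intensity distributions of the image pair: mutual information and the generalized divergences both compare the joint intensity distribution against the product of the marginals. As a minimal illustrative sketch (not the paper's implementation; the histogram-based estimator and bin count are assumptions), mutual information can be estimated from a joint intensity histogram, with Kullback-Leibler divergence as the underlying building block:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    nz = p > 0  # terms with p = 0 contribute nothing
    return float(np.sum(p[nz] * np.log(p[nz] / q[nz])))

def mutual_information(a, b, bins=32):
    """Shannon mutual information between two equally shaped images,
    estimated from their joint intensity histogram.

    MI(A, B) = D( p(a, b) || p(a) p(b) ), i.e. the KL divergence between
    the joint distribution and the product of the marginals.
    """
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()          # joint intensity distribution
    px = pxy.sum(axis=1)               # marginal of image A
    py = pxy.sum(axis=0)               # marginal of image B
    outer = np.outer(px, py)           # product of marginals
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / outer[nz])))
```

In a registration loop, such a measure is evaluated over candidate rigid transforms of the floating image and maximized; an image perfectly aligned with itself yields the highest score, while misaligned or independent images yield values near zero.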
Original language | English |
---|---|
Pages (from-to) | 156-161+184 |
Journal | Beijing Ligong Daxue Xuebao/Transaction of Beijing Institute of Technology |
Volume | 26 |
Issue number | 2 |
Publication status | Published - Feb 2006 |
Keywords
- Generalized divergence measures
- Inequality theory
- Kullback-Leibler divergence
- Multimodality image registration
- Registration measure