A multi-modal pattern classification framework for hyperspectral image analysis

Wei Li*, Saurabh Prasad, James E. Fowler, Lori M. Bruce

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

2 Citations (Scopus)

Abstract

Dimensionality reduction is a crucial preprocessing step for effective analysis of high-dimensional hyperspectral imagery (HSI). Currently popular dimensionality-reduction techniques (such as Principal Component Analysis, Linear Discriminant Analysis, and their many variants) assume that the data are Gaussian distributed. The quadratic maximum-likelihood classifier commonly employed for HSI analysis likewise assumes Gaussian class-conditional distributions. In this paper, we propose a classification paradigm designed to exploit the rich statistical structure of hyperspectral data. It does not make the Gaussian assumption, and it performs effective dimensionality reduction and classification of highly non-Gaussian, multi-modal HSI data. The framework employs Local Fisher Discriminant Analysis (LFDA) to reduce the dimensionality of the data while preserving its multi-modal structure. This is followed by a Gaussian Mixture Model (GMM) classifier for effective classification of the reduced-dimensional multi-modal data. Experimental results on a multi-class HSI classification task show that the proposed approach significantly outperforms conventional approaches.
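The two-stage pipeline the abstract describes (LFDA for multi-modality-preserving dimensionality reduction, then per-class GMM maximum-likelihood classification) can be sketched as below. The paper itself provides no code, so this is a minimal illustration under stated assumptions: a plain-numpy LFDA following Sugiyama's local-scaling formulation, scikit-learn's `GaussianMixture` as the class-conditional density model, and synthetic bimodal data standing in for real hyperspectral pixels. Parameter choices (`knn`, `reg`, number of mixture components) are illustrative, not the authors'.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.mixture import GaussianMixture

def lfda(X, y, r=2, knn=7, reg=1e-4):
    """Minimal Local Fisher Discriminant Analysis: returns a (d, r) projection."""
    n, d = X.shape
    diff = X[:, None, :] - X[None, :, :]
    sq = (diff ** 2).sum(-1)                                  # pairwise squared distances
    sigma = np.sort(np.sqrt(sq), axis=1)[:, min(knn, n - 1)]  # local scaling per sample
    A = np.exp(-sq / (np.outer(sigma, sigma) + 1e-12))        # affinity matrix
    Ww = np.zeros((n, n))                                     # local within-class weights
    Wb = np.full((n, n), 1.0 / n)                             # local between-class weights
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        nc = idx.size
        ii, jj = np.meshgrid(idx, idx, indexing="ij")
        Ww[ii, jj] = A[ii, jj] / nc
        Wb[ii, jj] = A[ii, jj] * (1.0 / n - 1.0 / nc)
    # Scatter matrices via graph Laplacians: S = X^T (D - W) X
    Sw = X.T @ (np.diag(Ww.sum(1)) - Ww) @ X + reg * np.eye(d)  # regularized, so PD
    Sb = X.T @ (np.diag(Wb.sum(1)) - Wb) @ X
    _, vecs = eigh((Sb + Sb.T) / 2, Sw)        # generalized eigenproblem, ascending
    return vecs[:, ::-1][:, :r]                # top-r discriminant directions

# --- Synthetic stand-in for multi-modal HSI classes: two modes per class ---
rng = np.random.default_rng(0)
d = 5
def modes(centers, n_per):
    return np.vstack([c + rng.normal(scale=0.5, size=(n_per, d)) for c in centers])

c0 = [np.zeros(d), np.r_[8.0, np.zeros(d - 1)]]
c1 = [np.r_[4.0, 4.0, np.zeros(d - 2)], np.r_[-4.0, 4.0, np.zeros(d - 2)]]
X = np.vstack([modes(c0, 100), modes(c1, 100)])
y = np.r_[np.zeros(200), np.ones(200)]

perm = rng.permutation(400)
tr, te = perm[:300], perm[300:]
T = lfda(X[tr], y[tr], r=2)                    # fit projection on training data only
Ztr, Zte = X[tr] @ T, X[te] @ T

# One GMM per class in the reduced space; classify by maximum log-likelihood
# (class priors are roughly equal here, so they are omitted from the score).
models = {c: GaussianMixture(n_components=2, covariance_type="full",
                             reg_covar=1e-4, random_state=0).fit(Ztr[y[tr] == c])
          for c in (0.0, 1.0)}
scores = np.column_stack([models[c].score_samples(Zte) for c in (0.0, 1.0)])
pred = scores.argmax(1).astype(float)
acc = (pred == y[te]).mean()
print(f"test accuracy: {acc:.2f}")
```

Because LFDA keeps same-class modes apart rather than collapsing them to a single mean (as classical LDA would), the per-class GMMs in the projected space can model each mode with its own component; this is the property the paper relies on for non-Gaussian, multi-modal HSI classes.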

Original language: English
Article number: 6080894
Journal: Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing
DOIs
Publication status: Published - 2011
Externally published: Yes
Event: 3rd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, WHISPERS 2011 - Lisbon, Portugal
Duration: 6 Jun 2011 – 9 Jun 2011

Keywords

  • Dimensionality reduction
  • Gaussian mixture model
  • Hyperspectral data
