Hyperspectral image classification for mapping agricultural tillage practices

Qiong Ran*, Wei Li, Qian Du, Chenghai Yang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

An efficient classification framework for mapping agricultural tillage practices from hyperspectral remote sensing imagery is proposed. It has the potential for practical deployment, providing rapid, accurate, and objective survey data for precision agricultural management and appraisal from large-scale remote sensing images. The framework comprises a local region filter [i.e., a Gaussian low-pass filter (GLF)] to extract spatial-spectral features, a dimensionality reduction step [i.e., local Fisher's discriminant analysis (LFDA)], and the traditional k-nearest neighbor (KNN) classifier; it is denoted GLF-LFDA-KNN. Like our previously used local average filter and adaptive weighted filter, the GLF considers spatial features in a small neighborhood, but it emphasizes the central pixel itself and is data-independent; it therefore strikes a balance between classification accuracy and computational complexity. The KNN classifier also has lower computational complexity than the traditional support vector machine (SVM). Once class separability is enhanced by the GLF and LFDA, the simpler KNN can outperform the SVM while the overall computational cost remains lower. The proposed framework also outperforms the SVM with composite kernel (SVM-CK), which likewise uses spatial-spectral features.
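The three stages described in the abstract can be sketched as follows. This is a minimal illustration on a synthetic data cube, not the authors' implementation: the filter width (sigma = 1.0), the number of neighbors (k = 5), the train/test split, and the use of scikit-learn's LinearDiscriminantAnalysis as a stand-in for LFDA (which scikit-learn does not provide) are all assumptions made here for demonstration.

```python
# Hedged sketch of a GLF -> discriminant reduction -> KNN pipeline on a
# synthetic hyperspectral cube. Parameter choices are illustrative only,
# and plain LDA substitutes for the paper's LFDA step.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic cube: rows x cols x bands, two spectrally shifted classes
# occupying the top and bottom halves of the scene.
rows, cols, bands = 20, 20, 50
labels = (np.arange(rows)[:, None] >= rows // 2).astype(int) * np.ones((rows, cols), int)
cube = rng.normal(labels[..., None] * 0.5, 1.0, size=(rows, cols, bands))

# 1) GLF: Gaussian low-pass filter applied spatially (not across bands),
#    fusing each pixel's spectrum with its small spatial neighborhood.
smoothed = gaussian_filter(cube, sigma=(1.0, 1.0, 0.0))

# 2) Dimensionality reduction: flatten pixels to (n_pixels, n_bands) and
#    project with LDA (stand-in for LFDA) using a 30% training subset.
X = smoothed.reshape(-1, bands)
y = labels.ravel()
train = rng.random(X.shape[0]) < 0.3
reducer = LinearDiscriminantAnalysis(n_components=1).fit(X[train], y[train])
X_red = reducer.transform(X)

# 3) KNN classification on the reduced spatial-spectral features.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_red[train], y[train])
accuracy = knn.score(X_red[~train], y[~train])
```

Because the spatial smoothing suppresses per-pixel noise before the discriminant projection, even this simple KNN classifier separates the two synthetic classes well, which mirrors the abstract's point that enhancing separability up front lets a less powerful classifier compete.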

Original language: English
Article number: 14722SS
Journal: Journal of Applied Remote Sensing
Volume: 9
Issue number: 1
Publication status: Published - 2015
Externally published: Yes

Keywords

  • agricultural remote sensing
  • conservation tillage
  • feature extraction
  • hyperspectral data
  • spatial-spectral classification
