Multi-hypothesis nearest-neighbor classifier based on class-conditional weighted distance metric

Lianmeng Jiao, Quan Pan*, Xiaoxue Feng

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)

Abstract

The performance of nearest-neighbor (NN) classifiers is known to be very sensitive to the distance metric used in classifying a query pattern, especially in scarce-prototype cases. In this paper, a class-conditional weighted (CCW) distance metric, which depends on the class labels of both the prototypes and the query patterns, is proposed. Compared with existing distance metrics, the proposed metric provides more flexibility in designing the feature weights, so that local specifics in the feature space can be well characterized. Based on the proposed CCW distance metric, a multi-hypothesis nearest-neighbor (MHNN) classifier is developed. The MHNN classifier classifies the query pattern under multiple hypotheses, with a nearest-neighbor sub-classifier implemented for each hypothesis using the CCW distance metric; the results of these sub-classifiers are then combined to obtain the final classification. Under this general scheme, a specific realization of the MHNN classifier is developed within the framework of Dempster-Shafer theory, owing to its good capability of representing and combining uncertain information. Two experiments based on synthetic and real data sets were carried out to show the effectiveness of the proposed technique.
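
To make the scheme concrete, the following is a minimal, hypothetical Python sketch of the ideas described in the abstract: a per-class weighted distance, one nearest-neighbor sub-classifier per class hypothesis, and Dempster's rule to combine the sub-classifier outputs. The specific choices here (inverse class-conditional feature variance as the weights, an exponential decay mapping the nearest-neighbor distance to a mass function, and Dempster's rule over singletons plus the whole frame) are illustrative assumptions and not the paper's actual CCW weights or MHNN realization.

```python
# Hypothetical sketch only: the weighting and mass construction below are
# illustrative assumptions, not the exact method of Jiao et al. (2015).
import numpy as np


def ccw_weights(X, y):
    """Per-class feature weights (assumption: inverse class-conditional variance)."""
    return {c: 1.0 / (X[y == c].var(axis=0) + 1e-9) for c in np.unique(y)}


def ccw_distance(x, p, w):
    """Weighted Euclidean distance between query x and prototype p under weights w."""
    return np.sqrt(np.dot(w, (x - p) ** 2))


def sub_classifier_mass(x, X, y, c, w, gamma=1.0):
    """NN sub-classifier under the hypothesis 'query belongs to class c': the
    distance to the nearest class-c prototype is turned into a simple mass
    function on the singleton {c} and the whole frame Theta (ignorance)."""
    d = min(ccw_distance(x, p, w) for p in X[y == c])
    support = np.exp(-gamma * d)          # closer prototype -> stronger support
    return {c: support, "Theta": 1.0 - support}


def dempster_combine(m1, m2, classes):
    """Dempster's rule for mass functions defined on singletons plus Theta."""
    m = {c: m1.get(c, 0.0) * m2.get(c, 0.0)
            + m1.get(c, 0.0) * m2["Theta"]
            + m1["Theta"] * m2.get(c, 0.0)
         for c in classes}
    m["Theta"] = m1["Theta"] * m2["Theta"]
    conflict = 1.0 - sum(m.values())      # mass lost to conflicting singletons
    return {k: v / (1.0 - conflict) for k, v in m.items()}


def mhnn_predict(x, X, y, gamma=1.0):
    """Combine one mass function per class hypothesis and pick the class
    with the largest combined singleton mass."""
    classes = np.unique(y)
    weights = ccw_weights(X, y)
    masses = [sub_classifier_mass(x, X, y, c, weights[c], gamma) for c in classes]
    combined = masses[0]
    for m in masses[1:]:
        combined = dempster_combine(combined, m, classes)
    return max(classes, key=lambda c: combined.get(c, 0.0))


# Toy usage with two well-separated Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(mhnn_predict(np.array([3.5, 4.2]), X, y))   # expected: 1
```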

Original language: English
Pages (from-to): 1468-1476
Number of pages: 9
Journal: Neurocomputing
Volume: 151
Issue number: P3
DOIs
Publication status: Published - 3 Mar 2015
Externally published: Yes

Keywords

  • Dempster-Shafer theory
  • Multi-hypothesis nearest-neighbor classifier
  • Pattern classification
  • Weighted distance metric
