Abstract
Dimension reduction is frequently adopted as a data preprocessing technique to facilitate data visualization, interpretation, and classification. Traditional dimension reduction methods such as linear discriminant analysis focus on maximizing the overall discrimination between all classes, which can easily be affected by outliers. To overcome this disadvantage, this paper proposes a novel method for multiclass dimension reduction, named dimension reduction by minimum error minimax probability machine (DR-MEMPM). It explicitly ensures that each pair of classes is well separated in the projected subspace by utilizing the separation probability between different pairwise classes. It can therefore put more emphasis on the less distinguishable classes, and the learned projection will not be dominated by 'outlier' classes that lie far away from the other classes. We evaluate the proposed DR-MEMPM on a number of synthetic and real-world data sets and show that it outperforms other state-of-the-art dimension reduction methods in terms of visual intuition and classification accuracy, especially when the distances between classes are unevenly distributed.
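To make the pairwise-separation idea concrete, the sketch below illustrates one way such a criterion can be scored; it is not the paper's DR-MEMPM optimization, only a minimal stand-in. For a candidate projection direction w, the MPM-style worst-case separability of classes i and j is kappa = |w^T(mu_i - mu_j)| / (sqrt(w^T S_i w) + sqrt(w^T S_j w)), computed from per-class means and covariances, and a crude maximin search over random unit directions keeps the w whose least-separated pair scores best. The function names, the random-search strategy, and the toy data are all illustrative assumptions.

```python
import numpy as np

def pairwise_separability(w, X, y):
    """MPM-style worst-case separability kappa for every class pair along
    direction w: kappa = |w^T(mu_i - mu_j)| / (sqrt(w^T S_i w) + sqrt(w^T S_j w)).
    Larger kappa means the pair is easier to separate after projection onto w."""
    classes = np.unique(y)
    stats = {c: (X[y == c].mean(axis=0), np.cov(X[y == c], rowvar=False))
             for c in classes}
    kappas = []
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            mi, Si = stats[ci]
            mj, Sj = stats[cj]
            num = abs(w @ (mi - mj))
            den = np.sqrt(w @ Si @ w) + np.sqrt(w @ Sj @ w) + 1e-12
            kappas.append(num / den)
    return np.array(kappas)

def maximin_direction(X, y, n_candidates=2000, seed=0):
    """Among random unit directions, keep the one whose *worst* pairwise
    separability is largest -- a toy stand-in for maximizing the minimum
    pairwise separation probability (illustrative only, not DR-MEMPM)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    best_w, best_score = None, -np.inf
    for _ in range(n_candidates):
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)
        score = pairwise_separability(w, X, y).min()
        if score > best_score:
            best_w, best_score = w, score
    return best_w, best_score

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Three toy Gaussian classes in 5-D; one class lies far from the other two.
    X = np.vstack([rng.normal(m, 1.0, size=(50, 5))
                   for m in ([0] * 5, [2] * 5, [10] * 5)])
    y = np.repeat([0, 1, 2], 50)
    w, score = maximin_direction(X, y)
    print("worst pairwise separability:", round(score, 3))
```

Scoring the projection by its worst class pair mirrors the abstract's point: a class that is already far from the rest cannot dominate the objective, since the binding constraint is the hardest-to-separate pair. The paper itself optimizes separation probabilities through MEMPM rather than the random search used in this sketch.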
| Original language | English |
| --- | --- |
| Article number | 7478669 |
| Pages (from-to) | 58-69 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Systems, Man, and Cybernetics: Systems |
| Volume | 47 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2017 |
| Externally published | Yes |
Keywords
- Data handling
- feature extraction
- minimum error minimax probability machine (MEMPM)
- multiclass
- optimization methods
- probability
- supervised dimension reduction