Abstract
Model structure selection remains an open problem when modeling data with Gaussian Mixture Models (GMMs). This paper proposes a discriminative method for selecting GMM structures for pattern classification. We introduce a GMM structure selection criterion based on a discriminative objective function called Soft target based Max-Min posterior Pseudo-probabilities (Soft-MMP). The structure and the parameters of the optimal GMM are estimated simultaneously by maximizing Laplace's approximation of the integrated Soft-MMP function, and a line search algorithm is employed to solve this optimization problem. We evaluate the proposed GMM structure selection method through handwritten digit recognition experiments on the well-known CENPARMI and MNIST digit databases. Our method outperforms both manual structure selection and generative counterparts, including the Bayesian Information Criterion (BIC), Minimum Description Length (MDL) and AutoClass. Furthermore, to the best of our knowledge, the digit classifier trained with our method achieves the lowest error rate reported so far on the CENPARMI database, and an error rate comparable to the best current results on the MNIST database.
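The paper's Soft-MMP criterion is not reproduced in this record, but the structure-selection problem it addresses can be illustrated with one of the generative baselines the abstract names. The sketch below, a simplified illustration rather than the paper's method, fits 1-D GMMs with EM for a range of component counts and picks the count that minimizes BIC; the quantile-based initialization and the candidate range `k_max` are implementation choices assumed here, not taken from the paper.

```python
import numpy as np

def fit_gmm_1d(x, k, n_iter=200):
    """Fit a 1-D Gaussian mixture with k components via EM; return the log-likelihood."""
    n = x.size
    # Deterministic initialization: spread the component means over data quantiles.
    means = np.quantile(x, (np.arange(k) + 1.0) / (k + 1.0))
    variances = np.full(k, x.var() + 1e-9)
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        dens = weights * np.exp(-0.5 * (x[:, None] - means) ** 2 / variances) \
               / np.sqrt(2.0 * np.pi * variances)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities.
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp * x[:, None]).sum(axis=0) / nk
        variances = (resp * (x[:, None] - means) ** 2).sum(axis=0) / nk + 1e-9
    # Recompute the log-likelihood with the final parameters.
    dens = weights * np.exp(-0.5 * (x[:, None] - means) ** 2 / variances) \
           / np.sqrt(2.0 * np.pi * variances)
    return np.log(dens.sum(axis=1)).sum()

def select_k_by_bic(x, k_max=4):
    """Generative structure selection: choose the component count minimizing BIC."""
    n = x.size
    best_k, best_bic = None, np.inf
    for k in range(1, k_max + 1):
        log_lik = fit_gmm_1d(x, k)
        p = 3 * k - 1  # free parameters: k-1 weights, k means, k variances
        bic = -2.0 * log_lik + p * np.log(n)
        if bic < best_bic:
            best_k, best_bic = k, bic
    return best_k
```

The discriminative method of the paper replaces the generic BIC score above with the Soft-MMP-based criterion, so that the chosen structure is tuned for classification rather than for density fit alone.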
| Original language | English |
| --- | --- |
| Pages (from-to) | 954-961 |
| Number of pages | 8 |
| Journal | Neurocomputing |
| Volume | 74 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 15 Feb 2011 |
Keywords
- Discriminative learning
- Finite Mixture Models (FMM)
- Gaussian Mixture Models (GMM)
- Max-Min posterior Pseudo-probabilities (MMP)
- Parameter estimation
- Structure selection