Abstract
Because labeling all samples by hand is time-consuming and tedious, classification tasks often come with a large number of unlabeled examples and only a small number of labeled ones; learning in this setting is called semi-supervised learning. In this paper, we propose a novel semi-supervised learning method, the Laplacian mixed-norm proximal support vector machine (Lap-MNPSVM for short). In the optimization problem of Lap-MNPSVM, the information from the unlabeled examples is used in the form of Laplacian regularization, and an lp-norm (0 < p < 1) regularizer is introduced into the standard proximal support vector machine to control sparsity and perform feature selection. To solve the resulting nonconvex optimization problem, an efficient algorithm is proposed that solves a series of systems of linear equations, and lower bounds on the solution are established, which are extremely helpful for feature selection. Experiments carried out on synthetic and real-world datasets show the feasibility and effectiveness of the proposed method.
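The abstract describes the method only at a high level: a proximal-SVM-style squared loss on the labeled points, a graph-Laplacian term built from all points, and an lp (0 < p < 1) penalty handled so that each step reduces to a system of linear equations. The sketch below illustrates that idea with an iteratively reweighted solver; it is not the authors' code, and all names, parameter values, and the specific reweighting scheme are illustrative assumptions.

```python
# Minimal illustrative sketch of a Laplacian-regularized, lp-penalized proximal
# SVM solved by a sequence of linear systems (assumed formulation, not the paper's).
import numpy as np

def graph_laplacian(X, sigma=1.0):
    """Dense RBF similarity graph over all (labeled + unlabeled) points
    and its unnormalized Laplacian L = D - W."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(1)) - W

def lap_mnpsvm_sketch(X_lab, y_lab, X_unlab, p=0.5, C1=1.0, C2=0.1,
                      n_iter=30, eps=1e-6):
    """Returns (w, b) for the decision function f(x) = w @ x + b.
    y_lab is expected to take values in {-1, +1}."""
    X_all = np.vstack([X_lab, X_unlab])
    L = graph_laplacian(X_all)

    # Augment with a constant column so one linear system solves for [w; b] jointly.
    A_lab = np.hstack([X_lab, np.ones((X_lab.shape[0], 1))])
    A_all = np.hstack([X_all, np.ones((X_all.shape[0], 1))])
    d = X_lab.shape[1]

    z = np.zeros(d + 1)
    r = np.ones(d)  # first pass: plain ridge-like weights
    for _ in range(n_iter):
        D = np.diag(np.append(r, 0.0))  # no sparsity penalty on the bias term
        # Quadratic surrogate objective: solve (D + C1*A'A + C2*A'LA) z = C1*A'y
        H = D + C1 * A_lab.T @ A_lab + C2 * A_all.T @ L @ A_all
        z = np.linalg.solve(H, C1 * A_lab.T @ y_lab)
        # Reweighting for the lp penalty: |w_j|^p is locally approximated by a
        # quadratic with weight p * |w_j|^(p-2); small coefficients receive large
        # weights and are driven toward zero, which is the source of sparsity.
        r = p * (np.abs(z[:d]) + eps) ** (p - 2)
    return z[:d], z[d]
```

With labels in {-1, +1}, a new point x would be classified by the sign of w @ x + b, and features whose weights shrink to (near) zero would be discarded. The values of sigma, C1, C2, and p above are placeholders; the paper's actual formulation, solver, and lower-bound analysis may differ in detail.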
Original language | English |
---|---|
Pages (from-to) | 399-407 |
Number of pages | 9 |
Journal | Neural Computing and Applications |
Volume | 26 |
Issue number | 2 |
DOIs | |
Publication status | Published - Feb 2014 |
Keywords
- Manifold regularization
- Mixed-norm
- Proximal support vector machine
- Semi-supervised classification
- Sparsity