"Mixed-Integer Support Vector Machine"
W. Guan, A. Gray, and S. Leyffer
Preprint Version: [pdf]
In this paper, we propose a formulation of a feature-selecting support vector machine based on the L0-norm. We explore a perspective relaxation of the optimization problem and solve it using mixed-integer nonlinear programming (MINLP) techniques. Given a training set of labeled data instances, we construct a max-margin classifier that minimizes both the hinge loss and the cardinality of the weight vector of the separating hyperplane, ||w||_0, thereby performing feature selection and classification simultaneously in a single optimization. We compare this relaxation with the standard SVM, recursive feature elimination (RFE), the L1-norm SVM, and two approximate L0-norm SVM methods, and show promising results on real-world datasets in terms of both accuracy and sparsity.
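To make the cardinality-penalized objective concrete, the following is an illustrative mixed-integer formulation using big-M indicator constraints. This is a common modeling device for the L0 penalty and is not necessarily the perspective formulation developed in the paper; the constant M (a bound on |w_j|) and the trade-off parameter C are assumptions for the sketch. Binary variables z_j switch each feature's weight on or off, so that the sum of the z_j equals ||w||_0 at optimality:

```latex
\begin{aligned}
\min_{w,\,b,\,\xi,\,z} \quad & C \sum_{i=1}^{n} \xi_i \;+\; \sum_{j=1}^{d} z_j \\
\text{s.t.} \quad & y_i \left( w^\top x_i + b \right) \ge 1 - \xi_i,
    \qquad \xi_i \ge 0, \qquad i = 1, \dots, n, \\
& -M z_j \le w_j \le M z_j, \qquad z_j \in \{0,1\}, \qquad j = 1, \dots, d.
\end{aligned}
```

Here the slack variables ξ_i encode the hinge loss, and the indicator constraints force w_j = 0 whenever z_j = 0. A perspective relaxation replaces such big-M indicator constraints with a tighter convex reformulation, which generally yields stronger continuous relaxations for the MINLP solver.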