DCA for Gaussian Kernel Support Vector Machines with Feature Selection
Abstract
We consider the support vector machine problem with feature selection using the Gaussian kernel function. This problem takes the form of a nonconvex minimization problem with binary variables. We investigate an exact penalty technique to handle the binary variables. The resulting optimization problem can be expressed as a DC (Difference of Convex functions) program, to which DCA (DC Algorithm) is applied. Numerical experiments on four real-world benchmark datasets show the efficiency of the proposed algorithm in terms of both feature selection and classification when compared with an existing algorithm.
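For readers unfamiliar with the framework, the following is a minimal sketch of the standard DCA iteration, stated as general background rather than this paper's specific formulation (the symbols $g$, $h$, $x^{k}$, and $y^{k}$ are generic notation, not taken from the abstract): given a DC program $\min_{x}\{ f(x) = g(x) - h(x) \}$ with $g$ and $h$ convex, each iteration linearizes $h$ at the current point and solves the resulting convex subproblem,
\[
y^{k} \in \partial h\big(x^{k}\big), \qquad
x^{k+1} \in \arg\min_{x} \left\{ g(x) - \langle x, y^{k} \rangle \right\}.
\]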