The standard 2-norm support vector machine (SVM) is known for its good performance in classification and regression problems. In this paper, the L∞-norm support vector machine is considered, and a novel smoothing-function method is proposed to overcome some drawbacks of earlier methods, which are complex, subtle, and sometimes difficult to implement. Based on the Karush-Kuhn-Tucker complementarity condition from optimization theory, an unconstrained non-differentiable optimization model is built and an approximate algorithm is presented. We take advantage of an approximate smoothing function, and a Newton-Armijo algorithm is given to solve the corresponding optimization problem using a difference-of-convex algorithm. The paper trains the data sets with a standard unconstrained optimization method. The algorithm is fast and insensitive to the initial point. Theoretical analysis and numerical results illustrate that the smoothing-function method for the L∞-norm SVM is feasible and effective.
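The abstract does not specify the smoothing function or the exact model, so the following is only a minimal sketch of the general technique it names: a smooth approximation of the non-differentiable plus function max(m, 0) (here the common entropy-type choice log(1 + exp(αm))/α), minimized by a Newton step damped with an Armijo backtracking line search. The 1-D toy objective, the data, and all parameter values (`C`, `alpha`) are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def smooth_plus(m, alpha):
    """Entropy-type smooth approximation of max(m, 0):
    p(m, alpha) = log(1 + exp(alpha*m)) / alpha -> max(m, 0) as alpha -> inf."""
    return np.logaddexp(0.0, alpha * m) / alpha

def sigmoid(z):
    # numerically stable logistic function; derivative of log(1 + e^z)
    return 0.5 * (1.0 + np.tanh(0.5 * z))

# Toy 1-D smoothed SVM-style objective (assumed for illustration):
#   f(w) = 0.5*w^2 + C * sum_i smooth_plus(1 - y_i*x_i*w, alpha)
rng = np.random.default_rng(0)
x = rng.normal(size=20)
y = np.sign(x)                      # labels consistent with a positive w
C, alpha = 1.0, 10.0

def f(w):
    return 0.5 * w * w + C * np.sum(smooth_plus(1.0 - y * x * w, alpha))

def grad(w):
    m = 1.0 - y * x * w
    return w - C * np.sum(sigmoid(alpha * m) * y * x)

def hess(w):
    m = 1.0 - y * x * w
    s = sigmoid(alpha * m)
    return 1.0 + C * alpha * np.sum(s * (1.0 - s) * x * x)

# Newton direction damped by an Armijo backtracking line search.
w = 0.0
for _ in range(30):
    g = grad(w)
    if abs(g) < 1e-10:
        break
    d = -g / hess(w)                # Newton step (1-D Hessian is a scalar)
    t, fw = 1.0, f(w)
    while f(w + t * d) > fw + 1e-4 * t * g * d:   # Armijo condition
        t *= 0.5
    w += t * d
```

Because the smoothed objective is twice differentiable and strongly convex here, the damped Newton iteration converges rapidly from any starting point, which matches the claimed insensitivity to the initial point.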