Fast nonparallel support vector machine with margin hyperplanes and its iterative solver
Graphical Abstract
Abstract
Nonparallel support vector machine (NPSVM) combines the advantages of SVM and twin SVM and excels at small-scale data classification. However, its inability to exploit the structural distribution of the samples limits its generalization capability, and it requires solving a pair of quadratic programming problems with inequality constraints, which reduces learning efficiency. To address these drawbacks, we propose a fast NPSVM model with margin hyperplanes (MH-fNPSVM), which introduces several key innovations. First, by replacing the inequality constraints with equality constraints, MH-fNPSVM reduces the optimization problem to solving a pair of linear systems, significantly improving computational efficiency. Second, MH-fNPSVM incorporates the margin distribution by optimizing the first- and second-order statistics of the training samples, improving generalization performance. Furthermore, MH-fNPSVM replaces the 1-norm of the slack variables with the 2-norm via a quadratic loss function, which overcomes the non-smoothness of the original NPSVM loss and enables the model to fit trends in the data distribution, enhancing robustness and generalization. Finally, an iterative conjugate gradient method is designed for MH-fNPSVM that avoids kernel matrix inversion, ensuring both accuracy and scalability. The model was validated on a variety of datasets and demonstrated excellent generalization performance and runtime compared with the baseline models.
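The conjugate gradient idea mentioned above can be illustrated with a minimal sketch: solving a regularized kernel system of the form (K + C·I)x = b using only matrix-vector products, so the kernel matrix is never inverted or factored. This is a generic CG solver on a hypothetical RBF Gram matrix, not the authors' exact MH-fNPSVM solver; the kernel, the regularization parameter C, and the right-hand side b are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A_mv, b, tol=1e-8, max_iter=200):
    """Solve A x = b for a symmetric positive-definite A, given only a
    matrix-vector product A_mv, so A is never inverted or factored."""
    x = np.zeros_like(b)
    r = b - A_mv(x)          # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A_mv(p)
        alpha = rs_old / (p @ Ap)   # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # converged: residual small enough
            break
        p = r + (rs_new / rs_old) * p  # new conjugate direction
        rs_old = rs_new
    return x

# Illustrative regularized kernel system (K + C*I) x = b
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
# RBF (Gaussian) Gram matrix -- an assumed example kernel
K = np.exp(-0.5 * np.sum((X[:, None] - X[None]) ** 2, axis=-1))
C = 1.0                      # assumed regularization parameter
b = rng.normal(size=50)
x = conjugate_gradient(lambda v: K @ v + C * v, b)
```

Because CG touches K only through products K @ v, the per-iteration cost is O(n^2) rather than the O(n^3) of explicit inversion, which is the scalability argument made in the abstract.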