NSVMOOP
Nonparallel support vector machine based on one optimization problem for pattern recognition. In this paper, we present a novel nonparallel support vector machine based on one optimization problem (NSVMOOP) for binary classification. Our NSVMOOP is formulated to separate the classes with the largest possible angle between the normal vectors of the two decision hyperplanes in the feature space, while implementing the structural risk minimization principle. Unlike other nonparallel classifiers, such as the representative twin support vector machine, it constructs the two nonparallel hyperplanes simultaneously by solving a single quadratic programming problem, for which a modified sequential minimal optimization (SMO) algorithm is explored. The NSVMOOP is analyzed theoretically and evaluated experimentally. Experimental results on both artificial and publicly available benchmark datasets show its feasibility and effectiveness.
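As a rough illustration only (this is not the NSVMOOP optimization problem or its SMO solver), the sketch below shows the decision rule shared by nonparallel-hyperplane classifiers of this family: each class has its own hyperplane w_k · x + b_k = 0, a new point is assigned to the class whose hyperplane is nearer, and the angle between the two normal vectors, the quantity NSVMOOP seeks to enlarge, is reported. The hyperplane parameters are hypothetical toy values, not learned by the method described above.

```python
# Minimal sketch of the nonparallel-hyperplane decision rule (illustrative only;
# the hyperplanes here are hand-picked toy values, not fitted by NSVMOOP).
import numpy as np

def classify(x, w1, b1, w2, b2):
    """Assign x to the class whose hyperplane is closer (perpendicular distance)."""
    d1 = abs(w1 @ x + b1) / np.linalg.norm(w1)
    d2 = abs(w2 @ x + b2) / np.linalg.norm(w2)
    return +1 if d1 <= d2 else -1

def normal_angle(w1, w2):
    """Angle in degrees between the two hyperplane normal vectors."""
    c = abs(w1 @ w2) / (np.linalg.norm(w1) * np.linalg.norm(w2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

if __name__ == "__main__":
    # Toy hyperplanes, one per class (hypothetical parameters).
    w1, b1 = np.array([1.0, 0.2]), -0.5   # hyperplane associated with class +1
    w2, b2 = np.array([0.3, 1.0]),  0.4   # hyperplane associated with class -1
    x = np.array([0.6, -0.1])
    print("predicted label:", classify(x, w1, b1, w2, b2))
    print("angle between normals: %.1f degrees" % normal_angle(w1, w2))
```

A larger angle between the two normals corresponds to more strongly nonparallel hyperplanes, which is the geometric quantity the NSVMOOP objective promotes.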
References in zbMATH (referenced in 4 articles, 1 standard article)
- Zhou, Jia-Bin; Bai, Yan-Qin; Guo, Yan-Ru; Lin, Hai-Xiang: Intuitionistic fuzzy Laplacian twin support vector machine for semi-supervised classification (2022)
- Gao, Qian-Qian; Bai, Yan-Qin; Zhan, Ya-Ru: Quadratic kernel-free least square twin support vector machine for binary classification problems (2019)
- Khemchandani, Reshma; Saigal, Pooja; Chandra, Suresh: Angle-based twin support vector machine (2018)
- Tian, Ying-Jie; Ju, Xu-Chan: Nonparallel support vector machine based on one optimization problem for pattern recognition (2015)