SSVM: A smooth support vector machine for classification. Smoothing methods, extensively used for solving important mathematical programming problems and applications, are applied here to generate and solve an unconstrained smooth reformulation of the support vector machine for pattern classification using a completely arbitrary kernel. We term such a reformulation a Smooth Support Vector Machine (SSVM). A fast Newton-Armijo algorithm for solving the SSVM converges globally and quadratically. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm. On six publicly available datasets, SSVM achieved the highest tenfold cross-validation correctness of the five methods compared, and was also the fastest. On larger problems, SSVM was comparable to or faster than SVM light [T. Joachims, in: Advances in kernel methods – support vector learning, MIT Press: Cambridge, MA (1999)], SOR [O. L. Mangasarian and D. R. Musicant, IEEE Trans. Neural Networks 10, 1032-1037 (1999)] and SMO [J. Platt, in: Advances in kernel methods – support vector learning, MIT Press: Cambridge, MA (1999)]. SSVM can also generate a highly nonlinear separating surface, such as a checkerboard.
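The smooth reformulation mentioned in the abstract can be illustrated with a short sketch. The standard SSVM device replaces the nondifferentiable plus function max(x, 0) in the SVM slack terms by the smooth approximation p(x, α) = x + (1/α) log(1 + e^{-αx}), which makes the problem unconstrained and twice differentiable so that a Newton-Armijo method applies. The code below is a minimal illustrative sketch, not the authors' implementation; the function names and the particular objective form shown are assumptions for exposition.

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    """Smooth approximation of max(x, 0).

    p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)),
    which converges to the plus function as alpha -> infinity.
    """
    return x + np.log1p(np.exp(-alpha * x)) / alpha

def ssvm_objective(w, gamma, A, d, nu=1.0, alpha=5.0):
    """Illustrative unconstrained smooth SVM objective (hypothetical helper):

        (nu/2) * ||p(e - D(A w - e*gamma), alpha)||^2
            + (1/2) * (||w||^2 + gamma^2),

    where rows of A are the data points, d holds the +/-1 labels
    (the diagonal of D), w is the normal of the separating plane
    x'w = gamma, and nu weighs misclassification error.
    """
    margins = 1.0 - d * (A @ w - gamma)   # slack variables before smoothing
    p = smooth_plus(margins, alpha)
    return 0.5 * nu * (p @ p) + 0.5 * (w @ w + gamma**2)
```

Because `smooth_plus` is strictly positive and infinitely differentiable, the objective has a well-defined gradient and Hessian everywhere, which is what allows the globally and quadratically convergent Newton-Armijo iteration described in the abstract.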

References in zbMATH (referenced in 64 articles, 1 standard article)

Showing results 1 to 20 of 64.
Sorted by year (citations)


  1. Gupta, Deepak; Richhariya, Bharat: Efficient implicit Lagrangian twin parametric insensitive support vector regression via unconstrained minimization problems (2021)
  2. Ni, Tie: Non-interior-point smoothing Newton method for CP revisited and its application to support vector machines (2020)
  3. Hien, Le Thi Khanh; Nguyen, Cuong V.; Xu, Huan; Lu, Canyi; Feng, Jiashi: Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization (2019)
  4. Ketabchi, Saeed; Moosaei, Hossein; Razzaghi, Mohamad; Pardalos, Panos M.: An improvement on parametric (\nu)-support vector algorithm for classification (2019)
  5. Wang, Yidan; Yang, Liming; Yuan, Chao: A robust outlier control framework for classification designed with family of homotopy loss function (2019)
  6. Yin, Juan; Li, Qingna: A semismooth Newton method for support vector classification and regression (2019)
  7. Gu, Weizhe; Chen, Wei-Po; Ko, Chun-Hsu; Lee, Yuh-Jye; Chen, Jein-Shan: Two smooth support vector machines for (\varepsilon)-insensitive regression (2018)
  8. Khemchandani, Reshma; Saigal, Pooja; Chandra, Suresh: Angle-based twin support vector machine (2018)
  9. Piccialli, Veronica; Sciandrone, Marco: Nonlinear optimization and support vector machines (2018)
  10. Wang, Zhen; Shao, Yuan-Hai; Bai, Lan; Li, Chun-Na; Liu, Li-Ming; Deng, Nai-Yang: Insensitive stochastic gradient twin support vector machines for large scale problems (2018)
  11. Tanveer, M.: Linear programming twin support vector regression (2017)
  12. Tanveer, M.; Shubham, K.: Smooth twin support vector machines via unconstrained convex minimization (2017)
  13. Feng, Yunlong; Yang, Yuning; Huang, Xiaolin; Mehrkanoon, Siamak; Suykens, Johan A. K.: Robust support vector machines for classification with nonconvex and smooth losses (2016)
  14. Grigor’eva, Xeniya Vladimirovna: Approximate functions in a problem of sets separation (2016)
  15. Ni, Tie; Zhai, Jun: A matrix-free smoothing algorithm for large-scale support vector machines (2016)
  16. Yang, Xiaowei; Han, Le; Li, Yan; He, Lifang: A bilateral-truncated-loss based robust support vector machine for classification problems (2015)
  17. Balasundaram, S.; Gupta, Deepak; Kapil: Lagrangian support vector regression via unconstrained convex minimization (2014)
  18. Ferraro, Maria Brigida; Guarracino, Mario Rosario: From separating to proximal plane classifiers: a review (2014)
  19. Shabanzadeh, Parvaneh; Yusof, Rubiyah: A new method for solving supervised data classification problems (2014)
  20. Carrizosa, Emilio; Romero Morales, Dolores: Supervised classification and mathematical optimization (2013)
