Research article

Adaptive robust AdaBoost-based kernel-free quadratic surface support vector machine with Universum data

  • Published: 07 April 2025
  • MSC: 68T10, 91C20

  • In this paper, we proposed a novel binary classification framework named the adaptive robust AdaBoost-based kernel-free quadratic surface support vector machine with Universum data (A-R-U-SQSSVM). First, we developed R-U-SQSSVM by integrating the capped $ L_{2, p} $-norm distance metric and the generalized Welsch adaptive loss function to improve the model's robustness and adaptability. Furthermore, we introduced Universum data points into R-U-SQSSVM, incorporating valuable prior knowledge into the classifier to enhance its generalization performance. Additionally, we used R-U-SQSSVM as a weak classifier and embedded it within the AdaBoost algorithm to obtain a strong classifier, A-R-U-SQSSVM. To solve the model efficiently, we transformed it into a quadratic programming problem using the half-quadratic (HQ) optimization algorithm and concave duality; the transformed problem can be solved with convex optimization methods such as the sequential minimal optimization (SMO) algorithm. Experimental results on University of California, Irvine (UCI) datasets demonstrated the superior classification performance of our method. On large datasets, A-R-U-SQSSVM was hundreds or even a thousand times faster than the traditional capped twin support vector machine (CTSVM) and SQSSVM.
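    For orientation, the display below records commonly used forms of the two robustness devices named in the abstract, together with the kernel-free quadratic surface decision function; the exponent $ p $, cap $ \varepsilon $, and scale $ \sigma $ are illustrative symbols, and the paper's exact parameterizations may differ.

    \[ f(\mathbf{x}) = \tfrac{1}{2}\,\mathbf{x}^{\top} W \mathbf{x} + \mathbf{b}^{\top}\mathbf{x} + c, \qquad W = W^{\top}, \]
    \[ d_{2,p}^{\varepsilon}(\mathbf{x}) = \min\bigl(\lVert \mathbf{x} \rVert_2^{\,p},\ \varepsilon\bigr), \qquad L_{\sigma}(u) = \frac{\sigma^{2}}{2}\Bigl(1 - e^{-u^{2}/\sigma^{2}}\Bigr). \]

    Both devices bound an outlier's contribution: the cap truncates the distance penalty at $ \varepsilon $, while the Welsch loss saturates at $ \sigma^{2}/2 $; this bounded, nonconvex structure is what the HQ algorithm and concave duality are used to untangle into a tractable quadratic program.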

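    To make the boosting stage concrete, the sketch below shows discrete AdaBoost around a stand-in weak learner, together with the common pair-averaging construction of Universum points; the SVC stand-in, the helper names, and the parameter defaults are illustrative assumptions, not the authors' implementation (R-U-SQSSVM has no public code).

```python
# Minimal sketch of the boosting stage, assuming NumPy arrays and binary
# labels in {-1, +1}. R-U-SQSSVM has no public implementation, so a linear
# SVC from scikit-learn stands in as the weak learner purely for illustration.
import numpy as np
from sklearn.svm import SVC

def make_universum(X_pos, X_neg, n_u, seed=None):
    """Pair-averaging Universum construction: midpoints of random
    opposite-class pairs; the full model would feed these unlabeled
    points into its Universum terms."""
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(X_pos), n_u)
    j = rng.integers(0, len(X_neg), n_u)
    return 0.5 * (X_pos[i] + X_neg[j])

def adaboost_fit(X, y, n_rounds=10):
    """Train n_rounds reweighted weak learners and their voting weights."""
    y = np.asarray(y)
    w = np.full(len(y), 1.0 / len(y))            # uniform initial weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        clf = SVC(kernel="linear").fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)  # voting weight of this round
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified points
        w /= w.sum()                             # renormalize the distribution
        learners.append(clf)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(X, learners, alphas):
    """Sign of the alpha-weighted vote of all weak learners."""
    return np.sign(sum(a * clf.predict(X) for a, clf in zip(alphas, learners)))
```

    In each round, AdaBoost up-weights the points the current weak learner misclassifies, so subsequent learners concentrate on the hard cases; the paper's strong classifier follows this pattern with R-U-SQSSVM, trained with the robust losses above, in place of the SVC.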
    Citation: Bao Ma, Yanrong Ma, Jun Ma. Adaptive robust AdaBoost-based kernel-free quadratic surface support vector machine with Universum data. AIMS Mathematics, 2025, 10(4): 8036-8065. doi: 10.3934/math.2025369



  • © 2025 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)