This paper introduces a novel support vector regression framework with an orthogonal constraint (OC-SVR) for outlier suppression. The proposed method detects outliers via angular deviations in the feature space and modifies the support vector regression (SVR) optimization problem by enforcing orthogonality between the regression function and the detected outliers' feature directions, thereby neutralizing their influence. Experimental evaluations on benchmark datasets demonstrate that OC-SVR achieves significantly improved robustness and predictive accuracy compared to standard SVR and robust SVR variants.
Citation: Felix Ndudim, Thanasak Mouktonglang. Support vector regression with orthogonal constraints for outlier suppression[J]. AIMS Mathematics, 2026, 11(1): 462-482. doi: 10.3934/math.2026020
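To make the construction in the abstract concrete, the following is a minimal sketch of how an angular outlier test and an orthogonality constraint can be grafted onto the standard $\varepsilon$-SVR primal. It is a hedged illustration only, not the paper's exact formulation: the index set $\mathcal{O}$, the reference direction $c$, and the threshold $\tau$ are our notational assumptions.

```latex
% Hedged sketch (not the paper's exact formulation): an angular
% outlier test followed by an epsilon-SVR primal augmented with
% orthogonality constraints on the flagged feature directions.
% The set O, reference direction c, and threshold tau are
% illustrative assumptions introduced here for exposition.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Flag point $i$ as an outlier when its feature-space angle to a
reference direction $c$ (e.g., the feature-space mean) is too large:
\[
  \cos\theta_i
    = \frac{\langle \phi(x_i),\, c\rangle}
           {\lVert \phi(x_i)\rVert\,\lVert c\rVert},
  \qquad
  \mathcal{O} = \{\, i : \cos\theta_i < \tau \,\}.
\]
Then solve the $\varepsilon$-SVR primal with the flagged directions
forced orthogonal to $w$:
\begin{align*}
  \min_{w,\,b,\,\xi,\,\xi^*}\quad
    & \tfrac{1}{2}\lVert w\rVert^2
      + C \sum_{i \notin \mathcal{O}} \bigl(\xi_i + \xi_i^*\bigr) \\
  \text{s.t.}\quad
    & y_i - \langle w, \phi(x_i)\rangle - b \le \varepsilon + \xi_i,
      \quad i \notin \mathcal{O}, \\
    & \langle w, \phi(x_i)\rangle + b - y_i \le \varepsilon + \xi_i^*,
      \quad i \notin \mathcal{O}, \\
    & \xi_i,\ \xi_i^* \ge 0, \quad i \notin \mathcal{O}, \\
    & \langle w, \phi(x_j)\rangle = 0, \quad j \in \mathcal{O}.
\end{align*}
\end{document}
```

Under this reading, the equality constraints force the fitted function to satisfy $f(x_j) = b$ on every flagged point, so those points exert no pull on $w$; this is one way to realize the "neutralizing" effect the abstract describes.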