Research article

A single-projection proximal algorithm for stochastic mixed variational inequalities with applications to breast cancer screening

  • Published: 17 April 2026
  • MSC: 47H25, 54E70, 65K15, 90C33, 90C15

  • We studied a stochastic mixed variational inequality problem (SMVIP) that encompasses stochastic optimization, stochastic variational inequality problems, and composite convex minimization as special cases. To solve this problem, we proposed a single-projection proximal algorithm (SiPPA) that combines golden ratio dynamics with an adaptive stepsize strategy. In contrast to classical stochastic extragradient and subgradient extragradient methods, the proposed algorithm requires only one projection and one averaged stochastic oracle call per iteration, which reduces the computational cost. Under mild assumptions on the stochastic oracle and monotonicity of the expected operator, we established almost sure convergence of the generated sequence. Moreover, when the operator is strongly monotone, we proved that the algorithm converges at an $R$-linear rate. Numerical experiments on benchmark problems and real-world learning tasks in breast cancer screening illustrate the effectiveness and efficiency of the proposed approach relative to existing stochastic methods.

    Citation: Mohammad Dilshad, Ibrahim Al-Dayel, Francis O. Nwawuru, Praveen Agarwal. A single-projection proximal algorithm for stochastic mixed variational inequalities with applications to breast cancer screening. AIMS Mathematics, 2026, 11(4): 10533-10565. doi: 10.3934/math.2026434
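    To make the per-iteration structure described in the abstract concrete: an SMVIP asks for a point $x^*$ satisfying $\langle F(x^*), x - x^* \rangle + g(x) - g(x^*) \geq 0$ for all $x$, where $F(x) = \mathbb{E}_{\xi}[F(x, \xi)]$ is accessible only through a stochastic oracle and $g$ is a proper convex function with an inexpensive proximal map. The sketch below is our illustration of one golden-ratio, single-call proximal iteration in the spirit of the abstract, not the authors' exact SiPPA; in particular, the paper's adaptive stepsize rule is replaced by a fixed stepsize `lam`, and the names `oracle`, `prox_g`, and the batch size are assumptions introduced for the example.

        import numpy as np

        PHI = (1 + np.sqrt(5)) / 2  # golden ratio

        def sippa_like_step(x, x_bar, lam, oracle, prox_g, batch=16):
            """One iteration: one averaged oracle call and one prox evaluation."""
            # Averaged stochastic oracle call: F_hat approximates E[F(x, xi)].
            F_hat = np.mean([oracle(x) for _ in range(batch)], axis=0)
            # Golden-ratio anchor: convex combination of iterate and old anchor.
            x_bar = ((PHI - 1.0) * x + x_bar) / PHI
            # Single proximal step (a metric projection when g is an indicator).
            x_new = prox_g(x_bar - lam * F_hat)
            return x_new, x_bar

        # Toy usage: F(x, xi) = x + b + noise (strongly monotone in expectation),
        # g = indicator of the box [-1, 1]^n, so prox_g is a clipping projection.
        rng = np.random.default_rng(0)
        n = 5
        b = rng.standard_normal(n)
        oracle = lambda x: x + b + 0.1 * rng.standard_normal(n)
        prox_g = lambda z: np.clip(z, -1.0, 1.0)

        x = x_bar = np.zeros(n)
        for _ in range(500):
            x, x_bar = sippa_like_step(x, x_bar, lam=0.5,
                                       oracle=oracle, prox_g=prox_g)
        print("approximate solution:", np.round(x, 3))  # about clip(-b, -1, 1)

    When $g$ is the indicator function of a closed convex set, `prox_g` reduces to the projection onto that set, which recovers the "one projection and one averaged oracle call per iteration" structure highlighted in the abstract; extragradient-type methods would instead require two oracle calls and two projections per iteration.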

  • © 2026 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
