Research article | Special Issues

A structured RMIL conjugate gradient-based strategy for nonlinear least squares with applications in image restoration problems

  • Published: 30 June 2025
  • MSC : 65K05, 90C52, 90C53, 90C56

  • Nonlinear least squares (NLS) problems arise in numerous scientific and engineering domains. Conventional methods for solving them often suffer from computational inefficiency and high memory requirements, particularly when applied to large-scale systems. This paper presents a structured conjugate gradient coefficient based on a structured secant-like approximation for solving NLS problems. The approach constructs a structured vector approximation that captures the vector-matrix relationship through Taylor series expansions of the Hessian of the objective function. Because this approximation satisfies a quasi-Newton condition, additional Hessian information can be incorporated into the classical search direction. A global convergence analysis is carried out under standard assumptions. Numerical experiments on benchmark NLS problems compare the method's performance with alternative approaches, and the results demonstrate its effectiveness. Finally, the proposed technique is applied to image restoration problems.
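
For context, a minimal formulation sketch, assuming the standard NLS setting and the classical RMIL coefficient of Rivaie et al.; the structured coefficient proposed in the paper modifies this with the secant-like Hessian approximation described above, whose exact form is not given in the abstract. The problem and its derivatives are

\[
\min_{x \in \mathbb{R}^n} f(x) = \tfrac{1}{2}\,\|r(x)\|^2, \qquad
\nabla f(x) = J(x)^{\top} r(x), \qquad
\nabla^2 f(x) = J(x)^{\top} J(x) + \sum_{i=1}^{m} r_i(x)\,\nabla^2 r_i(x),
\]

where \(r : \mathbb{R}^n \to \mathbb{R}^m\) is the residual map and \(J\) its Jacobian; the summation is the structured second-order term that structured quasi-Newton and conjugate gradient methods seek to approximate cheaply. A generic conjugate gradient iteration with the classical RMIL coefficient reads

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0,\\[2pt]
-g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\qquad
\beta_k^{\mathrm{RMIL}} = \frac{g_k^{\top}\,(g_k - g_{k-1})}{\|d_{k-1}\|^{2}},
\]

with \(g_k = \nabla f(x_k)\) and a line-search step size \(\alpha_k\); the paper replaces \(\beta_k^{\mathrm{RMIL}}\) with a structured variant that injects the approximated second-order information into \(d_k\).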

    Citation: Rabiu Bashir Yunus, Ahmed R. El-Saeed, Nooraini Zainuddin, Hanita Daud. A structured RMIL conjugate gradient-based strategy for nonlinear least squares with applications in image restoration problems[J]. AIMS Mathematics, 2025, 10(6): 14893-14916. doi: 10.3934/math.2025668


  • © 2025 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)