Research article

A matrix-free DFP-like optimization method for problems arising from compressive sensing

  • Published: 02 July 2025
  • This paper introduces a matrix-free variant of the Davidon-Fletcher-Powell (DFP) method for unconstrained optimization problems, with applications in compressive sensing and image restoration. The main contribution is a new search direction incorporating a scaling parameter that guarantees satisfaction of the sufficient descent condition independently of the line search conditions. A rigorous convergence analysis establishes the boundedness and theoretical validity of the proposed method. Comprehensive numerical experiments on benchmark unconstrained optimization test problems and compressive sensing problems demonstrate the efficiency and robustness of the algorithm. In particular, on image restoration tasks the method outperforms CG-DESCENT, MDL, and NSMA, achieving a 100% success rate versus 95.8%, 84.5%, and 53.5%, respectively. Results on computational time, relative error, and PSNR further confirm the superior performance of the proposed approach. These findings establish the method as a competitive alternative for large-scale optimization problems.
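    The paper's specific search direction and scaling parameter are defined in the article body and are not reproduced on this page. As a rough, matrix-free illustration of the general idea only, a standard memoryless DFP direction (initial Hessian approximation taken as the identity, so the update needs only inner products and no stored matrices) can be sketched as below; the function name and the curvature safeguard are illustrative choices, not the paper's formulation.

    ```python
    import numpy as np

    def dfp_like_direction(g, s, y, eps=1e-12):
        """Memoryless DFP search direction d = -H g, where
        H = I - (y y^T)/(y^T y) + (s s^T)/(s^T y),
        s = x_k - x_{k-1} and y = g_k - g_{k-1}.
        Matrix-free: H is never formed; only inner products are used."""
        sy = s @ y
        if sy <= eps:               # curvature safeguard: fall back to steepest descent
            return -g
        return -(g - (y @ g) / (y @ y) * y + (s @ g) / sy * s)
    ```

    When s^T y > 0, the Cauchy-Schwarz inequality gives g^T d = -(||g||^2 - (y^T g)^2/||y||^2) - (s^T g)^2/(s^T y) <= 0, i.e., a descent direction; the scaling parameter proposed in the paper additionally enforces the stronger sufficient descent bound g^T d <= -c ||g||^2 independently of the line search.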

    Citation: Aliyu Muhammad Awwal, Sulaiman M. Ibrahim, Issam A. R. Moghrabi, Mahmoud M. Yahaya, Aishatu I. Ahmad, Semiu Oladipupo Oladejo, Karimov Javlon Kuzievich, Aceng Sambas. A matrix-free DFP-like optimization method for problems arising from compressive sensing[J]. Electronic Research Archive, 2025, 33(7): 4091-4118. doi: 10.3934/era.2025183
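    For context, the compressive sensing task referenced in the abstract is typically modeled as l1-regularized least squares, min_x 0.5*||Ax - b||^2 + lambda*||x||_1, with reconstruction quality reported via relative error and PSNR. The sketch below uses the classical ISTA iteration rather than the paper's DFP-like solver, purely to make the problem and the two metrics concrete; all names and parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def soft_threshold(v, t):
        """Proximal operator of t*||.||_1 (componentwise soft thresholding)."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(A, b, lam, iters=500):
        """ISTA for the sparse recovery model min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
        L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the smooth part
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
        return x

    def relative_error(x_true, x_rec):
        """||x_rec - x_true|| / ||x_true||, as reported in recovery experiments."""
        return np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)

    def psnr(x_true, x_rec, peak=1.0):
        """Peak signal-to-noise ratio (dB), the image restoration quality metric."""
        mse = np.mean((x_true - x_rec) ** 2)
        return 10.0 * np.log10(peak ** 2 / mse)
    ```

    On a random Gaussian sensing matrix with a few nonzero entries in the true signal, this iteration recovers the sparse vector to small relative error; the paper's experiments compare its DFP-like method against conjugate-gradient-type solvers on the same kind of model.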

  • © 2025 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)

Figures(16)  /  Tables(6)

/

DownLoad:  Full-Size Img  PowerPoint
Return
Return

Catalog