Research article

Randomized symmetric Gauss-Seidel method for solving linear least squares problems

  • Received: 30 March 2024 · Revised: 08 May 2024 · Accepted: 14 May 2024 · Published: 23 May 2024
  • MSC: 65F10

  • We introduce a randomized symmetric Gauss-Seidel (RSGS) method designed to handle large-scale linear least squares problems with tall coefficient matrices. At each iteration, RSGS projects the approximate residual onto the subspace spanned by two symmetrically paired columns of the coefficient matrix, sampled according to an effective probability criterion. Our theoretical analysis shows that RSGS converges when the coefficient matrix has full column rank, and numerical experiments demonstrate that RSGS outperforms the baseline algorithms in both iteration count and CPU time.
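
The following is a minimal, illustrative Python sketch of the kind of two-column randomized Gauss-Seidel (coordinate descent) update the abstract describes: at each step two columns of the coefficient matrix are sampled and the current residual is projected onto their span. The function name two_column_gauss_seidel, the norm-proportional sampling probabilities, and the fixed iteration count are assumptions made for illustration only; the paper's actual symmetric column pairing and probability criterion are not reproduced here.

import numpy as np

def two_column_gauss_seidel(A, b, num_iters=5000, seed=0):
    """Sketch: repeatedly project the residual onto the span of two sampled columns."""
    rng = np.random.default_rng(seed)
    _, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                          # residual of the current iterate
    col_norms2 = np.sum(A * A, axis=0)
    probs = col_norms2 / col_norms2.sum()  # norm-proportional sampling (an assumption)
    for _ in range(num_iters):
        j, k = rng.choice(n, size=2, replace=False, p=probs)
        S = A[:, [j, k]]
        # solve the 2-variable least squares problem  min_d ||r - S d||_2
        d, *_ = np.linalg.lstsq(S, r, rcond=None)
        x[[j, k]] += d
        r -= S @ d                         # keep r = b - A x consistent with x
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((2000, 50))    # tall matrix, full column rank almost surely
    b = rng.standard_normal(2000)
    x = two_column_gauss_seidel(A, b)
    x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("error vs. direct least squares:", np.linalg.norm(x - x_ls))

Because each step minimizes the residual over a two-column subspace, the residual norm is non-increasing, and when the coefficient matrix has full column rank every two-variable subproblem has a unique solution; this is consistent with, but does not replace, the convergence analysis claimed in the abstract.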

    Citation: Fan Sha, Jianbing Zhang. Randomized symmetric Gauss-Seidel method for solving linear least squares problems[J]. AIMS Mathematics, 2024, 9(7): 17453-17463. doi: 10.3934/math.2024848

  • © 2024 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)