The paper surveys, classifies, and investigates, both theoretically and numerically, the main classes of line search methods for unconstrained optimization. Quasi-Newton (QN) and conjugate gradient (CG) methods are considered as representative classes of effective numerical methods for solving large-scale unconstrained optimization problems. We investigate, classify, and compare the main QN and CG methods to give a global overview of scientific advances in this field, and we present some of its most recent trends. A set of numerical experiments is performed with the aim of giving an experimental basis for comparing the numerical performance of the different QN and CG methods against one another.
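To fix ideas, a minimal sketch of one of the surveyed method classes follows: a nonlinear CG iteration with the Fletcher–Reeves coefficient and an Armijo backtracking line search. This is an illustrative implementation under common textbook conventions, not the specific algorithms compared in the paper; the function names, default parameters, and the steepest-descent restart safeguard are choices made for this example.

```python
import numpy as np

def backtracking(f, gk, xk, pk, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink alpha until sufficient decrease holds."""
    fk = f(xk)
    while f(xk + alpha * pk) > fk + c * alpha * (gk @ pk):
        alpha *= rho
    return alpha

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=1000):
    """Nonlinear CG with the Fletcher-Reeves beta and Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    p = -g                                 # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, g, x, p)
        x = x + alpha * p
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        p = -g_new + beta * p
        if g_new @ p >= 0:                 # safeguard: restart with steepest descent
            p = -g_new
        g = g_new
    return x

# Usage: minimize the convex quadratic f(x) = (1/2) x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = fletcher_reeves(f, grad, np.zeros(2))
```

On a strictly convex quadratic like this one, the computed `x_star` agrees with the exact solution of `A x = b` to the gradient tolerance; on general nonlinear objectives the restart safeguard keeps every search direction a descent direction, which is what the backtracking loop requires to terminate.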
Citation: Predrag S. Stanimirović, Branislav Ivanov, Haifeng Ma, Dijana Mosić. A survey of gradient methods for solving nonlinear optimization[J]. Electronic Research Archive, 2020, 28(4): 1573-1624. doi: 10.3934/era.2020115