[1] Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)
[2] Fletcher, R.: Practical Methods of Optimization. Wiley, Hoboken (2013)
[3] Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
[4] Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
[5] Liu, Y.L., Storey, C.S.: Efficient generalized conjugate gradient algorithms. Part 1: theory. J. Optim. Theory Appl. 69(1), 129–137 (1991)
[6] Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)
[7] Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
[8] Dai, Y.H.: Conjugate gradient methods with Armijo-type line searches. Acta Math. Appl. Sin. (English Series) 18(1), 123–130 (2002)
[9] Dai, Y.H., Liu, X.W.: Advances in linear and nonlinear programming. Oper. Res. Trans. (Chin. Ser.) 18(1), 69–92 (2014)
[10] Perry, A.: A modified conjugate gradient algorithm. Oper. Res. 26(6), 1073–1078 (1978)
[11] Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
[12] Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102(1), 147–167 (1999)
[13] Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202(2), 523–539 (2007)
[14] Ford, J.A., Narushima, Y., Yabe, H.: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40(2), 191–216 (2008)
[15] Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)
[16] Babaie-Kafaki, S., Ghanbari, R.: A descent family of Dai–Liao conjugate gradient methods. Optim. Methods Softw. 29(3), 583–591 (2014)
[17] Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)
[18] Babaie-Kafaki, S., Ghanbari, R.: Two modified three-term conjugate gradient methods with sufficient descent property. Optim. Lett. 8(8), 2285–2297 (2014)
[19] Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
[20] Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
[21] Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. ZAMM J. Appl. Math. Mech. 75(1), 69–77 (1995)
[22] Dai, Y.H., Kou, C.X.: A Barzilai–Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)
[23] Kou, C.X.: An improved nonlinear conjugate gradient method with an optimal property. Sci. China Math. 57(3), 635–648 (2014)
[24] Kou, C.X., Dai, Y.H.: A modified self-scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for unconstrained optimization. J. Optim. Theory Appl. 165(1), 209–224 (2015)
[25] Dong, X., Liu, H., He, Y.: A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 165(1), 225–241 (2015)
[26] Dong, X., Han, D., Dai, Z., Li, X., Zhu, J.: An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 179(3), 944–961 (2018)
[27] Dong, X., Liu, H., He, Y.: A modified Hestenes–Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition. J. Comput. Appl. Math. 281, 239–249 (2015)
[28] Zhang, L., Zhou, W.J., Li, D.H.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)
[29] Cheng, W.Y., Li, D.H.: An active set modified Polak–Ribière–Polyak method for large-scale nonlinear bound constrained optimization. J. Optim. Theory Appl. 155(3), 1084–1094 (2012)
[30] Narushima, Y., Yabe, H., Ford, J.A.: A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21(1), 212–230 (2011)
[31] Andrei, N.: A simple three-term conjugate gradient algorithm for unconstrained optimization. J. Comput. Appl. Math. 241, 19–29 (2013)
[32] Andrei, N.: On three-term conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 219(11), 6316–6327 (2013)
[33] Andrei, N.: Another conjugate gradient algorithm with guaranteed descent and the conjugacy conditions for large-scale unconstrained optimization. J. Optim. Theory Appl. 159(3), 159–182 (2013)
[34] Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
[35] Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)