[1] Loosli, G., Canu, S.: chap. 5, pp. 111-135. John Wiley & Sons, Ltd (2010)
[2] Bottou, L., Curtis, F.E., Nocedal, J.: Optimization methods for large-scale machine learning. SIAM Rev. 60(2), 223-311 (2018)
[3] Sun, C., Wang, Y.: Gravity-magnetic cross-gradient joint inversion by the cyclic gradient method. Optim. Methods Softw. 35(5), 982-1001 (2020)
[4] Ramillien, G., Frappart, F., Gratton, S., Vasseur, X.: Sequential estimation of surface water mass changes from daily satellite gravimetry data. J. Geod. 89(3), 259-282 (2015)
[5] Larnier, S., Fehrenbach, J., Masmoudi, M.: The topological gradient method: from optimal design to image processing. Milan J. Math. 80(2), 411-441 (2012)
[6] De Asmundis, R., Di Serafino, D., Hager, W.W., Toraldo, G., Zhang, H.: An efficient gradient method using the Yuan steplength. Comput. Optim. Appl. 59(3), 541-563 (2014)
[7] Huang, Y., Liu, H., Zhou, S.: Quadratic regularization projected Barzilai-Borwein method for nonnegative matrix factorization. Data Min. Knowl. Disc. 29(6), 1665-1684 (2015)
[8] Jiang, B., Dai, Y.: Feasible Barzilai-Borwein-like methods for extreme symmetric eigenvalue problems. Optim. Methods Softw. 28(4), 756-784 (2013)
[9] Cauchy, A.: Méthode générale pour la résolution des systèmes d'équations simultanées. C. R. Acad. Sci. Paris 25, 536-538 (1847)
[10] Akaike, H.: On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method. Ann. Inst. Stat. Math. Tokyo 11, 1-16 (1959)
[11] Barzilai, J., Borwein, J.: Two-point step size gradient methods. IMA J. Numer. Anal. 8(1), 141-148 (1988)
[12] Dai, Y., Yuan, Y.: Alternate minimization gradient method. IMA J. Numer. Anal. 23(3), 377-393 (2003)
[13] Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13(3), 321-326 (1993)
[14] Dai, Y., Liao, L.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22(1), 1-10 (2002)
[15] Dai, Y.: A new analysis on the Barzilai-Borwein gradient method. J. Oper. Res. Soc. China 1(2), 187-198 (2013)
[16] Dai, Y.H., Hager, W.W., Schittkowski, K., Zhang, H.C.: The cyclic Barzilai-Borwein method for unconstrained optimization. IMA J. Numer. Anal. 26(3), 604-627 (2006)
[17] Yuan, Y.: A new stepsize for the steepest descent method. J. Comput. Math. 24(2), 149-156 (2006)
[18] Dai, Y., Yuan, Y.: Analysis of monotone gradient methods. J. Ind. Manag. Optim. 1(2), 181-192 (2005)
[19] Dai, Y.: Alternate step gradient method. Optimization 52(4-5), 395-415 (2003)
[20] Zhou, B., Gao, L.: Gradient methods with adaptive step-sizes. Comput. Optim. Appl. 35(1), 69-86 (2006)
[21] Frassoldati, G., Zanni, L., Zanghirati, G.: New adaptive stepsize selections in gradient methods. J. Ind. Manag. Optim. 4(2), 299-312 (2008)
[22] Fletcher, R.: A limited memory steepest descent method. Math. Program. 135, 413-436 (2012)
[23] Gu, R., Du, Q.: A modified limited memory steepest descent method motivated by an inexact superlinear convergence rate analysis. IMA J. Numer. Anal. 41, 247-270 (2020)
[24] De Asmundis, R., di Serafino, D., Hager, W., Zhang, H.: An efficient gradient method using the Yuan steplength. Comput. Optim. Appl. 59(3), 541-563 (2014)
[25] Sun, C., Liu, J.: New stepsizes for the gradient method. Optim. Lett. 14(4-5) (2020)
[26] Huang, Y., Dai, Y., Liu, X., Zhang, H.: Gradient methods exploiting spectral properties. Optim. Methods Softw. 35(4), 1-25 (2020)
[27] Huang, Y., Liu, H.: Smoothing projected Barzilai-Borwein method for constrained non-Lipschitz optimization. Comput. Optim. Appl. 65(3), 671-698 (2016)
[28] Huang, Y., Dai, Y., Liu, X.: Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property. SIAM J. Optim. 31(4), 3068-3096 (2021)
[29] Liu, Z., Liu, H., Dong, X.: An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem. Optimization 67(3), 427-440 (2018)
[30] Liu, Z., Liu, H.: An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization. J. Optim. Theory Appl. 181(2), 608-633 (2019)
[31] Yuan, Y.: Step-sizes for the gradient method. AMS/IP Stud. Adv. Math. 42(2), 785-796 (2008)
[32] De Asmundis, R., di Serafino, D., Riccio, F., Toraldo, G.: On spectral properties of steepest descent methods. IMA J. Numer. Anal. 33(4), 1416-1435 (2013)
[33] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton's method. SIAM J. Numer. Anal. 23(4), 707-716 (1986)
[34] Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147-161 (2008)
[35] Huang, Y., Liu, H.: On the rate of convergence of projected Barzilai-Borwein methods. Optim. Methods Softw. 30(4), 880-892 (2015)
[36] Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7(1), 26-33 (1997)
[37] Toint, P.: Some numerical results using a sparse matrix updating formula in unconstrained optimization. Math. Comput. 32(143), 839-851 (1978)
[38] Moré, J., Garbow, B., Hillstrom, K.: Testing unconstrained optimization software. ACM Trans. Math. Softw. 7(1), 17-41 (1981)
[39] Bongartz, I., Conn, A., Gould, N., Toint, P.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21(1), 123-160 (1995)
[40] Tibshirani, R.J.: The lasso problem and uniqueness. Electron. J. Stat. 7(1), 1456-1490 (2013)
[41] di Serafino, D., Ruggiero, V., Toraldo, G., Zanni, L.: On the steplength selection in gradient methods for unconstrained optimization. Appl. Math. Comput. 318, 176-195 (2018)
[42] Dai, Y., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177-182 (1999)