[1] Bilmes, J.A.: Factored sparse inverse covariance matrices. In: IEEE International Conference on Acoustics, Speech and Signal Processing 2, 1009-1012 (2000)
[2] Yu, H., Dauwels, J., Wang, X.: Copula Gaussian graphical models with hidden variables. In: IEEE International Conference on Acoustics, Speech and Signal Processing 22(10), 2177-2180 (2012)
[3] Dobra, A., Hans, C., Jones, B., Nevins, J.R., Yao, G., West, M.: Sparse graphical models for exploring gene expression data. J. Multivar. Anal. 90(1), 196-212 (2004)
[4] Krämer, N., Schäfer, J., Boulesteix, A.L.: Regularized estimation of large-scale gene association networks using graphical Gaussian models. BMC Bioinf. 10(1), 1-24 (2009)
[5] Kolar, M., Song, L., Ahmed, A., Xing, E.P.: Estimating time-varying networks. Ann. Appl. Stat. 4(1), 94-123 (2010)
[6] Dempster, A.P.: Covariance selection. Biometrics 28(1), 157-175 (1972)
[7] Meinshausen, N., Bühlmann, P.: High-dimensional graphs and variable selection with the Lasso. Ann. Stat. 34(3), 1436-1462 (2006)
[8] Li, H., Gui, J.: Gradient directed regularization for sparse Gaussian concentration graphs, with applications to inference of genetic networks. Biostatistics 7(2), 302-317 (2006)
[9] Huang, J.Z., Liu, N., Pourahmadi, M., Liu, L.: Covariance matrix selection and estimation via penalised normal likelihood. Biometrika 93(1), 85-98 (2006)
[10] Dahl, J., Vandenberghe, L., Roychowdhury, V.: Covariance selection for nonchordal graphs via chordal embedding. Optim. Methods Softw. 23(4), 501-520 (2008)
[11] Banerjee, O., El Ghaoui, L., D'Aspremont, A.: Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data. J. Mach. Learn. Res. 9(3), 485-516 (2008)
[12] Yuan, M., Lin, Y.: Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. Ser. B 68(1), 49-67 (2006)
[13] Mazumder, R., Hastie, T.: The graphical lasso: new insights and alternatives. Electron. J. Stat. 6, 2125-2149 (2012)
[14] Friedman, J., Hastie, T., Tibshirani, R.: Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9(3), 432-441 (2008)
[15] Witten, D.M., Friedman, J.H., Simon, N.: New insights and faster computations for the graphical lasso. J. Comput. Gr. Stat. 20(4), 892-900 (2011)
[16] Ravikumar, P., Wainwright, M.J., Raskutti, G., Yu, B.: High-dimensional covariance estimation by minimizing ℓ1-penalized log-determinant divergence. Electron. J. Stat. 5, 935-980 (2011)
[17] Lu, Z.: Adaptive first-order methods for general sparse inverse covariance selection. SIAM J. Matrix Anal. Appl. 31(4), 2000-2016 (2010)
[18] D'Aspremont, A., Banerjee, O., El Ghaoui, L.: First-order methods for sparse covariance selection. SIAM J. Matrix Anal. Appl. 30(1), 56-66 (2008)
[19] Lu, Z.: Smooth optimization approach for sparse covariance selection. SIAM J. Optim. 19(4), 1807-1827 (2009)
[20] Chen, C., Ma, S., Yang, J.: A general inertial proximal point method for mixed variational inequality problem. SIAM J. Optim. 25(4), 2120-2142 (2015)
[21] Danaher, P., Wang, P., Witten, D.: The joint graphical lasso for inverse covariance estimation across multiple classes. J. R. Stat. Soc. Ser. B 76(2), 373-397 (2014)
[22] Ma, S., Xue, L., Zou, H.: Alternating direction methods for latent variable Gaussian graphical model selection. Neural Comput. 25(8), 2172-2198 (2013)
[23] Scheinberg, K., Ma, S., Goldfarb, D.: Sparse inverse covariance selection via alternating linearization methods. In: International Conference on Neural Information Processing Systems (NIPS), 2101-2109 (2010)
[24] Yang, J., Sun, D.F., Toh, K.-C.: A proximal point algorithm for log-determinant optimization with group lasso regularization. SIAM J. Optim. 23(2), 857-893 (2013)
[25] Li, X.D., Sun, D.F., Toh, K.-C.: A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications. Math. Progr. 175(1), 395-418 (2018)
[26] Liu, H., Wang, L., Zhao, T.: Sparse covariance matrix estimation with eigenvalue constraints. J. Comput. Gr. Stat. 23(2), 439-459 (2014)
[27] Li, P., Xiao, Y.: An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation. Comput. Stat. Data Anal. 128, 292-307 (2018)
[28] Li, X.D., Sun, D.F., Toh, K.-C.: A Schur complement based semi-proximal ADMM for convex quadratic conic programming and extensions. Math. Progr. 155(1-2), 333-373 (2016)
[29] Chen, C., He, B., Ye, Y., Yuan, X.: The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent. Math. Progr. 155(1-2), 57-79 (2016)
[30] Deng, W., Yin, W.: On the global and linear convergence of the generalized alternating direction method of multipliers. J. Sci. Comput. 66(3), 889-916 (2016)
[31] Han, D., Sun, D., Zhang, L.: Linear rate convergence of the alternating direction method of multipliers for convex composite quadratic and semi-definite programming. Math. Oper. Res. 43(2), 622-637 (2018)
[32] Yuan, X., Zhang, J.: Discerning the linear convergence of ADMM for structured convex optimization through the lens of variational analysis. J. Mach. Learn. Res. 21, 1-74 (2020)
[33] Fazel, M., Pong, T.K., Sun, D.F., Tseng, P.: Hankel matrix rank minimization with applications in system identification and realization. SIAM J. Matrix Anal. Appl. 34(3), 946-977 (2013)
[34] Li, L., Toh, K.-C.: An inexact interior point method for ℓ1-regularized sparse covariance selection. Math. Progr. Comput. 2(3-4), 291-315 (2010)
[35] Matthews, B.W.: Comparison of the predicted and observed secondary structure of T4 phage lysozyme. Biochimica et Biophysica Acta (BBA) Protein Struct. 405(2), 442-451 (1975)
[36] Fan, J., Feng, Y., Wu, Y.: Network exploration via the adaptive lasso and SCAD penalties. Ann. Appl. Stat. 3(2), 521-541 (2009)
[37] Yuan, M., Lin, Y.: Model selection and estimation in the Gaussian graphical model. Biometrika 94(1), 19-35 (2007)
[38] Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Progr. 91, 201-213 (2002)
[39] Li, X.D., Sun, D.F., Toh, K.-C.: QSDPNAL: a two-phase Newton-CG proximal augmented Lagrangian method for convex quadratic semidefinite programming. Math. Progr. Comput. 10(4), 703-743 (2018)