Journal of the Operations Research Society of China, 2021, Vol. 9, Issue (3): 543-568. DOI: 10.1007/s40305-021-00351-y


Sparse Estimation of High-Dimensional Inverse Covariance Matrices with Explicit Eigenvalue Constraints

Yun-Hai Xiao1, Pei-Li Li2, Sha Lu3   

  1. Institute of Applied Mathematics, College of Mathematics and Statistics, Henan University, Kaifeng 475000, Henan, China;
  2. School of Statistics, Key Laboratory of Advanced Theory and Application in Statistics and Data Science-MOE, East China Normal University, Shanghai 200062, China;
  3. Department of Mathematics, Nanning Normal University, Nanning 530299, Guangxi, China
  • Received:2020-02-18 Revised:2020-10-27 Online:2021-09-30 Published:2021-09-26
  • Contact: Yun-Hai Xiao, yhxiao@henu.edu.cn; Pei-Li Li, plli@sfs.ecnu.edu.cn; Sha Lu, lusha_nn@126.com
  • Supported by:
    The work of Yun-Hai Xiao is supported by the National Natural Science Foundation of China (No. 11971149).

Abstract: This paper first proposes a generalized log-determinant optimization model for estimating high-dimensional sparse inverse covariance matrices. Under the normality assumption, the zero components of the inverse covariance matrix encode conditional independence between pairs of variables given all the other variables. Owing to its eigenvalue-bound constraints, the generalized model covers a large number of existing estimators as special cases. Second, rather than tackling the challenging primal problem directly, this paper applies a pair of alternating direction methods of multipliers (ADMM) to the dual model, which contains five separable blocks. The first algorithm is based on a single Gauss-Seidel sweep and does not necessarily converge in theory. The second employs the symmetric Gauss-Seidel (sGS) based ADMM, which, by the recent sGS decomposition theorem, is equivalent to a two-block iterative scheme with guaranteed convergence. Finally, numerical experiments on synthetic data and a real data set show that both algorithms are highly effective in estimating high-dimensional sparse inverse covariance matrices.
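To make the setting concrete, the following is a minimal NumPy sketch of an ADMM iteration for the classical graphical-lasso special case, min over positive definite X of tr(SX) - log det X + rho*||X||_1, which the abstract's generalized model contains when the eigenvalue bounds are inactive. The function name, penalty value, and stopping rule are illustrative assumptions, not the paper's actual five-block dual algorithm.

```python
import numpy as np

def glasso_admm(S, rho=0.1, t=1.0, iters=200):
    """Two-block ADMM sketch for the graphical lasso:
        min_X  tr(S X) - log det X + rho * ||X||_1,  X symmetric PD.
    Illustrative only: the paper works on a five-block dual model with
    explicit eigenvalue bounds, which this toy version omits.
    """
    p = S.shape[0]
    X, Z, U = np.eye(p), np.eye(p), np.zeros((p, p))
    for _ in range(iters):
        # X-update: closed form via eigendecomposition of t(Z - U) - S;
        # the positive root keeps every eigenvalue of X strictly positive.
        lam, Q = np.linalg.eigh(t * (Z - U) - S)
        d = (lam + np.sqrt(lam ** 2 + 4.0 * t)) / (2.0 * t)
        X = (Q * d) @ Q.T
        # Z-update: elementwise soft-thresholding, the prox of (rho/t)*||.||_1
        A = X + U
        Z = np.sign(A) * np.maximum(np.abs(A) - rho / t, 0.0)
        # dual update
        U += X - Z
    return Z  # sparse estimate of the inverse covariance matrix
```

The zero pattern of the returned matrix is the estimated conditional-independence graph: under normality, a zero (i, j) entry means variables i and j are conditionally independent given the rest.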

Key words: Non-smooth convex minimization, Inverse covariance matrix, Maximum likelihood estimation, Augmented Lagrangian function, Symmetric Gauss-Seidel iteration
