Journal of the Operations Research Society of China ›› 2022, Vol. 10 ›› Issue (2): 305-342. doi: 10.1007/s40305-021-00346-9


Augmented Lagrangian Methods for Convex Matrix Optimization Problems

Ying Cui1, Chao Ding2, Xu-Dong Li3,4, Xin-Yuan Zhao5   

  1. Department of Industrial and Systems Engineering, University of Minnesota, Minneapolis 55411, USA;
    2. Institute of Applied Mathematics, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, China;
    3. School of Data Science, Fudan University, Shanghai 200433, China;
    4. Shanghai Center for Mathematical Sciences, Fudan University, Shanghai 200433, China;
    5. College of Applied Sciences, Beijing University of Technology, Beijing 100124, China
  • Received: 2020-08-29 Revised: 2021-01-22 Online: 2022-06-30 Published: 2022-06-13
  • Contact: Xu-Dong Li, Ying Cui, Chao Ding, Xin-Yuan Zhao E-mail: lixudong@fudan.edu.cn; yingcui@umn.edu; dingchao@amss.ac.cn; xyzhao@bjut.edu.cn
  • Supported by:
    Chao Ding’s research was supported by the National Natural Science Foundation of China (Nos. 11671387, 11531014, and 11688101) and Beijing Natural Science Foundation (No. Z190002). Xu-Dong Li’s research was supported by the National Key R&D Program of China (No. 2020YFA0711900), the National Natural Science Foundation of China (No. 11901107), the Young Elite Scientists Sponsorship Program by CAST (No. 2019QNRC001), the Shanghai Sailing Program (No. 19YF1402600), and the Science and Technology Commission of Shanghai Municipality Project (No. 19511120700). Xin-Yuan Zhao’s research was supported by the National Natural Science Foundation of China (No. 11871002) and the General Program of Science and Technology of Beijing Municipal Education Commission (No. KM201810005004).

Abstract: In this paper, we provide a gentle introduction to recent advances in augmented Lagrangian methods for solving large-scale convex matrix optimization problems (cMOP). Specifically, we review two types of sufficient conditions for ensuring the quadratic growth conditions of a class of constrained convex matrix optimization problems regularized by nonsmooth spectral functions. Under a mild quadratic growth condition on the dual of cMOP, we further discuss the R-superlinear convergence of the Karush-Kuhn-Tucker (KKT) residuals of the sequence generated by the augmented Lagrangian method (ALM) for solving convex matrix optimization problems. Implementation details of the ALM for solving core convex matrix optimization problems are also provided.
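
For readers unfamiliar with the method, the display below sketches the classical ALM iteration for a generic equality-constrained convex problem min_x { f(x) : A(x) = b }; the notation (f, A, b, σ_k) is assumed here purely for illustration and is not taken from the paper, which treats the more structured cMOP setting.

\begin{align*}
  \mathcal{L}_{\sigma}(x; y) &:= f(x) + \langle y,\, \mathcal{A}(x) - b \rangle + \frac{\sigma}{2}\,\lVert \mathcal{A}(x) - b \rVert^{2}, \\
  x^{k+1} &\approx \operatorname*{arg\,min}_{x}\; \mathcal{L}_{\sigma_{k}}(x; y^{k}), \\
  y^{k+1} &= y^{k} + \sigma_{k}\,\bigl(\mathcal{A}(x^{k+1}) - b\bigr),
\end{align*}

where each inner subproblem is solved inexactly; in the matrix setting surveyed here, this is typically done with a semismooth Newton method, as the keywords below indicate.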

Key words: Matrix optimization, Spectral functions, Quadratic growth conditions, Metric subregularity, Augmented Lagrangian methods, Fast convergence rates, Semismooth Newton methods

CLC Number: