Journal of the Operations Research Society of China, 2023, Vol. 11, Issue (2): 347-369. doi: 10.1007/s40305-019-00276-7


A Framework of Convergence Analysis of Mini-batch Stochastic Projected Gradient Methods

Jian Gu1, Xian-Tao Xiao2   

  1. School of Sciences, Dalian Ocean University, Dalian 116023, China;
  2. School of Mathematical Sciences, Dalian University of Technology, Dalian 116023, China
  • Received: 2018-10-30  Revised: 2019-10-11  Online: 2023-06-30  Published: 2023-05-24
  • Contact: Xian-Tao Xiao, Jian Gu  E-mail: xtxiao@dlut.edu.cn; gujian@dlou.edu.cn
  • Supported by:
    the National Natural Science Foundation of China (Nos. 11871135 and 11801054) and the Fundamental Research Funds for the Central Universities (No. DUT19K46)

Abstract: In this paper, we establish a unified framework to study the almost sure global convergence and the expected convergence rates of a class of mini-batch stochastic (projected) gradient (SG) methods, including two popular types of SG: SG with diminishing stepsize and SG with increasing batch size. We also show that the standard assumption of uniformly bounded variance, which is frequently used in the literature to investigate the convergence of SG, is actually not required when the gradient of the objective function is Lipschitz continuous. Finally, we show that our framework can also be used to analyze the convergence of a mini-batch stochastic extragradient method for stochastic variational inequalities.
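For reference, a minimal sketch of the generic mini-batch stochastic projected gradient iteration covered by this class of methods (notation assumed here, not taken from the paper) is

\[
x_{k+1} = \Pi_{X}\!\left( x_k - \alpha_k \, \frac{1}{m_k} \sum_{i=1}^{m_k} G(x_k, \xi_k^i) \right),
\]

where \(\Pi_{X}\) denotes the Euclidean projection onto the feasible set \(X\), \(\alpha_k\) the stepsize, \(m_k\) the mini-batch size, and \(G(x_k, \xi_k^i)\) a stochastic gradient sample. In this notation, the stepsize-diminished variant corresponds to letting \(\alpha_k\) decrease with a fixed \(m_k\), while the batch-size-increased variant keeps \(\alpha_k\) constant and lets \(m_k\) grow.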

Key words: Stochastic projected gradient method, Variance uniformly bounded, Convergence analysis
