A Variable Metric Extrapolation Proximal Iterative Hard Thresholding Method

  • 1 School of Mathematics and Computer Science, Shanxi Normal University, Taiyuan 030002, Shanxi, China
  • 2 Department of Mathematics, Shanghai Jiao Tong University, Shanghai 200240, China
  • 3 Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China

Received date: 2022-04-07

Revised date: 2023-01-14

Online published: 2025-03-20

Abstract

In this paper, we propose a variable metric extrapolation proximal iterative hard thresholding (VMEPIHT) method for the nonconvex $\ell_0$-norm sparsity regularization problem, which has wide applications in signal and image processing, machine learning, and beyond. The VMEPIHT method is based on the forward-backward splitting (FBS) method, and a variable metric strategy is employed in the extrapolation step to speed up the algorithm. We establish the convergence of the proposed method, as well as its linear and superlinear convergence rates, under appropriate assumptions. Finally, we conduct numerical experiments on compressed sensing and CT image reconstruction problems to confirm the efficiency of the proposed method compared with other state-of-the-art methods.
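To make the general idea concrete, the following is a minimal Python sketch of an extrapolated proximal iterative hard thresholding iteration for $\min_x \frac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_0$: a momentum-style extrapolation step, a forward (gradient) step on the smooth data term, and a backward step given by the $\ell_0$ proximal map (hard thresholding). The function name `vmepiht_sketch`, the fixed extrapolation weight `beta`, and the identity placeholder standing in for the variable metric are illustrative assumptions; the paper's actual metric construction and parameter rules are not reproduced here.

```python
import numpy as np


def hard_threshold(z, thresh):
    """l0 proximal map: zero out entries with |z_i| <= thresh, keep the rest."""
    out = z.copy()
    out[np.abs(z) <= thresh] = 0.0
    return out


def vmepiht_sketch(A, b, lam, n_iter=500, beta=0.5):
    """Extrapolated proximal iterative hard thresholding for
    min_x 0.5*||Ax - b||^2 + lam*||x||_0 (illustrative sketch only).

    The variable metric of the paper is replaced by the identity here,
    and beta is a fixed extrapolation weight chosen for illustration.
    """
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the data-term gradient
    t = 1.0 / L                          # forward (gradient) step size
    thresh = np.sqrt(2.0 * lam * t)      # hard-thresholding level of the l0 prox

    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(n_iter):
        # Extrapolation step; a variable metric method would scale the
        # momentum term by an iteration-dependent matrix D_k (here D_k = I).
        y = x + beta * (x - x_prev)
        # Forward step: gradient of the smooth data-fidelity term at y.
        z = y - t * (A.T @ (A @ y - b))
        # Backward step: proximal map of lam*||.||_0, i.e., hard thresholding.
        x_prev, x = x, hard_threshold(z, thresh)
    return x


if __name__ == "__main__":
    # Small synthetic compressed sensing example (hypothetical sizes).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((64, 256))
    x_true = np.zeros(256)
    support = rng.choice(256, size=8, replace=False)
    x_true[support] = rng.standard_normal(8)
    b = A @ x_true
    x_hat = vmepiht_sketch(A, b, lam=0.05)
    print("nonzeros:", np.count_nonzero(x_hat),
          "residual:", np.linalg.norm(A @ x_hat - b))
```

Note that the paper's method adapts both the metric and the extrapolation parameters along the iterations; the constant `beta` and identity metric above are only placeholders for that mechanism.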

Cite this article

Xue Zhang, Xiao-Qun Zhang. A Variable Metric Extrapolation Proximal Iterative Hard Thresholding Method[J]. Journal of the Operations Research Society of China, 2025, 13(1): 161-183. DOI: 10.1007/s40305-023-00459-3
