Newton-Type Optimal Thresholding Algorithms for Sparse Optimization Problems

  1. School of Mathematics, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK
  2. Shenzhen Research Institute of Big Data, Chinese University of Hong Kong, Shenzhen 518172, Guangdong, China

Received date: 2021-04-06

Revised date: 2021-09-08

Online published: 2022-09-06

Supported by

This work was funded by the National Natural Science Foundation of China (No. 12071307).

Abstract

Sparse signals can be reconstructed by an algorithm that merges a traditional nonlinear optimization method with a certain thresholding technique. Different from existing thresholding methods, a novel thresholding technique referred to as optimal k-thresholding was recently proposed by Zhao (SIAM J Optim 30(1):31-55, 2020). This technique simultaneously performs the minimization of an error metric for the problem and the thresholding of the iterates generated by the classic gradient method. In this paper, we propose the so-called Newton-type optimal k-thresholding (NTOT) algorithm, which is motivated by the appreciable performance of both Newton-type methods and the optimal k-thresholding technique for signal recovery. The guaranteed performance (including convergence) of the proposed algorithms is established in terms of suitable choices of the algorithmic parameters and the restricted isometry property (RIP) of the sensing matrix, which has been widely used in the analysis of compressive sensing algorithms. Simulation results based on synthetic signals indicate that the proposed algorithms are stable and efficient for signal recovery.
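
As a concrete illustration of the idea summarized above, the sketch below implements one possible iteration that combines a regularized Newton-type step with a relaxed optimal k-thresholding subproblem and a least-squares refinement on the selected support. This is only a minimal sketch under stated assumptions, not the authors' NTOT algorithm: the function names (ntot_like_iteration, relaxed_optimal_k_thresholding), the regularization parameter lam, the use of the SLSQP solver for the relaxed subproblem, and the refinement step are all illustrative choices made here.

```python
# Illustrative sketch only: a "Newton-type step + relaxed optimal k-thresholding"
# iteration in the spirit of the framework described in the abstract and in
# Zhao (SIAM J. Optim. 30(1):31-55, 2020) [28]. It is NOT the authors' NTOT
# algorithm; step rules, parameters and refinements below are assumptions.
import numpy as np
from scipy.optimize import minimize


def relaxed_optimal_k_thresholding(A, y, u, k):
    """Relaxation of the optimal k-thresholding subproblem: choose weights
    w in [0,1]^n with sum(w) = k minimizing ||y - A(u*w)||_2, then return the
    support of the k largest entries of u*w."""
    n = u.size
    obj = lambda w: np.sum((y - A @ (u * w)) ** 2)
    cons = [{"type": "eq", "fun": lambda w: np.sum(w) - k}]
    res = minimize(obj, np.full(n, k / n), method="SLSQP",
                   bounds=[(0.0, 1.0)] * n, constraints=cons)
    x = u * res.x
    return np.argsort(np.abs(x))[-k:]


def ntot_like_iteration(A, y, x, k, lam=0.1):
    """One illustrative iteration: regularized Newton-type step, relaxed
    optimal k-thresholding to pick the support, least-squares refinement."""
    n = x.size
    # Regularized Newton-type search direction for (1/2)||y - Ax||_2^2.
    d = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ (y - A @ x))
    u = x + d
    S = relaxed_optimal_k_thresholding(A, y, u, k)
    x_new = np.zeros(n)
    # Debias on the selected support (an optional refinement, assumed here).
    x_new[S], *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    return x_new


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, k = 40, 100, 5
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    y = A @ x_true
    x = np.zeros(n)
    for _ in range(10):
        x = ntot_like_iteration(A, y, x, k)
    print("relative recovery error:",
          np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The relaxed subproblem replaces the binary selection variables of optimal k-thresholding with weights in [0,1] summing to k, which is the convex relaxation used in [28] to make the thresholding step computationally tractable.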

Cite this article

Nan Meng, Yun-Bin Zhao. Newton-Type Optimal Thresholding Algorithms for Sparse Optimization Problems[J]. Journal of the Operations Research Society of China, 2022, 10(3): 447-469. DOI: 10.1007/s40305-021-00370-9

References

[1] Candès, E., Romberg, J., Tao, T.: Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inform. Theory 52(2), 489-509 (2006)
[2] Eldar, Y., Kutyniok, G.: Compressed Sensing: Theory and Applications. Cambridge University Press, Cambridge (2012)
[3] Foucart, S., Rauhut, H.: A Mathematical Introduction to Compressive Sensing. Birkhäuser, Basel (2013). https://doi.org/10.1007/978-0-8176-4948-7
[4] Zhao, Y.-B.: Sparse Optimization Theory and Methods. CRC Press, Boca Raton, FL (2018)
[5] Boche, H., Calderbank, R., Kutyniok, G., Vybiral, J.: Compressed Sensing and Its Applications. Springer, New York (2019)
[6] De Maio, A., Eldar, Y.C., Haimovich, A.M. (eds.): Compressed Sensing in Radar Signal Processing. Cambridge University Press, Cambridge (2019)
[7] Elad, M.: Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing. Springer, New York (2010)
[8] Patel, V., Chellappa, R.: Sparse representations, compressive sensing and dictionaries for pattern recognition. In: The First Asian Conference on Pattern Recognition, pp. 325-329. IEEE (2011)
[9] Choi, J., Shim, B., Ding, Y., Rao, B., Kim, D.: Compressed sensing for wireless communications: useful tips and tricks. IEEE Commun. Surveys & Tutorials 19(3), 1527-1549 (2017)
[10] Chen, S., Donoho, D., Saunders, M.: Atomic decomposition by basis pursuit. SIAM Rev. 43(1), 129-159 (2001)
[11] Candès, E., Tao, T.: Decoding by linear programming. IEEE Trans. Inform. Theory 51(12), 4203-4215 (2005)
[12] Candès, E., Wakin, M., Boyd, S.: Enhancing sparsity by reweighted ℓ1-minimization. J. Fourier Anal. Appl. 14, 877-905 (2008)
[13] Zhao, Y.-B., Li, D.: Reweighted ℓ1-minimization for sparse solutions to underdetermined linear systems. SIAM J. Optim. 22, 893-912 (2012)
[14] Zhao, Y.-B., Kočvara, M.: A new computational method for the sparsest solutions to systems of linear equations. SIAM J. Optim. 25(2), 1110-1134 (2015)
[15] Zhao, Y.-B., Luo, Z.-Q.: Constructing new reweighted ℓ1-algorithms for sparsest points of polyhedral sets. Math. Oper. Res. 42, 57-76 (2017)
[16] Tropp, J., Gilbert, A.: Signal recovery from random measurements via orthogonal matching pursuit. IEEE Trans. Inform. Theory 53, 4655-4666 (2007)
[17] Needell, D., Vershynin, R.: Signal recovery from incomplete and inaccurate measurements via regularized orthogonal matching pursuit. IEEE J. Sel. Top. Signal Process. 4(2), 310-316 (2010)
[18] Dai, W., Milenkovic, O.: Subspace pursuit for compressive sensing signal reconstruction. IEEE Trans. Inform. Theory 55, 2230-2249 (2009)
[19] Needell, D., Tropp, J.: CoSaMP: iterative signal recovery from incomplete and inaccurate samples. Appl. Comput. Harmon. Anal. 26, 301-321 (2009)
[20] Satpathi, S., Chakraborty, M.: On the number of iterations for convergence of CoSaMP and Subspace Pursuit algorithms. Appl. Comput. Harmon. Anal. 43(3), 568-576 (2017)
[21] Donoho, D.: De-noising by soft-thresholding. IEEE Trans. Inform. Theory 41, 613-627 (1995)
[22] Fornasier, M., Rauhut, H.: Iterative thresholding algorithms. Appl. Comput. Harmon. Anal. 25(2), 187-208 (2008)
[23] Blumensath, T., Davies, M.: Iterative hard thresholding for compressed sensing. Appl. Comput. Harmon. Anal. 27, 265-274 (2009)
[24] Blumensath, T., Davies, M.: Normalized iterative hard thresholding: guaranteed stability and performance. IEEE J. Sel. Top. Signal Process. 4, 298-309 (2010)
[25] Blumensath, T.: Accelerated iterative hard thresholding. Signal Process. 92, 752-756 (2012)
[26] Bouchot, J., Foucart, S., Hitczenko, P.: Hard thresholding pursuit algorithms: number of iterations. Appl. Comput. Harmon. Anal. 41, 412-435 (2016)
[27] Meng, N., Zhao, Y.-B.: Newton-step-based hard thresholding algorithms for sparse signal recovery. IEEE Trans. Signal Process. 68, 6594-6606 (2020)
[28] Zhao, Y.-B.: Optimal k-thresholding algorithms for sparse optimization problems. SIAM J. Optim. 30(1), 31-55 (2020)
[29] Zhao, Y.-B., Luo, Z.-Q.: Analysis of optimal thresholding algorithms for compressed sensing. Signal Process. 187, 108148 (2021)
[30] Blumensath, T., Davies, M.: Iterative thresholding for sparse approximations. J. Fourier Anal. Appl. 14, 629-654 (2008)
[31] Foucart, S.: Hard thresholding pursuit: an algorithm for compressive sensing. SIAM J. Numer. Anal. 49(6), 2543-2563 (2011)
[32] Tanner, J., Wei, K.: Normalized iterative hard thresholding for matrix completion. SIAM J. Sci. Comput. 35(5), 104-125 (2013)
[33] Zhou, S., Xiu, N., Qi, H.: Global and quadratic convergence of Newton hard-thresholding pursuit. J. Mach. Learn. Res. 22(12), 1-45 (2021)
[34] Zhou, S., Pan, L., Xiu, N.: Subspace Newton method for the ℓ0-regularized optimization. arXiv:2004.05132 (2020)
[35] Jing, M., Zhou, X., Qi, C.: Quasi-Newton iterative projection algorithm for sparse recovery. Neurocomputing 144, 169-173 (2014)
[36] Wang, Q., Qu, G.: A new greedy algorithm for sparse recovery. Neurocomputing 275, 137-143 (2018)
[37] Grant, M., Boyd, S.: CVX: Matlab software for disciplined convex programming, version 1.21 (2017)