Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum

Mengqi Hu, Yifei Lou, Bao Wang, Ming Yan, Xiu Yang, Qiang Ye

Research output: Contribution to journal › Article › peer-review

Abstract

This paper applies the idea of adaptive momentum from the nonlinear conjugate gradient method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: a (single) differentiable function and the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is further combined with an operator splitting technique to handle the non-smooth function. We use the convex ℓ1 and the nonconvex ℓ1−ℓ2 functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
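For illustration only, the sketch below shows the general flavor of the approach described in the abstract: gradient descent whose search direction carries a nonlinear conjugate gradient style momentum term, with a fixed step size and no line search, applied to a quadratic objective; a soft-thresholding proximal operator is included to indicate how the non-smooth ℓ1 term would enter via operator splitting. The Fletcher–Reeves momentum coefficient, the step-size choice, and all parameter values are assumptions for this sketch and are not taken from the paper.

```python
import numpy as np

def cg_momentum_descent(A, b, x0, step=None, n_iter=200):
    """Gradient descent with nonlinear-CG-style (Fletcher-Reeves) momentum
    on the quadratic f(x) = 0.5 x^T A x - b^T x.  Illustrative sketch only;
    the momentum rule and step size are assumptions, not the paper's method."""
    if step is None:
        # Fixed step size 1/L, with L the largest eigenvalue of A (assumption).
        step = 1.0 / np.linalg.eigvalsh(A)[-1]
    x = x0.copy()
    g = A @ x - b                 # gradient of the quadratic
    d = -g                        # initial search direction
    for _ in range(n_iter):
        x = x + step * d          # fixed step, no line search
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves momentum weight
        d = -g_new + beta * d     # conjugate-gradient-style direction update
        g = g_new
    return x

def soft_threshold(v, lam):
    """Proximal operator of lam*||.||_1, the splitting step for the ℓ1 case."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Toy least-squares example (hypothetical data, for demonstration only)
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 20))
A, b = M.T @ M, M.T @ rng.standard_normal(50)
x_hat = cg_momentum_descent(A, b, np.zeros(20))
```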

Original language: English
Article number: 33
Journal: Journal of Scientific Computing
Volume: 95
Issue number: 1
DOIs
State: Published - Apr 2023

Bibliographical note

Funding Information:
YL acknowledges the support from NSF CAREER DMS-1846690. BW is supported by NSF DMS-1924935, DMS-1952339, DMS-2152762, and DMS-2208361. BW also acknowledges support from the Office of Science of the Department of Energy under grant numbers DE-SC0021142 and DE-SC0002722. MY was supported by NSF DMS-2012439 and the Shenzhen Science and Technology Program ZDSYS20211021111415025. XY acknowledges the support from NSF CAREER DMS-2143915. XY was also partly supported by DOE, Office of Science, Office of Advanced Scientific Computing Research (ASCR) as part of Multifaceted Mathematics for Rare, Extreme Events in Complex Energy and Environment Systems (MACSER). QY acknowledges the support from NSF grant DMS-1821144.

Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.

Keywords

  • Accelerated gradient momentum
  • Convergence rate
  • Fixed step size
  • Operator splitting

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Numerical Analysis
  • Engineering (all)
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics
