Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum

Mengqi Hu, Yifei Lou, Bao Wang, Ming Yan, Xiu Yang, Qiang Ye

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

This paper applies an adaptive-momentum idea from the nonlinear conjugate gradient method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: a (single) differentiable function, and the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is combined with an operator splitting technique to handle the non-smooth function. We use the convex ℓ1 and the nonconvex ℓ1-ℓ2 functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
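
Since this record does not include the paper's algorithmic details, the following is only a minimal sketch of the two ingredients named in the abstract: a gradient step with a fixed step size whose search direction carries nonlinear-conjugate-gradient-style momentum (a Fletcher-Reeves coefficient is assumed here for illustration), combined with operator splitting via the ℓ1 proximal map (soft-thresholding). All function names and parameter choices below are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def soft_threshold(v, lam):
        # Proximal operator of lam * ||.||_1 (soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    def l1_recovery_ncg_momentum(A, b, lam, step, max_iter=500, tol=1e-8):
        # Illustrative forward-backward (operator) splitting for
        #     min_x  0.5 * ||A x - b||^2 + lam * ||x||_1,
        # where the forward step uses a fixed step size and a search
        # direction with nonlinear-CG-style momentum (Fletcher-Reeves
        # coefficient assumed here), rather than the plain negative gradient.
        x = np.zeros(A.shape[1])
        g = A.T @ (A @ x - b)            # gradient of the smooth part
        d = -g                           # initial search direction
        for _ in range(max_iter):
            # Forward step along the momentum direction, then backward (prox) step.
            x_new = soft_threshold(x + step * d, step * lam)
            g_new = A.T @ (A @ x_new - b)
            if np.linalg.norm(x_new - x) < tol:
                return x_new
            beta = (g_new @ g_new) / max(g @ g, 1e-30)  # Fletcher-Reeves-type coefficient
            d = -g_new + beta * d        # conjugate-gradient-style momentum update
            x, g = x_new, g_new
        return x

As a usage sketch, one might test this on a small synthetic problem with a random Gaussian A, a sparse ground-truth signal, and a fixed step size on the order of 1 / ||A||_2^2; the fixed step replaces the line search mentioned in the abstract.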

Original language: English
Article number: 33
Journal: Journal of Scientific Computing
Volume: 95
Issue number: 1
DOIs
State: Published - Apr 2023

Bibliographical note

Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.

Keywords

  • Accelerated gradient momentum
  • Convergence rate
  • Fixed step size
  • Operator splitting

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Numerical Analysis
  • General Engineering
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics
