Abstract
This paper applies an adaptive-momentum idea from the nonlinear conjugate gradient method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: minimizing a (single) differentiable function and minimizing the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is further combined with an operator-splitting technique to handle the non-smooth function. We use the convex ℓ1 and the nonconvex ℓ1−ℓ2 functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
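To make the two settings in the abstract concrete, below is a minimal NumPy sketch, not the authors' algorithm: a fixed-step conjugate-gradient-style iteration with a momentum-corrected search direction for a quadratic (case one), and the same momentum direction combined with a soft-thresholding (operator-splitting) step for ℓ1-regularized least squares (case two). The Fletcher-Reeves momentum coefficient, the restart safeguard, the step-size rule, and all function names here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def cg_momentum_quadratic(A, b, alpha, n_iter=500):
    """Case 1 sketch: fixed-step CG-style iteration with Fletcher-Reeves
    momentum for the quadratic f(x) = 0.5 x^T A x - b^T x.
    alpha is a fixed step size (assumed small enough to converge,
    e.g. below 2 / lambda_max(A)); no line search is performed."""
    x = np.zeros_like(b)
    g = A @ x - b                 # gradient of the quadratic
    d = -g                        # initial search direction
    for _ in range(n_iter):
        x = x + alpha * d         # fixed step, no line search
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g + 1e-30)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d     # momentum-corrected direction
        if g_new @ d >= 0:        # restart safeguard: keep a descent direction
            d = -g_new
        g = g_new
    return x

def prox_cg_l1(A, b, lam, alpha, n_iter=500):
    """Case 2 sketch: the same momentum direction combined with an
    operator-splitting (soft-thresholding) step for
    min_x 0.5 ||Ax - b||^2 + lam ||x||_1. Illustrative only; no
    convergence claim is made for this particular combination."""
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)         # gradient of the smooth part
    d = -g
    for _ in range(n_iter):
        # gradient-type step along the momentum direction,
        # then the prox of the non-smooth l1 term
        x = soft_threshold(x + alpha * d, alpha * lam)
        g_new = A.T @ (A @ x - b)
        beta = (g_new @ g_new) / (g @ g + 1e-30)
        d = -g_new + beta * d
        if g_new @ d >= 0:        # restart safeguard
            d = -g_new
        g = g_new
    return x

# Toy sparse-recovery run: recover a 5-sparse signal from 50 measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
b = A @ x_true
x_hat = prox_cg_l1(A, b, lam=0.1, alpha=1.0 / np.linalg.norm(A, 2) ** 2)
```

The step size 1 / ||A||₂² is the standard reciprocal-Lipschitz choice for the smooth part of the ℓ1 problem; a fixed step in this range is what lets the method skip the line search the abstract refers to.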
| Original language | English |
| --- | --- |
| Article number | 33 |
| Journal | Journal of Scientific Computing |
| Volume | 95 |
| Issue number | 1 |
| DOIs | |
| State | Published - Apr 2023 |
Bibliographical note
Publisher Copyright: © 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
Keywords
- Accelerated gradient momentum
- Convergence rate
- Fixed step size
- Operator splitting
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Numerical Analysis
- General Engineering
- Computational Mathematics
- Computational Theory and Mathematics
- Applied Mathematics