We consider the optimization problem in which a continuous convex function is to be minimized over the joint numerical range of two Hermitian matrices. When those matrices are of large size, solving such problems by convex optimization can be computationally expensive. The goal of this paper is to present a novel nonlinear eigenvector method to accelerate the computation. We will show that the global minimizer of the optimization problem corresponds to a solution of a nonlinear eigenvalue problem with eigenvector nonlinearity (NEPv). The special structure of this NEPv allows for an efficient sequential subspace search algorithm, which is, for the NEPv, a nonlinear analogue of the locally optimal conjugate gradient methods commonly applied to Hermitian linear eigenvalue problems. Our new algorithm is provably globally convergent to an eigenvector of the NEPv. Implementation details such as block iteration and preconditioning will be discussed. Numerical examples, with applications in computing the coercivity constant of boundary integral operators and solving multicast beamforming problems, show the effectiveness of our approach.
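To illustrate the setting, here is a minimal sketch (not the paper's sequential subspace algorithm) of a plain self-consistent-field (SCF) iteration for this class of NEPv. For Hermitian A, B and a smooth convex f, stationarity of x ↦ f(x*Ax, x*Bx) on the unit sphere yields an eigenvector equation of the form (f_a·A + f_b·B)x = λx, where (f_a, f_b) is the gradient of f evaluated at (x*Ax, x*Bx). The function name `scf_nepv`, the choice of the smallest eigenvalue, and the example objective f(a,b) = a² + b² are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def scf_nepv(A, B, grad_f, x0, maxit=200, tol=1e-10):
    """Hedged SCF sketch for the NEPv  H(x) x = lam x  with
    H(x) = f_a * A + f_b * B, (f_a, f_b) = grad_f(x*Ax, x*Bx).
    Real symmetric A, B assumed for simplicity; picking the
    eigenvector of the smallest eigenvalue is a common heuristic
    for minimization, with no convergence guarantee here."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(maxit):
        a = float(x @ A @ x)
        b = float(x @ B @ x)
        fa, fb = grad_f(a, b)
        H = fa * A + fb * B                  # coefficient matrix at current x
        _, V = np.linalg.eigh(H)             # eigenvalues in ascending order
        x_new = V[:, 0]                      # eigenvector of smallest eigenvalue
        # Fix the sign ambiguity before testing for convergence.
        if x_new @ x < 0:
            x_new = -x_new
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: f(a, b) = a^2 + b^2, i.e. the squared distance of a point of
# the joint numerical range W(A, B) = {(x*Ax, x*Bx) : ||x|| = 1} to the origin.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)); A = (A + A.T) / 2
B = rng.standard_normal((6, 6)); B = (B + B.T) / 2
x = scf_nepv(A, B, lambda a, b: (2 * a, 2 * b), rng.standard_normal(6))
```

The abstract's point is that such fixed-point views of the NEPv can be replaced by a sequential subspace search that, like LOBPCG-type methods for linear problems, enriches a small search space at each step instead of solving a full eigenproblem.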
Number of pages: 26
Journal: SIAM Journal on Matrix Analysis and Applications
State: Published - November 17, 2020
Bibliographical note:
Received by the editors December 20, 2018; accepted for publication (in revised form) by K. Meerbergen August 28, 2020; published electronically November 17, 2020. An earlier version of this paper was prepared while the author was affiliated with the Section of Mathematics, University of Geneva. https://doi.org/10.1137/18M1234473
Funding: This work was supported by the SNSF under research project 169115.
Department of Mathematics, University of Kentucky, Lexington, KY 40506 USA (Ding.Lu@uky.edu).
© 2020 Society for Industrial and Applied Mathematics.
Keywords:
- Eigenvector nonlinearity
- Nonlinear eigenvalue problem
- Numerical range
- Rayleigh quotient
- Sequential subspace method