Nonlinear eigenvector methods for convex minimization over the numerical range

Research output: Contribution to journal › Article › peer-review

1 Scopus citation


We consider the optimization problem in which a continuous convex function is to be minimized over the joint numerical range of two Hermitian matrices. When those matrices are large, solving such problems by convex optimization can be computationally expensive. The goal of this paper is to present a novel nonlinear eigenvector method to accelerate the computation. We show that the global minimizer of the optimization problem corresponds to a solution of a nonlinear eigenvalue problem with eigenvector nonlinearity (NEPv). The special structure of this NEPv allows for an efficient sequential subspace search algorithm, a nonlinear analogue of the locally optimal conjugate gradient methods commonly applied to Hermitian linear eigenvalue problems. Our new algorithm is provably globally convergent to an eigenvector of the NEPv. Implementation details such as block iteration and preconditioning are discussed. Numerical examples, with applications to computing the coercivity constant of boundary integral operators and to solving multicast beamforming problems, show the effectiveness of our approach.
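To make the NEPv structure described in the abstract concrete, the sketch below illustrates the underlying fixed-point idea: at a stationary point x of f(xᴴAx, xᴴBx) on the unit sphere, x is an eigenvector of H(x) = g₁A + g₂B, where (g₁, g₂) is the gradient of f at (xᴴAx, xᴴBx). This is a plain SCF-style iteration written for illustration only, not the sequential subspace search algorithm of the paper; the function name `nepv_scf` and all parameter choices are the editor's assumptions.

```python
import numpy as np

def nepv_scf(A, B, grad_f, x0, tol=1e-12, maxit=500):
    """Illustrative SCF fixed-point iteration for the NEPv
         H(x) x = lam * x,   H(x) = g1*A + g2*B,
    where (g1, g2) = grad_f(x'Ax, x'Bx). At each step, x is replaced by an
    eigenvector for the smallest eigenvalue of the Hermitian matrix H(x).
    (A sketch only -- not the sequential subspace method of the paper.)"""
    x = np.asarray(x0, dtype=complex)
    x /= np.linalg.norm(x)
    for _ in range(maxit):
        y1 = np.vdot(x, A @ x).real          # point (y1, y2) in the joint
        y2 = np.vdot(x, B @ x).real          # numerical range W(A, B)
        g1, g2 = grad_f(y1, y2)
        H = g1 * A + g2 * B                  # Hermitian since g1, g2 are real
        _, V = np.linalg.eigh(H)             # eigenvalues in ascending order
        x_new = V[:, 0]                      # smallest-eigenvalue eigenvector
        if 1.0 - abs(np.vdot(x_new, x)) < tol:   # converged up to phase
            x = x_new
            break
        x = x_new
    return x, (np.vdot(x, A @ x).real, np.vdot(x, B @ x).real)

# Toy usage: minimize f(y1, y2) = y1^2 + y2^2 (squared distance to the
# origin) over W(A, B) for two commuting (diagonal) Hermitian matrices.
A = np.diag([1.0, 2.0, 3.0])
B = np.diag([2.0, 3.0, 2.0])
x, y = nepv_scf(A, B, lambda y1, y2: (2 * y1, 2 * y2), np.ones(3))
```

For this diagonal example, W(A, B) is the convex hull of the points (1, 2), (2, 3), and (3, 2), and the iteration settles at the vertex (1, 2), the point of the hull closest to the origin. Such a basic SCF iteration can stagnate or cycle in general, which is part of the motivation for the globally convergent subspace method developed in the paper.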

Original language: English
Pages (from-to): 1771-1796
Number of pages: 26
Journal: SIAM Journal on Matrix Analysis and Applications
Issue number: 4
State: Published - Nov 17 2020

Bibliographical note

Publisher Copyright:
© 2020 Society for Industrial and Applied Mathematics.


Keywords

  • Eigenvector nonlinearity
  • Nonlinear eigenvalue problem
  • Numerical range
  • Rayleigh quotient
  • Sequential subspace method

ASJC Scopus subject areas

  • Analysis


