This paper is concerned with subspace acceleration techniques for computing the Crawford number, that is, the distance between zero and the numerical range of a matrix A. Our approach is based on an eigenvalue optimization characterization of the Crawford number. We establish local convergence of order 1 + √2 ≈ 2.4 for an existing subspace method applied to this and other eigenvalue optimization problems involving a Hermitian matrix that depends analytically on one parameter. For the particular case of the Crawford number, we show that the relevant part of the objective function is strongly concave. In turn, this enables us to develop a subspace method that uses only three-dimensional subspaces but still achieves global convergence and local convergence that is at least quadratic. A number of numerical experiments confirm our theoretical results and reveal that the established convergence orders appear to be tight.
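The eigenvalue optimization characterization mentioned in the abstract can be sketched as follows. Writing A = H + iK with Hermitian H = (A + A*)/2 and K = (A − A*)/(2i), the Crawford number equals max over θ of λ_min(H cos θ + K sin θ) when that maximum is positive, and is 0 otherwise (zero then lies in the numerical range). The minimal sketch below simply samples θ on a grid; it is a brute-force stand-in for the subspace methods the paper develops, and the function name and grid size are illustrative choices, not the paper's algorithm.

```python
import numpy as np

def crawford_number(A, n_theta=2000):
    """Estimate the Crawford number of a square matrix A, i.e. the
    distance from zero to its numerical range.

    Uses the characterization: for A = H + iK with Hermitian
    H = (A + A*)/2 and K = (A - A*)/(2i), the Crawford number is
    max_theta lambda_min(H cos(theta) + K sin(theta)) if positive,
    and 0 otherwise. Here theta is scanned on a uniform grid, which
    the paper's subspace acceleration replaces with far fewer
    eigenvalue computations.
    """
    H = (A + A.conj().T) / 2
    K = (A - A.conj().T) / (2j)
    best = -np.inf
    for t in np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False):
        # eigvalsh returns eigenvalues in ascending order; [0] is the smallest
        lam_min = np.linalg.eigvalsh(np.cos(t) * H + np.sin(t) * K)[0]
        best = max(best, lam_min)
    # A negative maximum means zero lies inside the numerical range
    return max(best, 0.0)

# Example: the numerical range of diag(1, 2) is the segment [1, 2],
# so its distance to zero, the Crawford number, is 1.
A = np.diag([1.0, 2.0])
print(crawford_number(A))
```

For diag(−1, 1) the numerical range is [−1, 1], which contains zero, so the estimate returns 0. The grid scan costs one Hermitian eigenvalue problem per sample, which is what motivates the subspace acceleration studied in the paper.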
Number of pages: 22
Journal: SIAM Journal on Matrix Analysis and Applications
State: Published - 2018
Bibliographical note / Funding Information:
∗Received by the editors April 26, 2017; accepted for publication (in revised form) November 28, 2017; published electronically May 24, 2018. http://www.siam.org/journals/simax/39-2/M112754.html
Funding: The work of the second author was supported by the SNSF research project Robust and Fast Solvers for Computing Extremal Quantities of Structured Pseudospectra.
†Institute of Mathematics, EPFL, Lausanne CH-1015, Switzerland (email@example.com).
‡Department of Mathematics, University of Geneva, Geneva 1211, Switzerland (Ding.Lu@unige.ch, Bart.Vandereycken@unige.ch).
© 2018 Society for Industrial and Applied Mathematics.
- Coercivity constant
- Complex approximation
- Convergence analysis
- Crawford number
- Eigenvalue optimization
- Subspace acceleration