Eigensolvers based on polynomials of two variables

Andrew Knyazev

Center for Computational Mathematics, University of Colorado at Denver
P.O. Box 173364, Campus Box 170, Denver, CO 80217-3364
Phone: (303) 556-8442. Fax: (303) 556-8550
WWW: http://www-math.cudenver.edu/~aknyazev
andrew.knyazev@cudenver.edu
 

Abstract

An attempt to use a preconditioner to solve an eigenvalue problem naturally leads to an iterative method based on polynomials of two variables. An approximation to an eigenvector belongs to the corresponding generalized Krylov subspace, generated using two matrices. Using this idea, we give a formal definition of preconditioned eigensolvers. This definition covers most known preconditioned eigensolvers with a fixed preconditioner. Such preconditioned eigensolvers can be used in a matrix-free environment, i.e., when all the matrices of the eigenvalue problem and the preconditioner are available only in the form of functions that compute matrix-vector products for a given vector.
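To make the matrix-free setting concrete, here is a minimal sketch (not from the paper; all names and the diagonal model problem are illustrative) in which the operator and the preconditioner are supplied only as matrix-vector product functions, and a simple preconditioned power-method-like iteration is run with them. Every iterate of such a method is, by construction, a polynomial of the two matrices applied to the initial guess.

```python
import numpy as np

# Matrix-free setting (a sketch): the operator A and the preconditioner T
# are available only as functions computing matrix-vector products.
n = 200
d = np.linspace(1.0, 100.0, n)
A = lambda v: d * v            # model SPD operator (diagonal, for illustration)
T = lambda v: v / d            # model preconditioner (here, the exact inverse)

def preconditioned_power(A, T, x, maxit=100, tol=1e-8):
    """Preconditioned power-method-like iteration (a PINVIT-type sketch) for
    the smallest eigenpair: x_{k+1} = x_k - T(A x_k - rho_k x_k).  Each
    iterate lies in a generalized Krylov subspace generated by A and T."""
    x = x / np.linalg.norm(x)
    for _ in range(maxit):
        rho = x @ A(x)                 # Rayleigh quotient (x is normalized)
        r = A(x) - rho * x             # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        x = x - T(r)                   # preconditioned correction
        x = x / np.linalg.norm(x)
    return rho, x
```

With the exact-inverse preconditioner above, this iteration reduces to inverse iteration; with a weaker preconditioner it still needs only the two matrix-vector product functions.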

We present a survey of some theoretical convergence rate estimates for preconditioned iterative methods for symmetric eigenvalue problems. We consider preconditioned analogs of the power method, the steepest descent/ascent method, Lanczos-type methods, and conjugate gradient methods. In an attempt to find the optimal method in the new class, we describe the Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) Method, based on local optimization of a three-term recurrence.

We discuss the delicate problem of computing accurate two-sided bounds for eigenvalues in a matrix-free environment.

To be able to compare numerically different methods in the class, with different preconditioners, we suggest a common system of model tests, using random preconditioners and initial guesses. As the "ideal" control algorithm for computing the extreme eigenpair, we propose the standard preconditioned conjugate gradient method for finding an eigenvector as an element of the null space of the corresponding homogeneous system of linear equations, under the assumption that the eigenvalue is known. We recommend that every new preconditioned eigensolver be compared with this ideal algorithm on our model test problems in terms of speed of convergence, cost of each iteration, and memory requirements. We provide such a comparison for our LOBPCG Method. Numerical results establish that our algorithm is practically as efficient as the ideal algorithm when the same preconditioner is used in both methods. We also show numerically that the LOBPCG Method provides approximations to the first eigenpairs of about the same quality as those produced by the much more expensive global optimization method on the same generalized block Krylov subspace.
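The ideal control algorithm can be sketched like this (our illustrative implementation, with assumed names): given the known eigenvalue lam, run standard preconditioned CG on the singular but consistent homogeneous system (A - lam I) x = 0. Starting from a nonzero initial guess, the iterates converge to its null-space component, i.e. to an eigenvector.

```python
import numpy as np

def null_space_pcg(Aop, T, lam, x0, tol=1e-10, maxit=200):
    """Sketch of the 'ideal' control algorithm: preconditioned CG applied to
    the homogeneous system (A - lam*I) x = 0, assuming the eigenvalue lam is
    known.  The operator B = A - lam*I is positive semidefinite, and the CG
    iterates converge to the null-space component of x0, an eigenvector."""
    Bop = lambda v: Aop(v) - lam * v
    x = x0.copy()
    r = -Bop(x)                        # residual of the homogeneous system
    z = T(r)                           # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        if np.linalg.norm(r) < tol:
            break
        Bp = Bop(p)
        alpha = rz / (p @ Bp)
        x += alpha * p
        r -= alpha * Bp
        z = T(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p      # standard PCG direction update
        rz = rz_new
    return x / np.linalg.norm(x)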

Finally, direct numerical comparisons with the Jacobi-Davidson method show that our LOBPCG method is more robust and converges almost two times faster.

A MATLAB code of the LOBPCG method and the Preconditioned Eigensolvers Benchmarking are available at http://www-math .cudenver.edu/~aknyazev/software/CG/

The talk is mostly based on the papers by the speaker: