The symmetric invariant subspace decomposition algorithm (SYISDA) for a symmetric matrix $A$ proceeds as follows; a sketch of one complete divide step follows the list.
- Scaling:
- Compute upper and lower bounds on the spectrum of $A$ and compute $\alpha$ and $\beta$ such that for $B = \alpha A + \beta I$ we have $\lambda(B) \subseteq [0,1]$, with the mean eigenvalue $\bar{\lambda}$ of $A$ being mapped to $\frac{1}{2}$. Since the trace of $A$ is the sum of its eigenvalues, the mean eigenvalue can be determined easily.
- Eigenvalue Smoothing:
- Let $p$ be a polynomial mapping $[0,1]$ into $[0,1]$ such that $p(0) = 0$, $p(1) = 1$, and $\lim_{i \to \infty} p^{(i)}(x) \in \{0,1\}$ for $x \neq \frac{1}{2}$ (where $p^{(i)}$ denotes $i$-fold application of $p$); that is, in the limit all values are mapped to either 0 or 1. Iterate $C_0 = B$, $C_{i+1} = p(C_i)$, until $\|C_{i+1} - C_i\|$ is numerically negligible (in iteration $k$, say).
- Invariant Subspace Computation:
- Find an orthogonal basis $Q_1$ for the range space of $C_k$ and a basis $Q_2$ for a complementary orthogonal subspace, i.e. an orthogonal matrix $Q = [Q_1, Q_2]$ with $\operatorname{range}(Q_1) = \operatorname{range}(C_k)$.
- Decoupling:
- Update the original $A$ with $Q = [Q_1, Q_2]$, i.e. form
$$Q^T A Q = \begin{pmatrix} A_1 & 0 \\ 0 & A_2 \end{pmatrix}.$$
Since any matrix polynomial of a symmetric matrix $A$ has the same eigenvectors as $A$, $\operatorname{range}(Q_1)$ and $\operatorname{range}(Q_2)$ are complementary invariant subspaces of $A$, and hence their application to $A$ decouples the spectrum of $A$. The subproblems $A_1 = Q_1^T A Q_1$ and $A_2 = Q_2^T A Q_2$ can now be solved independently and the algorithm applied recursively, with the number of subproblems doubling at every step. If eigenvectors are desired as well, we also update the current eigenvector matrix. Orthogonality of the eigenvectors is guaranteed by the exclusive use of orthogonal transformations.
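The following NumPy sketch strings these four steps together into a single divide step. It is only illustrative, not PRISM's parallel implementation: it assumes Gershgorin disks for the spectral bounds and the smoothing polynomial $p(x) = 3x^2 - 2x^3$, substitutes an SVD for the rank-revealing orthogonal factorization, and the name `syisda_split` is made up.

```python
import numpy as np

def syisda_split(A, tol=1e-12, max_iter=100):
    """One SYISDA divide step: returns Q1, Q2, A1, A2 with
    Q^T A Q = diag(A1, A2) for the orthogonal Q = [Q1, Q2]."""
    n = A.shape[0]

    # Scaling: Gershgorin bounds on lambda(A); the trace gives the mean.
    radii = np.abs(A).sum(axis=1) - np.abs(np.diag(A))
    lo = np.min(np.diag(A) - radii)
    hi = np.max(np.diag(A) + radii)
    mean_eig = np.trace(A) / n
    s = max(hi - mean_eig, mean_eig - lo)
    s = s if s > 0 else 1.0                   # guard against A = c*I
    B = (0.5 / s) * A + (0.5 - 0.5 * mean_eig / s) * np.eye(n)

    # Eigenvalue smoothing: C_{i+1} = p(C_i) with p(x) = 3x^2 - 2x^3.
    C = B
    for _ in range(max_iter):
        C2 = C @ C
        C_new = 3.0 * C2 - 2.0 * (C2 @ C)     # two matrix multiplies per step
        done = np.linalg.norm(C_new - C, 'fro') < tol * n
        C = C_new
        if done:
            break

    # Invariant subspace computation: bases for range(C_k) and its complement.
    # (PRISM uses a rank-revealing orthogonal factorization; an SVD is the
    # simplest serial stand-in.)
    r = int(round(np.trace(C)))               # eigenvalues of C_k are ~0 or ~1
    U, _, _ = np.linalg.svd(C)
    Q1, Q2 = U[:, :r], U[:, r:]

    # Decoupling: the off-diagonal blocks of Q^T A Q vanish up to roundoff.
    return Q1, Q2, Q1.T @ A @ Q1, Q2.T @ A @ Q2
```

Recursing on the returned blocks (and solving sufficiently small ones directly, e.g. with `numpy.linalg.eigh`) yields the full eigendecomposition; accumulating the $Q$ factors along the way gives the eigenvector matrix.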
Note that the PRISM algorithm is able to utilize matrix multiplication by exploiting the mathematical fact that applying a polynomial to a matrix changes its eigenvalues but not its invariant subspaces, i.e., its eigenvectors. By appropriate choice of polynomial, the eigenvalues can be modified to yield a matrix which is significantly easier to solve than the original problem. Specifically, we use the first incomplete Beta function (the lowest-degree polynomial instance being $p(x) = 3x^2 - 2x^3$), which can be used to divide the eigenvalues into two distinct groups. Most of the work in applying a polynomial to a matrix results from taking powers of the matrix, which requires matrix multiplication. This yields a parallel algorithm which is dominated by the efficient matrix multiplication kernel.
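As a quick numerical illustration of this fact (a sketch assuming $p(x) = 3x^2 - 2x^3$, not code from the paper): if $A = V \Lambda V^T$, then $p(A) = V\, p(\Lambda)\, V^T$, so $p(A)V = V\, p(\Lambda)$.

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((6, 6))
A = (S + S.T) / 2                        # random symmetric test matrix

pA = 3 * (A @ A) - 2 * (A @ A @ A)       # p(A) built from matrix multiplies

w, V = np.linalg.eigh(A)                 # A = V diag(w) V^T
print(np.allclose(pA @ V, V @ np.diag(3 * w**2 - 2 * w**3)))   # True
```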
While the SYISDA algorithm can be used to compute a full eigendecomposition,
it is worthwhile to point out certain mathematical features that
distinguish it from other approaches:
- Ordering of Eigenvalues:
- Assuming that $p$ maps all eigenvalues in $[0, \frac{1}{2})$ to 0 and all eigenvalues in $(\frac{1}{2}, 1]$ to 1, one of the diagonal blocks $A_1$, $A_2$ contains all the eigenvalues of $A$ which are smaller than the mean eigenvalue $\bar{\lambda}$, and the other contains the rest. Hence, if one is only interested in eigenvalues in a certain part of the spectrum, one need not further resolve diagonal blocks corresponding to parts of the spectrum which are not of interest.
- Subspaces before Eigenvalues:
- SYISDA is primarily an algorithm that
computes and refines invariant subspaces. If such a subspace has become
one-dimensional, an eigenvalue/eigenvector pair has been found. However, if one is only
interested in finding an orthogonal basis for the subspace spanned by a
certain set of eigenvectors, there is no need to expend the effort to compute
all eigenvalues, and one can terminate when the subspace corresponding to the
desired eigenvalue range has been identified (see the sketch after this list).
- No Problems with Repeated Eigenvalues:
- Clusters of eigenvalues are
quickly gathered in the same subproblem, and are in fact likely to increase
the speed of convergence of the eigenvalue smoothing step. The orthogonality
of eigenvectors is not affected at all by repeated eigenvalues.
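A minimal sketch of this early termination, reusing the hypothetical `syisda_split` from above: a single divide step already delivers an orthogonal basis for the invariant subspace belonging to the below-mean eigenvalues, and the other diagonal block is never refined.

```python
def subspace_below_mean(A):
    """Orthogonal basis for the invariant subspace spanned by the
    eigenvectors of A whose eigenvalues lie below the mean eigenvalue."""
    Q1, Q2, A1, A2 = syisda_split(A)
    # With the scaling used in syisda_split, Q1 spans the above-mean
    # subspace and Q2 the below-mean one; A1 is discarded unresolved.
    return Q2
```

To isolate a narrower slice of the spectrum, one would recurse only into the block whose spectrum still straddles the eigenvalue range of interest.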
Further details of the algorithm are available in PRISM Working Note #12.
