In other words, \(\Lambda( A ) = \Lambda( A_{00} ) \cup
\Lambda( A_{11} ) \) and eigenvectors of \(A \) can be easily constructed from eigenvectors of \(A_{00} \) and \(A_{11} \text{.}\)
This insight allows us to deflate a matrix when strategically placed zeroes (or, rather, acceptably small entries) appear as part of the QR algorithm. Let us continue to focus on the Hermitian eigenvalue problem.
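To see this concretely, here is a minimal MATLAB sketch (the block sizes m0 and m1 and the randomly generated Hermitian blocks are made up for illustration) that checks both claims for a block diagonal Hermitian matrix: the spectrum of \(A \) is the union of the spectra of the blocks, and an eigenvector of \(A_{00} \text{,}\) padded with zeros, is an eigenvector of \(A \text{.}\)

% Two arbitrary Hermitian blocks (sizes chosen only for illustration).
m0 = 4;  m1 = 3;
B0 = randn( m0 );  A00 = ( B0 + B0' ) / 2;
B1 = randn( m1 );  A11 = ( B1 + B1' ) / 2;
A  = [ A00, zeros( m0, m1 ); zeros( m1, m0 ), A11 ];
% The spectra of the blocks together form the spectrum of A.
disp( norm( sort( [ eig( A00 ); eig( A11 ) ] ) - sort( eig( A ) ) ) )
% An eigenvector of A00, padded with zeros, is an eigenvector of A.
[ X0, D0 ] = eig( A00 );
x = [ X0( :, 1 ); zeros( m1, 1 ) ];
disp( norm( A * x - D0( 1, 1 ) * x ) )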
Homework 10.2.3.1.
Let \(A \in \Cmxm \) be a Hermitian matrix and \(V \in \Cmxm \) be a unitary matrix such that
\begin{equation*}
V^H A V = \left( \begin{array}{c c} A_{00} & 0 \\ 0 & A_{11} \end{array} \right).
\end{equation*}
If \(V_{00} \) and \(V_{11} \) are unitary matrices such that \(V_{00}^H A_{00} V_{00} = \Lambda_0 \) and \(V_{11}^H A_{11} V_{11} = \Lambda_1 \) are both diagonal, show that
\begin{equation*}
\left( V \left( \begin{array}{c c} V_{00} & 0 \\ 0 & V_{11} \end{array} \right) \right)^H A
\left( V \left( \begin{array}{c c} V_{00} & 0 \\ 0 & V_{11} \end{array} \right) \right) =
\left( \begin{array}{c c} \Lambda_0 & 0 \\ 0 & \Lambda_1 \end{array} \right).
\end{equation*}
The point of this last exercise is that if at some point the QR algorithm yields a block diagonal matrix, then the algorithm can proceed to find the spectral decompositions of the blocks on the diagonal, updating the matrix, \(V \text{,}\) in which the eigenvectors are accumulated.
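In MATLAB terms, this bookkeeping might look as follows. This is only a sketch under the assumptions of the last exercise: the blocks and the unitary matrix with \(V^H A V \) block diagonal are assumed to be available as A00, A11, and V, and MATLAB's blkdiag and eig are used to form the block diagonal eigenvector matrix.

% Diagonalize the blocks separately ...
[ V00, Lambda0 ] = eig( A00 );
[ V11, Lambda1 ] = eig( A11 );
% ... and accumulate their eigenvectors into V, so that the columns of the
% updated V are eigenvectors of A and V' * A * V = blkdiag( Lambda0, Lambda1 ).
V = V * blkdiag( V00, V11 );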
Now, since it is the last column of \(V^{(k)} \) that converges fastest to an eigenvector, eventually we expect \(A^{(k)} \) computed as part of the QR algorithm to be of the form
\begin{equation*}
A^{(k)} = \left( \begin{array}{c c} A_{00}^{(k)} & a_{01}^{(k)} \\ ( a_{01}^{(k)} )^H & \alpha_{11}^{(k)} \end{array} \right),
\end{equation*}
where the entries of \(a_{01}^{(k)} \) are small in magnitude.
The idea is that if the magnitudes of the off-diagonal elements of the last row are small relative to the eigenvalues, then they can be considered to be zero. The sum of the absolute values of the diagonal elements gives an estimate of the sizes of the eigenvalues. We will refine this criterion later.
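In terms of the partitioning above, one plausible way to make this precise (the choice of norm and of the tolerance \(\epsilon_{\rm mach} \text{,}\) the machine epsilon, are assumptions here, not the refined criterion developed later) is to deflate when
\begin{equation*}
\| a_{01}^{(k)} \|_1 \leq \epsilon_{\rm mach} \sum_{i=0}^{m-1} \vert \alpha_{i,i}^{(k)} \vert,
\end{equation*}
where \(\alpha_{i,i}^{(k)} \) denotes the \(i \)th diagonal element of \(A^{(k)} \text{.}\)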
Homework 10.2.3.2.
Copy SimpleShiftedQRAlg.m into SimpleShiftedQRAlgWithDeflation.m and modify it to add deflation.
function [ Ak, V ] = SimpleShiftedQRAlgWithDeflation( A, maxits, illustrate, delay )
It is possible that deflation can happen anywhere in the matrix and one should check for that. However, it is most likely to happen in the last row and column of the active part of the matrix.
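As a rough guide, and not the reference solution to this exercise, the iteration might be structured as sketched below. The function name SimpleShiftedQRAlgWithDeflationSketch is hypothetical, the illustrate and delay parameters of the given interface are omitted for brevity, and the choice of shift and of the eps-based deflation tolerance are assumptions.

function [ Ak, V ] = SimpleShiftedQRAlgWithDeflationSketch( A, maxits )
  % Shifted QR algorithm for Hermitian A with simple deflation: only the
  % last row/column of the active part Ak( 1:iend, 1:iend ) is checked.
  [ m, ~ ] = size( A );
  Ak = A;  V = eye( m );  iend = m;
  for k = 1:maxits
    if iend == 1, break; end
    mu = Ak( iend, iend );                      % shift: last diagonal element
    [ Q, R ] = qr( Ak( 1:iend, 1:iend ) - mu * eye( iend ) );
    Ak( 1:iend, 1:iend ) = R * Q + mu * eye( iend );
    V( :, 1:iend ) = V( :, 1:iend ) * Q;        % accumulate eigenvectors
    % Deflate when the off-diagonal part of the last active row is negligible
    % relative to the estimate of the eigenvalue sizes discussed above.
    if norm( Ak( iend, 1:iend-1 ) ) <= eps * sum( abs( diag( Ak( 1:iend, 1:iend ) ) ) )
      Ak( iend, 1:iend-1 ) = 0;
      Ak( 1:iend-1, iend ) = 0;
      iend = iend - 1;                          % shrink the active part
    end
  end
end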