Y. Zhou, "Eigenvalue Computation from the Optimization Perspective: On Jacobi-Davidson, IIGD, RQI, and Newton Updates," Preprint ANL/MCS-P1074-0803, August 2003.
We discuss the close connection between eigenvalue computation and optimization using the Newton method and subspace methods. From this connection we derive a new class of Newton updates. The new update formulation is similar to the well-known Jacobi-Davidson method. This similarity leads to simplified versions of the Jacobi-Davidson method and of the inverse iteration generalized Davidson (IIGD) method. We prove that the projection subspace, augmented by the updating direction from each of these methods, contains the Rayleigh quotient iteration (RQI) direction. Hence, the locally quadratic (cubic for normal matrices) convergence rate of RQI is retained and strengthened by the subspace methods. The theory is supported by extensive numerical results. Preconditioned formulations are also briefly discussed for large-scale eigenvalue problems.
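For readers unfamiliar with the baseline method the abstract builds on, the following is a minimal sketch of Rayleigh quotient iteration in Python/NumPy (not code from the paper; function and variable names are illustrative). Each step recomputes the Rayleigh quotient as the shift and solves a shifted linear system, which yields the locally quadratic (cubic for normal matrices) convergence the abstract refers to.

```python
import numpy as np

def rqi(A, v0, tol=1e-12, maxit=50):
    """Rayleigh quotient iteration: repeatedly solve (A - rho*I) w = v,
    where rho = v^T A v is the current Rayleigh quotient (illustrative sketch)."""
    v = v0 / np.linalg.norm(v0)
    rho = v @ A @ v
    for _ in range(maxit):
        rho = v @ A @ v                      # Rayleigh quotient of current iterate
        try:
            w = np.linalg.solve(A - rho * np.eye(len(v)), v)
        except np.linalg.LinAlgError:
            break                            # shift hit an eigenvalue exactly: converged
        v = w / np.linalg.norm(w)
        if np.linalg.norm(A @ v - rho * v) < tol:
            break
    return rho, v
```

The subspace methods discussed in the paper (Jacobi-Davidson, IIGD) do not apply this correction directly but collect the update directions into a growing projection subspace, which is why they can retain RQI's local convergence rate.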