
Eigenvalues and matrix diagonalization

For any $n\times n$ matrix $\mathbf{A}$, if there exist a non-zero vector $\mathbf{v}$ and a scalar $\lambda$ such that

$$\mathbf{A}\mathbf{v} = \lambda\mathbf{v},$$

then $\lambda$ and $\mathbf{v}$ are called an eigenvalue and an eigenvector of the matrix $\mathbf{A}$, respectively. In other words, the linear transformation of the vector $\mathbf{v}$ by $\mathbf{A}$ only has the effect of scaling $\mathbf{v}$ (by a factor of $\lambda$) along its own direction (a 1-D subspace).

The eigenvector $\mathbf{v}$ is not unique but is determined only up to a scaling factor, i.e., if $\mathbf{v}$ is an eigenvector of $\mathbf{A}$, so is $c\mathbf{v}$ for any constant $c \ne 0$. Typically, for the uniqueness of $\mathbf{v}$, we keep it normalized so that $\|\mathbf{v}\| = 1$.

To obtain the eigenvalues $\lambda$, we rewrite the above equation as

$$(\mathbf{A} - \lambda\mathbf{I})\,\mathbf{v} = \mathbf{0}.$$

For this homogeneous equation system to have non-zero solutions for $\mathbf{v}$, the determinant of its coefficient matrix has to be zero:

$$\det(\mathbf{A} - \lambda\mathbf{I}) = 0.$$

This is the characteristic polynomial equation of the matrix $\mathbf{A}$. Solving this $n$th order equation of $\lambda$, we get $n$ eigenvalues $\lambda_1, \dots, \lambda_n$. Substituting each $\lambda_i$ back into the equation system, we get the corresponding eigenvector $\mathbf{v}_i$. We can put all $n$ eigen-equations together and have

$$\mathbf{A}\mathbf{v}_i = \lambda_i \mathbf{v}_i \qquad (i = 1, \dots, n).$$

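As a numerical illustration, the eigenvalues and eigenvectors can be computed and the eigen-equations verified with NumPy (the matrix below is a small example chosen here for illustration, not one from the text):

```python
import numpy as np

# Example matrix (chosen for illustration only).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose i-th
# column is the corresponding (normalized) eigenvector v_i.
lam, V = np.linalg.eig(A)

# Verify each eigen-equation A v_i = lambda_i v_i.
for i in range(len(lam)):
    assert np.allclose(A @ V[:, i], lam[i] * V[:, i])

# The characteristic polynomial det(A - lambda I) vanishes at each lambda_i.
for l in lam:
    assert abs(np.linalg.det(A - l * np.eye(2))) < 1e-9
```

For this particular matrix the characteristic equation is $\lambda^2 - 4\lambda + 3 = 0$, giving the eigenvalues $1$ and $3$.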
If $\mathbf{A}$ is positive definite, i.e., $\mathbf{x}^*\mathbf{A}\mathbf{x} > 0$ for any vector $\mathbf{x} \ne \mathbf{0}$, then all eigenvalues are positive.

Defining the eigenvalue matrix $\mathbf{\Lambda}$ (a diagonal matrix) and the eigenvector matrix $\mathbf{V}$ as

$$\mathbf{\Lambda} = \mathrm{diag}(\lambda_1, \dots, \lambda_n), \qquad \mathbf{V} = [\mathbf{v}_1, \dots, \mathbf{v}_n],$$

we can write the $n$ eigen-equations in more compact forms:

$$\mathbf{A}\mathbf{V} = \mathbf{V}\mathbf{\Lambda}, \qquad \mathbf{V}^{-1}\mathbf{A}\mathbf{V} = \mathbf{\Lambda}, \qquad \mathbf{A} = \mathbf{V}\mathbf{\Lambda}\mathbf{V}^{-1}.$$

We see that $\mathbf{A}$ can be diagonalized by its eigenvector matrix $\mathbf{V}$, composed of all its eigenvectors, into a diagonal matrix $\mathbf{\Lambda}$ composed of its eigenvalues $\lambda_i$ (assuming the $n$ eigenvectors are linearly independent, so that $\mathbf{V}^{-1}$ exists).

We further have

$$\mathbf{A}^2 = (\mathbf{V}\mathbf{\Lambda}\mathbf{V}^{-1})(\mathbf{V}\mathbf{\Lambda}\mathbf{V}^{-1}) = \mathbf{V}\mathbf{\Lambda}^2\mathbf{V}^{-1},$$

and in general, for any positive integer $n$,

$$\mathbf{A}^n = \mathbf{V}\mathbf{\Lambda}^n\mathbf{V}^{-1}.$$

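This identity is easy to confirm numerically (a NumPy sketch; the matrix is an arbitrary diagonalizable example, not one from the text):

```python
import numpy as np

# Arbitrary diagonalizable example (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, V = np.linalg.eig(A)

# A^3 computed through the eigen-decomposition V Lambda^3 V^{-1} ...
A3_eig = V @ np.diag(lam**3) @ np.linalg.inv(V)

# ... matches direct repeated matrix multiplication.
assert np.allclose(A3_eig, np.linalg.matrix_power(A, 3))
```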
Assuming $\mathbf{A}\mathbf{v} = \lambda\mathbf{v}$, we have the following:

• $\mathbf{A}^T$ has the same eigenvalues as $\mathbf{A}$ (though in general not the same eigenvectors).

Proof: As a matrix and its transpose have the same determinant, they have the same characteristic polynomial:

$$\det(\mathbf{A}^T - \lambda\mathbf{I}) = \det\left((\mathbf{A} - \lambda\mathbf{I})^T\right) = \det(\mathbf{A} - \lambda\mathbf{I}),$$

therefore they have the same eigenvalues.

• The eigenvalues and eigenvectors of the complex conjugate matrix $\bar{\mathbf{A}}$ are the complex conjugates of the eigenvalues and eigenvectors of $\mathbf{A}$: taking the conjugate of $\mathbf{A}\mathbf{v} = \lambda\mathbf{v}$ gives $\bar{\mathbf{A}}\bar{\mathbf{v}} = \bar{\lambda}\bar{\mathbf{v}}$.

• $\mathbf{A}^{-1}$ has the same eigenvectors as $\mathbf{A}$, but its eigenvalues are $1/\lambda$.

Proof: Left-multiplying $\mathbf{A}^{-1}$ on both sides of $\mathbf{A}\mathbf{v} = \lambda\mathbf{v}$, we get $\mathbf{v} = \lambda\,\mathbf{A}^{-1}\mathbf{v}$, i.e.,

$$\mathbf{A}^{-1}\mathbf{v} = \frac{1}{\lambda}\,\mathbf{v}.$$

• $\mathbf{A}^n$ has the same eigenvectors as $\mathbf{A}$, but its eigenvalues are $\lambda^n$, where $n$ is a positive integer.

Proof:

$$\mathbf{A}^2\mathbf{v} = \mathbf{A}(\lambda\mathbf{v}) = \lambda\,\mathbf{A}\mathbf{v} = \lambda^2\mathbf{v},$$

and by induction $\mathbf{A}^n\mathbf{v} = \lambda^n\mathbf{v}$ for any positive integer $n$.

This result can be generalized to any polynomial $f$:

$$f(\mathbf{A})\,\mathbf{v} = f(\lambda)\,\mathbf{v}.$$

• In particular, when $\mathbf{A}^{-1} = \mathbf{A}^*$, i.e., $\mathbf{A}$ is unitary, the eigenvalues of $\mathbf{A}$ satisfy $|\lambda| = 1$.

Proof: Left-multiplying $\mathbf{v}^*\mathbf{A}^* = (\mathbf{A}\mathbf{v})^* = \bar{\lambda}\mathbf{v}^*$ on both sides of $\mathbf{A}\mathbf{v} = \lambda\mathbf{v}$, we get

$$\mathbf{v}^*\mathbf{A}^*\mathbf{A}\mathbf{v} = \bar{\lambda}\lambda\,\mathbf{v}^*\mathbf{v} = |\lambda|^2\,\mathbf{v}^*\mathbf{v}.$$

As $\mathbf{A}^*\mathbf{A} = \mathbf{I}$, the left-hand side equals $\mathbf{v}^*\mathbf{v}$. Dividing both sides by $\mathbf{v}^*\mathbf{v} > 0$, we get

$$|\lambda|^2 = 1, \qquad \text{i.e.,} \quad |\lambda| = 1.$$

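For instance, a 2-D rotation matrix is real orthogonal (hence unitary), so its eigenvalues must lie on the unit circle (a NumPy sketch; the angle is arbitrary and chosen here for illustration):

```python
import numpy as np

# A real orthogonal (hence unitary) matrix: rotation by an angle theta.
theta = 0.7  # arbitrary angle for illustration
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))  # Q* Q = I

# Its eigenvalues are the conjugate pair e^{+j theta}, e^{-j theta},
# both of modulus 1.
lam = np.linalg.eigvals(Q)
assert np.allclose(np.abs(lam), 1.0)
```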
• The eigenvalues of a matrix $\mathbf{A}$ are invariant under any unitary transform $\mathbf{A}' = \mathbf{U}^*\mathbf{A}\mathbf{U}$, where $\mathbf{U}$ is unitary, i.e., $\mathbf{U}^*\mathbf{U} = \mathbf{I}$, or $\mathbf{U}^{-1} = \mathbf{U}^*$.

Proof:

Let $\mathbf{\Lambda}$ and $\mathbf{V}$ be the eigenvalue and eigenvector matrices of a square matrix $\mathbf{A}$:

$$\mathbf{A}\mathbf{V} = \mathbf{V}\mathbf{\Lambda},$$

and $\mathbf{\Lambda}'$ and $\mathbf{V}'$ be the eigenvalue and eigenvector matrices of $\mathbf{A}' = \mathbf{U}^*\mathbf{A}\mathbf{U}$, a unitary transform of $\mathbf{A}$:

$$\mathbf{A}'\mathbf{V}' = \mathbf{V}'\mathbf{\Lambda}'.$$

Left-multiplying $\mathbf{U}^*$ on both sides of $\mathbf{A}\mathbf{V} = \mathbf{V}\mathbf{\Lambda}$ and inserting $\mathbf{U}\mathbf{U}^* = \mathbf{I}$, we get the eigen-equation of $\mathbf{A}'$:

$$(\mathbf{U}^*\mathbf{A}\mathbf{U})(\mathbf{U}^*\mathbf{V}) = (\mathbf{U}^*\mathbf{V})\,\mathbf{\Lambda}, \qquad \text{i.e.,} \quad \mathbf{A}'\,(\mathbf{U}^*\mathbf{V}) = (\mathbf{U}^*\mathbf{V})\,\mathbf{\Lambda}.$$

We see that $\mathbf{A}$ and $\mathbf{A}'$ have the same eigenvalues, $\mathbf{\Lambda}' = \mathbf{\Lambda}$, and their eigenvector matrices are related by $\mathbf{V}' = \mathbf{U}^*\mathbf{V}$ or $\mathbf{V} = \mathbf{U}\mathbf{V}'$.
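This invariance can be checked numerically (a NumPy sketch using a random matrix and an orthogonal $\mathbf{U}$ obtained from a QR factorization; all choices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# Build a unitary (here real orthogonal) U via a QR factorization.
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))
assert np.allclose(U.T @ U, np.eye(3))

# A' = U* A U has the same eigenvalues as A.
Ap = U.conj().T @ A @ U
lam  = np.linalg.eigvals(A)
lamp = np.linalg.eigvals(Ap)

# Compare the spectra (sorted real and imaginary parts separately,
# to avoid ordering ambiguity in conjugate pairs).
assert np.allclose(np.sort(lam.real), np.sort(lamp.real))
assert np.allclose(np.sort(lam.imag), np.sort(lamp.imag))
```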

• Given all $n$ eigenvalues $\lambda_i$ of a matrix $\mathbf{A}$, its trace and determinant can be obtained as

$$\mathrm{tr}(\mathbf{A}) = \sum_{i=1}^n \lambda_i, \qquad \det(\mathbf{A}) = \prod_{i=1}^n \lambda_i.$$

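Both identities are easy to confirm numerically (NumPy, with an arbitrary example matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
lam = np.linalg.eigvals(A)

assert np.isclose(np.trace(A), lam.sum())        # tr(A) = sum of eigenvalues
assert np.isclose(np.linalg.det(A), lam.prod())  # det(A) = product of eigenvalues
```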
• The spectrum of an $n \times n$ square matrix $\mathbf{A}$ is the set of its $n$ eigenvalues $\{\lambda_1, \dots, \lambda_n\}$. The spectral radius of $\mathbf{A}$, denoted by $\rho(\mathbf{A})$, is the maximum of the absolute values of the elements of its spectrum:

$$\rho(\mathbf{A}) = \max_i |\lambda_i|,$$

where $|z| = \sqrt{x^2 + y^2}$ is the modulus of a complex number $z = x + jy$. If all eigenvalues are sorted such that $|\lambda_1| \ge |\lambda_2| \ge \cdots \ge |\lambda_n|$, then $\rho(\mathbf{A}) = |\lambda_1|$. As the eigenvalues of $\mathbf{A}^{-1}$ are $1/\lambda_i$, $\rho(\mathbf{A}^{-1}) = 1/|\lambda_n| = 1/\min_i |\lambda_i|$.
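A short NumPy sketch of the spectral radius and of the relation $\rho(\mathbf{A}^{-1}) = 1/\min_i|\lambda_i|$ (the example matrix is chosen here for illustration):

```python
import numpy as np

# Example with eigenvalues -1 and -2.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
lam = np.linalg.eigvals(A)

rho = np.max(np.abs(lam))  # spectral radius rho(A)

# Eigenvalues of A^{-1} are 1/lambda_i, so rho(A^{-1}) = 1/min|lambda_i|.
rho_inv = np.max(np.abs(np.linalg.eigvals(np.linalg.inv(A))))
assert np.isclose(rho_inv, 1.0 / np.min(np.abs(lam)))
```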

• If $\mathbf{A}$ is Hermitian, $\mathbf{A}^* = \mathbf{A}$ (symmetric if real, e.g., the covariance matrix of a random vector), then
1. all of its eigenvalues are real, and
2. all of its eigenvectors are (or can be chosen to be) orthogonal.

Proof:

Let $\lambda$ and $\mathbf{v}$ be an eigenvalue of a Hermitian matrix $\mathbf{A}$ and the corresponding eigenvector satisfying $\mathbf{A}\mathbf{v} = \lambda\mathbf{v}$; then we have

$$\mathbf{v}^*\mathbf{A}\mathbf{v} = \lambda\,\mathbf{v}^*\mathbf{v}.$$

On the other hand, as $\mathbf{A}^* = \mathbf{A}$, we also have

$$\mathbf{v}^*\mathbf{A}\mathbf{v} = \mathbf{v}^*\mathbf{A}^*\mathbf{v} = (\mathbf{A}\mathbf{v})^*\mathbf{v} = \bar{\lambda}\,\mathbf{v}^*\mathbf{v},$$

therefore $\lambda = \bar{\lambda}$, i.e., $\lambda$ is real.

To show the eigenvectors are orthogonal, consider

$$\mathbf{v}_j^*\mathbf{A}\mathbf{v}_i = \lambda_i\,\mathbf{v}_j^*\mathbf{v}_i;$$

similarly, we also have

$$\mathbf{v}_j^*\mathbf{A}^*\mathbf{v}_i = (\mathbf{A}\mathbf{v}_j)^*\mathbf{v}_i = \lambda_j\,\mathbf{v}_j^*\mathbf{v}_i$$

(as $\lambda_j$ is real). But the left-hand sides of the two equations above are the same:

$$\mathbf{v}_j^*\mathbf{A}\mathbf{v}_i = \mathbf{v}_j^*\mathbf{A}^*\mathbf{v}_i,$$

therefore the difference of their right-hand sides must be zero:

$$(\lambda_i - \lambda_j)\,\mathbf{v}_j^*\mathbf{v}_i = 0.$$

If $\lambda_i \ne \lambda_j$, we get $\mathbf{v}_j^*\mathbf{v}_i = 0$, i.e., the eigenvectors corresponding to different eigenvalues are orthogonal.

Q.E.D.
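Both properties can be confirmed numerically (a NumPy sketch with a small Hermitian example; `np.linalg.eigh` is NumPy's routine specialized for Hermitian matrices):

```python
import numpy as np

# A Hermitian matrix: equal to its own conjugate transpose.
A = np.array([[2.0, 1.0 + 1.0j],
              [1.0 - 1.0j, 3.0]])
assert np.allclose(A, A.conj().T)

# eigh is specialized for Hermitian (real symmetric) matrices; it
# returns real eigenvalues in ascending order and orthonormal eigenvectors.
lam, V = np.linalg.eigh(A)

assert np.allclose(lam.imag, 0.0)              # eigenvalues are real
assert np.allclose(V.conj().T @ V, np.eye(2))  # eigenvectors orthonormal
```

For this matrix the characteristic equation is $\lambda^2 - 5\lambda + 4 = 0$, so the eigenvalues are $1$ and $4$, both real.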

When all eigenvectors are normalized, $\|\mathbf{v}_i\| = 1$, they become orthonormal:

$$\mathbf{v}_i^*\mathbf{v}_j = \delta_{ij} = \begin{cases} 1 & i = j \\ 0 & i \ne j, \end{cases}$$

i.e., the eigenvector matrix $\mathbf{V}$ is unitary (orthogonal if $\mathbf{A}$ is real):

$$\mathbf{V}^*\mathbf{V} = \mathbf{I}, \qquad \text{i.e.,} \quad \mathbf{V}^{-1} = \mathbf{V}^*,$$

and we have

$$\mathbf{V}^*\mathbf{A}\mathbf{V} = \mathbf{\Lambda}.$$

Left- and right-multiplying by $\mathbf{V}$ and $\mathbf{V}^*$ respectively on the two sides, we get

$$\mathbf{A} = \mathbf{V}\mathbf{\Lambda}\mathbf{V}^* = \sum_{i=1}^n \lambda_i\,\mathbf{v}_i\mathbf{v}_i^*.$$

We see that $\mathbf{A}$ can be written as a linear combination of the $n$ rank-1 matrices $\mathbf{v}_i\mathbf{v}_i^*$, weighted by the eigenvalues $\lambda_i$ ($i = 1, \dots, n$).
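This spectral decomposition can be verified numerically (NumPy, with a small symmetric example chosen for illustration):

```python
import numpy as np

# Real symmetric (hence Hermitian) example.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, V = np.linalg.eigh(A)  # orthonormal eigenvectors in the columns of V

# Rebuild A as the weighted sum of the rank-1 matrices v_i v_i^*.
A_rebuilt = sum(lam[i] * np.outer(V[:, i], V[:, i].conj())
                for i in range(len(lam)))
assert np.allclose(A, A_rebuilt)

# Equivalently, A = V Lambda V*.
assert np.allclose(A, V @ np.diag(lam) @ V.conj().T)
```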

Ruye Wang 2015-04-27