Now we consider the *Karhunen-Loeve Transform (KLT)* (also known as the *Hotelling
Transform* or the *Eigenvector Transform*), which is closely related to
*Principal Component Analysis (PCA)* and is widely used for data analysis in many
fields.

Let $\phi_k$ be the eigenvector corresponding to the $k$th eigenvalue $\lambda_k$
of the covariance matrix $\Sigma_x$, i.e.,

$$\Sigma_x \phi_k = \lambda_k \phi_k \qquad (k = 1, \cdots, N)$$

or in matrix form:

$$\Sigma_x \left[ \begin{array}{c} \phi_{1k} \\ \vdots \\ \phi_{Nk} \end{array} \right]
= \lambda_k \left[ \begin{array}{c} \phi_{1k} \\ \vdots \\ \phi_{Nk} \end{array} \right]
\qquad (k = 1, \cdots, N)$$

As the covariance matrix $\Sigma_x$ is Hermitian (symmetric if $x$ is real), its
eigenvectors $\phi_k$ are orthogonal:

$$\langle \phi_i, \phi_j \rangle = \phi_i^{*T} \phi_j = \delta_{ij}$$

and we can construct a unitary (orthogonal if $x$ is real) matrix

$$\Phi = [\phi_1, \cdots, \phi_N]$$

satisfying

$$\Phi^{*T} \Phi = I, \qquad \mbox{i.e.,} \qquad \Phi^{-1} = \Phi^{*T}$$

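As a quick numerical check (a NumPy sketch, not part of the original text; all
variable names are illustrative assumptions), the eigenvectors that
`numpy.linalg.eigh` returns for a real symmetric covariance matrix are
orthonormal, so the matrix formed by them is orthogonal:

```python
import numpy as np

# Hypothetical example: estimate a symmetric covariance matrix from
# a few random samples of a 4-dimensional signal.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))        # 100 samples, 4 dimensions
Sigma = np.cov(X, rowvar=False)          # 4x4 symmetric covariance matrix

# eigh is specialized for Hermitian/symmetric matrices;
# the columns of Phi are the eigenvectors phi_1, ..., phi_N.
lam, Phi = np.linalg.eigh(Sigma)

# Orthonormality: Phi^T Phi = I, i.e., Phi is orthogonal (real case).
print(np.allclose(Phi.T @ Phi, np.eye(4)))   # True
```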
The eigenequations above can be combined and expressed as:

$$\Sigma_x [\phi_1, \cdots, \phi_N] = [\lambda_1 \phi_1, \cdots, \lambda_N \phi_N]
= [\phi_1, \cdots, \phi_N] \Lambda$$

or in matrix form:

$$\Sigma_x \Phi = \Phi \Lambda$$

Here $\Lambda = \mbox{diag}(\lambda_1, \cdots, \lambda_N)$ is a diagonal matrix.
Left multiplying $\Phi^{*T} = \Phi^{-1}$ on both sides, the covariance matrix
$\Sigma_x$ can be diagonalized:

$$\Phi^{*T} \Sigma_x \Phi = \Phi^{-1} \Sigma_x \Phi = \Lambda$$

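This diagonalization can be verified numerically. The following is a minimal
sketch (assuming NumPy and a covariance matrix estimated from random samples;
none of these names come from the text):

```python
import numpy as np

# Estimate a real symmetric covariance matrix Sigma from random samples.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))        # 200 samples, 3 dimensions
Sigma = np.cov(X, rowvar=False)

# Eigenvalues lam and eigenvector matrix Phi of the symmetric Sigma.
lam, Phi = np.linalg.eigh(Sigma)

# Phi^T Sigma Phi equals the diagonal matrix Lambda = diag(lam),
# since Phi^T = Phi^{-1} for an orthogonal Phi (real case).
Lambda = Phi.T @ Sigma @ Phi
print(np.allclose(Lambda, np.diag(lam)))   # True
```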
Now, given a signal vector $x$, we can define a unitary (orthogonal if $x$ is
real) Karhunen-Loeve Transform of $x$ as:

$$y = \left[ \begin{array}{c} y_1 \\ \vdots \\ y_N \end{array} \right] = \Phi^{*T} x$$

where the $i$th component $y_i$ of the transform vector $y$ is the projection of
$x$ onto the eigenvector $\phi_i$:

$$y_i = \langle x, \phi_i \rangle = \phi_i^{*T} x$$

Left multiplying $\Phi$ on both sides of the transform $y = \Phi^{*T} x$, we get
the inverse transform:

$$x = \Phi y = [\phi_1, \cdots, \phi_N]
\left[ \begin{array}{c} y_1 \\ \vdots \\ y_N \end{array} \right]
= \sum_{i=1}^{N} y_i \phi_i$$

We see that through this transform the signal vector $x$ is now expressed in an
$N$-dimensional space spanned by the $N$ eigenvectors $\phi_i$
($i = 1, \cdots, N$) as the basis vectors of the space.
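The forward and inverse transforms above can be sketched in a few lines of
NumPy for the real case, where $\Phi^{*T}$ reduces to $\Phi^T$ (the setup and
variable names below are assumptions for illustration, not from the text):

```python
import numpy as np

# Estimate the covariance matrix Sigma_x from random samples of a
# 5-dimensional real signal, and get its eigenvector matrix Phi.
rng = np.random.default_rng(2)
X = rng.standard_normal((500, 5))
Sigma = np.cov(X, rowvar=False)
lam, Phi = np.linalg.eigh(Sigma)

x = rng.standard_normal(5)               # a signal vector to transform

y = Phi.T @ x                            # forward KLT: y = Phi^T x
x_rec = Phi @ y                          # inverse KLT: x = Phi y

# The inverse transform recovers x (up to floating-point error), and x is
# expressed as a weighted sum of the eigenvectors, x = sum_i y_i phi_i.
print(np.allclose(x_rec, x))                                     # True
print(np.allclose(sum(y[i] * Phi[:, i] for i in range(5)), x))   # True
```

Each coefficient `y[i]` is the projection of `x` onto the eigenvector
`Phi[:, i]`, matching the component formula for $y_i$ above.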