
Unitary and Orthogonal Transforms

A square matrix $A=[A_1\;A_2\;\cdots\;A_n]$, where $A_i$ denotes the ith column vector of $A$, is unitary if its inverse equals its conjugate transpose, i.e., $A^{-1}=A^{*T}$. In particular, if a unitary matrix is real, $A=A^*$, then $A^{-1}=A^T$ and the matrix is orthogonal. Both the column vectors and the row vectors ($A_i,\;i=1,\cdots,n$) of a unitary or orthogonal matrix are orthogonal (perpendicular to each other) and normalized (of unit length), i.e., orthonormal; their inner products satisfy:

\begin{displaymath}(A_i,A_j)=A_i^{*T} A_j=\delta_{i,j}=\left\{ \begin{array}{ll}
1 & \mbox{if } i=j \\
0 & \mbox{otherwise} \end{array} \right.
\end{displaymath}

These $n$ orthonormal vectors can be used as the basis vectors of the n-dimensional vector space.
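As a quick numerical sketch of this orthonormality condition (assuming Python with numpy; the normalized 4-point DFT matrix is used here only as a convenient example of a unitary matrix), the inner products of all column pairs can be collected into $A^{*T}A$, which should equal the identity matrix:

    import numpy as np

    n = 4
    # Example unitary matrix: the normalized n-point DFT matrix,
    # with entries exp(-2*pi*i*j*k/n)/sqrt(n)
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    A = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

    # (A_i, A_j) = A_i^{*T} A_j for all pairs of columns, collected as A^{*T} A
    G = A.conj().T @ A
    print(np.allclose(G, np.eye(n)))    # True: the columns are orthonormal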

Any unitary (orthogonal) matrix $A$ can define a unitary (orthogonal) transform of a vector $X=[x_1,\cdots,x_n]^T$:


\begin{displaymath}\left\{ \begin{array}{ll}
Y=A^{*T}X=A^{-1} X & \mbox{(forwa...
...\\
\par
X=AY & \mbox{(inverse transform)}
\end{array} \right. \end{displaymath}

or, written out explicitly in matrix form:

\begin{displaymath}\left\{ \begin{array}{ll}
\left[ \begin{array}{c} y_1 \\ y_2 \\ \vdots \\ y_n \end{array} \right]
=\left[ \begin{array}{c} A_1^{*T} \\ A_2^{*T} \\ \vdots \\ A_n^{*T} \end{array} \right] X
& \mbox{(forward transform)} \\
X=[A_1\;A_2\;\cdots\;A_n]
\left[ \begin{array}{c} y_1 \\ y_2 \\ \vdots \\ y_n \end{array} \right]
=\sum_{i=1}^n y_i \; A_i & \mbox{(inverse transform)}
\end{array} \right. \end{displaymath}
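A minimal sketch of this forward/inverse pair (again assuming numpy, and reusing the same example unitary matrix; the signal vector chosen here is arbitrary):

    import numpy as np

    n = 4
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    A = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)   # example unitary matrix (normalized DFT)

    X = np.array([1.0, 2.0, 0.0, -1.0])   # an arbitrary signal vector

    Y = A.conj().T @ X                    # forward transform: Y = A^{*T} X = A^{-1} X
    X_rec = A @ Y                         # inverse transform: X = A Y
    print(np.allclose(X_rec, X))          # True: the inverse transform recovers X exactly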

The first equation above is the forward transform and can be written in component form as:

\begin{displaymath}y_i=A_i^{*T}X=(A_i,X)=\sum_{j=1}^n a^*_{i,j} x_j \end{displaymath}

This is the transform coefficient representing the projection of the vector $X$ onto the ith column vector $A_i$ of the transform matrix $A$ (the ith basis vector of the n-dimensional space).

[Figure: projection0.gif]

The second equation is the inverse transform and can be written in component form as:

\begin{displaymath}x_j=\sum_{i=1}^n a_{j,i} y_i \end{displaymath}

This equation represents the signal vector $X$ as a linear combination (weighted sum) of the $n$ column vectors $A_1,A_2,\cdots,A_n$ of the transform matrix $A$. Geometrically, $X$ is a vector in the n-dimensional space spanned by the $n$ orthonormal basis vectors, and each coefficient (coordinate) $y_i$ is the projection of $X$ onto the corresponding basis vector $A_i$.

[Figure: unitary_transform_1.gif]
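The two component forms above can be illustrated with a small geometric sketch (assuming numpy; a 2D rotation matrix is used as the orthogonal matrix and the vector $X$ is arbitrary): each coefficient is the projection of $X$ onto a rotated basis vector, and the weighted sum of the basis vectors recovers $X$:

    import numpy as np

    theta = np.pi / 6
    A = np.array([[np.cos(theta), -np.sin(theta)],     # columns A_1, A_2 form a rotated
                  [np.sin(theta),  np.cos(theta)]])    # orthonormal basis of the plane
    X = np.array([2.0, 1.0])

    # forward transform in component form: y_i = (A_i, X), the projection of X onto A_i
    y = np.array([A[:, i] @ X for i in range(2)])
    print(np.allclose(y, A.T @ X))                     # agrees with the matrix form Y = A^T X

    # inverse transform: X as a linear combination of the basis vectors weighted by y_i
    X_rec = y[0] * A[:, 0] + y[1] * A[:, 1]
    print(np.allclose(X_rec, X))                       # True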

As the n-dimensional space can be spanned by the column vectors of any $n \times n$ unitary (orthogonal) matrix, a vector $X$ in the space can be represented in terms of any such matrix, each defining a different transform.

Examples:

[Figure: unitary_transform_2.gif]
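The worked examples are in the figure referenced above; as one additional illustrative example (not taken from that figure), the 2-point Hadamard basis is a real orthogonal matrix whose transform produces a scaled "sum" and "difference" of the signal components:

    import numpy as np

    # A 2-point Hadamard (Walsh) basis: a real orthogonal matrix
    A = np.array([[1.0,  1.0],
                  [1.0, -1.0]]) / np.sqrt(2)
    X = np.array([3.0, 1.0])

    Y = A.T @ X                    # forward transform (A is real, so A^{*T} = A^T)
    print(Y)                       # [2.828..., 1.414...]: scaled sum and difference of x_1, x_2
    print(np.allclose(A @ Y, X))   # inverse transform recovers X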

An orthogonal (unitary) transform $Y=A^T X$ (where $A^T=A^{-1}$) can be interpreted geometrically as a rotation of the vector $X$ about the origin, or equivalently, as the representation of the same vector in a rotated coordinate system. Such a transform does not change the length of the vector:

\begin{displaymath}\vert Y\vert^2= Y^T Y=(A^{T}X)^{T}(A^{T}X)=X^{T}AA^{T}X=X^{T}X=\vert X\vert^2 \end{displaymath}

as $AA^{T}=AA^{-1}=I$. This is Parseval's relation. If $X$ is interpreted as a signal, its length $\vert X\vert^2=\vert Y\vert^2$ represents the total energy or information contained in the signal, which is preserved under any unitary transform.
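A short numerical check of this energy preservation (assuming numpy; an orthogonal matrix is generated here from the QR factorization of a random matrix, which is just one convenient way to obtain one):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    A, _ = np.linalg.qr(rng.standard_normal((n, n)))   # Q factor: an n x n orthogonal matrix
    X = rng.standard_normal(n)

    Y = A.T @ X
    print(np.allclose(Y @ Y, X @ X))   # True: |Y|^2 = |X|^2 (Parseval's relation)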


Ruye Wang 2003-11-17