Next: Jacobi iteration Up: Solving Linear Algebraic Equations Previous: SVD for solving linear

Householder transformation and QR decomposition

A Householder transformation of a vector ${\bf x}$ is its reflection with respect to a plane (or hyperplane) through the origin, represented by its normal vector ${\bf v}$ of unit length (${\bf v}^T{\bf v}=\vert\vert{\bf v}\vert\vert^2=1$). The reflection can be found as

\begin{displaymath}{\bf x}'={\bf x}-2{\bf v}{\bf v}^T{\bf x} \end{displaymath}

where ${\bf v}{\bf v}^T{\bf x}$ is the projection of ${\bf x}$ onto ${\bf v}$. In general, the projection of ${\bf x}$ onto any vector ${\bf u}$ is

\begin{displaymath}{\bf p}_{\bf u}({\bf x})
=\left(\frac{{\bf x}^T{\bf u}}{\vert\vert{\bf u}\vert\vert}\right)\frac{{\bf u}}{\vert\vert{\bf u}\vert\vert}
=\frac{{\bf x}^T{\bf u}}{\vert\vert{\bf u}\vert\vert^2}{\bf u} \end{displaymath}

The reflection of ${\bf x}$ can be considered as a linear transformation represented by a matrix ${\bf P}$ applied to ${\bf x}$:

\begin{displaymath}{\bf x}'={\bf P}{\bf x}=({\bf I}-2{\bf v}{\bf v}^T) {\bf x} \end{displaymath}

where ${\bf P}={\bf I}-2{\bf v}{\bf v}^T$ is a Householder matrix.


Obviously ${\bf P}$ is symmetric

\begin{displaymath}{\bf P}^T=({\bf I}-2{\bf v}{\bf v}^T)^T={\bf I}-2{\bf v}{\bf v}^T={\bf P} \end{displaymath}

and it is also its own inverse (using ${\bf v}^T{\bf v}=1$):

\begin{eqnarray*}
{\bf P}{\bf P} &=& ({\bf I}-2{\bf v}{\bf v}^T)({\bf I}-2{\bf v}{\bf v}^T)
={\bf I}-4{\bf v}{\bf v}^T+4{\bf v}{\bf v}^T{\bf v}{\bf v}^T \\
&=& {\bf I}-4{\bf v}{\bf v}^T+4{\bf v}{\bf v}^T={\bf I}
\end{eqnarray*}

We therefore have

\begin{displaymath}{\bf P}={\bf P}^{-1}={\bf P}^T \end{displaymath}

i.e., ${\bf P}$ is an orthogonal matrix (satisfying ${\bf P}^{-1}={\bf P}^T$).
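These properties can be illustrated with a short numerical sketch (NumPy-based; the example vectors are arbitrary, not from the text):

```python
import numpy as np

# Reflect x across the hyperplane with unit normal v.
v = np.array([1.0, 1.0, 0.0])
v = v / np.linalg.norm(v)                # normalize so that v^T v = 1
P = np.eye(3) - 2.0 * np.outer(v, v)     # Householder matrix P = I - 2 v v^T

x = np.array([3.0, 1.0, 2.0])
x_reflected = P @ x                      # x' = x - 2 v (v^T x)

# P is symmetric, orthogonal, and its own inverse:
assert np.allclose(P, P.T)               # P = P^T
assert np.allclose(P @ P, np.eye(3))     # P P = I, i.e. P^{-1} = P = P^T
```

Applying ${\bf P}$ twice returns the original vector, as expected of a reflection.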

As any vector ${\bf u}$ can be normalized to unit norm by ${\bf v}={\bf u}/\vert\vert{\bf u}\vert\vert$, the Householder matrix defined above can also be written as:

\begin{displaymath}{\bf P}={\bf I}-2{\bf v}{\bf v}^T
={\bf I}-\frac{2{\bf u}{\bf u}^T}{\vert\vert{\bf u}\vert\vert^2} \end{displaymath}

We now define a specific vector

\begin{displaymath}{\bf u}={\bf x}-\vert\vert{\bf x}\vert\vert{\bf e}_1 \end{displaymath}

where ${\bf x}=[x_1,\cdots,x_N]^T$ is any N-D vector and ${\bf e}_1=[1,0,\cdots,0]^T$ is the first standard basis vector, with all elements equal to zero except the first, which is one. The norm squared of this vector can be found to be
\begin{eqnarray*}
\vert\vert{\bf u}\vert\vert^2 &=& {\bf u}^T{\bf u}
=({\bf x}-\vert\vert{\bf x}\vert\vert{\bf e}_1)^T({\bf x}-\vert\vert{\bf x}\vert\vert{\bf e}_1)
={\bf x}^T{\bf x}-2\vert\vert{\bf x}\vert\vert{\bf x}^T{\bf e}_1
+\vert\vert{\bf x}\vert\vert^2{\bf e}_1^T{\bf e}_1 \\
&=& \vert\vert{\bf x}\vert\vert^2-2\vert\vert{\bf x}\vert\vert x_1+\vert\vert{\bf x}\vert\vert^2
=2(\vert\vert{\bf x}\vert\vert^2-\vert\vert{\bf x}\vert\vert x_1)
\end{eqnarray*}

When the Householder matrix ${\bf P}_1$ based on this particular vector ${\bf u}$ is applied to the vector ${\bf x}$ itself, it produces its reflection
\begin{eqnarray*}
{\bf x}'={\bf P}_1{\bf x} &=& \left({\bf I}-\frac{2{\bf u}{\bf u}^T}{\vert\vert{\bf u}\vert\vert^2}\right){\bf x}
=\left({\bf I}-\frac{2({\bf x}-\vert\vert{\bf x}\vert\vert{\bf e}_1)
({\bf x}-\vert\vert{\bf x}\vert\vert{\bf e}_1)^T}{\vert\vert{\bf u}\vert\vert^2}\right){\bf x} \\
&=& {\bf x}-\frac{2{\bf u}(\vert\vert{\bf x}\vert\vert^2-\vert\vert{\bf x}\vert\vert x_1)}
{2(\vert\vert{\bf x}\vert\vert^2-\vert\vert{\bf x}\vert\vert x_1)}
={\bf x}-{\bf u}
=\vert\vert{\bf x}\vert\vert{\bf e}_1=\vert\vert{\bf x}\vert\vert\;[1,0,\cdots,0]^T
\end{eqnarray*}

We see that all elements in ${\bf x}$ except the first one are eliminated to zero. This feature of the Householder transformation is the reason why it is widely used.
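A minimal sketch of this construction (NumPy-based, with an arbitrary example vector) shows how ${\bf u}={\bf x}-\vert\vert{\bf x}\vert\vert{\bf e}_1$ zeroes all but the first element:

```python
import numpy as np

# The special choice u = x - ||x|| e_1 maps x to ||x|| e_1.
x = np.array([3.0, 4.0, 0.0, 12.0])              # ||x|| = 13
e1 = np.zeros_like(x)
e1[0] = 1.0
u = x - np.linalg.norm(x) * e1                   # u = x - ||x|| e_1
P1 = np.eye(len(x)) - 2.0 * np.outer(u, u) / (u @ u)

x_new = P1 @ x    # all entries after the first are zeroed out
```

In practice, implementations often choose ${\bf u}={\bf x}+\mbox{sign}(x_1)\,\vert\vert{\bf x}\vert\vert{\bf e}_1$ instead, to avoid cancellation when ${\bf x}$ is already close to a multiple of ${\bf e}_1$.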


The Householder transformation finds many applications in numerical computation. For example, it can be used to convert a given matrix ${\bf A}$ into either a bidiagonal or tridiagonal form, which is needed in the algorithms for solving SVD and eigenvalue problems. The Householder transformation can also be used to carry out QR decomposition of an $N$ by $N$ square matrix ${\bf A}$:

\begin{displaymath}{\bf A}={\bf Q}{\bf R} \end{displaymath}

where ${\bf Q}^T={\bf Q}^{-1}$ is an orthogonal matrix and ${\bf R}$ is an upper triangular matrix. Specifically, we first construct a Householder matrix ${\bf Q}_1$ based on the first column vector ${\bf c}_1=[a_{11},\;a_{21},\;\cdots,a_{N1}]^T$ of ${\bf A}=[{\bf c}_1,\cdots,{\bf c}_N]$, i.e., ${\bf u}={\bf c}_1-\vert\vert{\bf c}_1\vert\vert{\bf e}_1$, by which the last $N-1$ elements of the first column of ${\bf A}$ will become zero:

\begin{displaymath}{\bf Q}_1{\bf A}=\left[\begin{array}{c\vert ccc}
a'_{11}&a'_{12}&\cdots&a'_{1N} \\ \hline
0& & & \\ \vdots & &{\bf A}'& \\ 0& & & \end{array}\right] \end{displaymath}

We then construct a second transformation matrix ${\bf Q}_2$

\begin{displaymath}{\bf Q}_2=\left[\begin{array}{c\vert ccc}
1&0&\cdots&0 \\ \hline
0& & & \\ \vdots & &{\bf Q}'_2 & \\ 0& & & \end{array}\right] \end{displaymath}

where ${\bf Q}'_2$ is an ($N-1$)-dimensional Householder matrix constructed based on the first column of ${\bf A}'$, to zero the last $N-2$ elements of this column:

\begin{displaymath}{\bf Q}_2{\bf Q}_1{\bf A}=
\left[\begin{array}{cc\vert ccc}
a'_{11}&a'_{12}&a'_{13}&\cdots&a'_{1N} \\
0&a''_{22}&a''_{23}&\cdots&a''_{2N} \\ \hline
0&0& & & \\ \vdots & \vdots & &{\bf A}''& \\ 0&0& & & \end{array}\right] \end{displaymath}

This process is repeated with the $k$th transformation matrix

\begin{displaymath}{\bf Q}_k=\left[\begin{array}{cc}
{\bf I}_{k-1}&{\bf0} \\ {\bf0}&{\bf Q}'_k \end{array}\right] \end{displaymath}

where ${\bf Q}'_k$ is an ($N-k+1$)-dimensional Householder matrix constructed based on the first column of the sub-matrix ${\bf A}^{(k-1)}$ of the same dimension obtained from the previous step.

After $N-1$ such iterations ${\bf A}$ becomes an upper triangular matrix:

\begin{displaymath}{\bf R}={\bf Q}_{N-1}\cdots{\bf Q}_1{\bf A} \end{displaymath}

Pre-multiplying ${\bf Q}=({\bf Q}_{N-1}\cdots{\bf Q}_1)^{-1}$ on both sides, we get the QR decomposition of ${\bf A}$:

\begin{displaymath}{\bf A}=({\bf Q}_{N-1}\cdots{\bf Q}_1)^{-1}{\bf R}={\bf Q}{\bf R} \end{displaymath}

Here ${\bf Q}$ satisfies

\begin{displaymath}{\bf Q}=({\bf Q}_{N-1}\cdots{\bf Q}_1)^{-1}
={\bf Q}_1^{-1}\cdots{\bf Q}_{N-1}^{-1}
={\bf Q}_1^T\cdots{\bf Q}_{N-1}^T
=({\bf Q}_{N-1}\cdots{\bf Q}_1)^T \end{displaymath}

i.e., ${\bf Q}^T={\bf Q}_{N-1}\cdots{\bf Q}_1={\bf Q}^{-1}$, so ${\bf Q}$ is orthogonal, and so is ${\bf Q}^{-1}$.
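The iteration above can be sketched as follows (a NumPy-based illustration; the function name `householder_qr` and the test matrix are ours, not from the text):

```python
import numpy as np

def householder_qr(A):
    """Householder QR of a square matrix: returns Q, R with A = Q R."""
    A = A.astype(float).copy()
    N = A.shape[0]
    Q = np.eye(N)
    for k in range(N - 1):
        c = A[k:, k]                      # first column of the sub-matrix
        u = c.copy()
        u[0] -= np.linalg.norm(c)         # u = c - ||c|| e_1
        if np.allclose(u, 0):             # column already in desired form
            continue
        Qk = np.eye(N)                    # embed Q'_k in the lower-right block
        Qk[k:, k:] -= 2.0 * np.outer(u, u) / (u @ u)
        A = Qk @ A                        # builds R = Q_{N-1} ... Q_1 A
        Q = Q @ Qk                        # Q = Q_1 Q_2 ... (each Q_k = Q_k^{-1})
    return Q, A

A = np.array([[4.0, 1.0, 2.0],
              [2.0, 3.0, 0.0],
              [1.0, 1.0, 5.0]])
Q, R = householder_qr(A)
assert np.allclose(Q @ R, A)              # A = Q R
assert np.allclose(Q @ Q.T, np.eye(3))    # Q orthogonal
assert np.allclose(np.tril(R, -1), 0)     # R upper triangular
```

Because each ${\bf Q}_k$ is symmetric and its own inverse, accumulating the product ${\bf Q}_1{\bf Q}_2\cdots$ directly gives ${\bf Q}$.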

QR decomposition is widely used in different algorithms (e.g., SVD, eigenvalue problems, etc.), and it can also be used to solve the linear system ${\bf A}{\bf x}={\bf b}$:

\begin{displaymath}{\bf A}{\bf x}={\bf Q}{\bf R}{\bf x}={\bf Q}{\bf y}={\bf b} \end{displaymath}

where ${\bf y}={\bf R}{\bf x}$ can be obtained as:

\begin{displaymath}{\bf y}={\bf Q}^{-1}{\bf b}={\bf Q}^T{\bf b} \end{displaymath}

Then we can find ${\bf x}$ by solving

\begin{displaymath}{\bf R}{\bf x}={\bf y} \end{displaymath}

As ${\bf R}$ is an upper triangular matrix, ${\bf x}$ can be obtained by back-substitution.
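The complete solution procedure can be sketched as follows (NumPy-based; `np.linalg.qr` stands in for the Householder construction described in the text, and the function name `qr_solve` and the example system are illustrative):

```python
import numpy as np

def qr_solve(A, b):
    """Solve A x = b via QR: y = Q^T b, then back-substitute R x = y."""
    Q, R = np.linalg.qr(A)
    y = Q.T @ b                       # y = Q^{-1} b = Q^T b
    N = len(b)
    x = np.zeros(N)
    for i in range(N - 1, -1, -1):    # back-substitution on upper-triangular R
        x[i] = (y[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = qr_solve(A, b)                    # solves A x = b
```

The back-substitution loop solves the last equation first, then substitutes each known component into the rows above it.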

The Householder transformation can also be used to carry out tridiagonalization of an $N$ by $N$ symmetric matrix ${\bf A}$. Specifically, we first define an $N\times N$ matrix as shown below:

\begin{displaymath}{\bf P}_1=\left[\begin{array}{c\vert ccc}
1&0&\cdots&0 \\ \hline
0& & & \\ \vdots & &{\bf P}^{(N-1)} & \\ 0& & & \end{array}\right] \end{displaymath}

where the lower-right part of ${\bf P}_1$ is an ($N-1$)-dimensional Householder matrix ${\bf P}^{(N-1)}$ based on the ($N-1$)-dimensional vector $[a_{21},\cdots,a_{N1}]^T$ containing the last $N-1$ elements of the first column of ${\bf A}$. We first pre-multiply ${\bf A}$ by ${\bf P}_1$ to get:
\begin{eqnarray*}
{\bf P}_1{\bf A} &=&
\left[\begin{array}{c\vert ccc}
1&0&\cdots&0 \\ \hline 0& & & \\ \vdots & &{\bf P}^{(N-1)}& \\ 0& & & \end{array}\right]
\left[\begin{array}{c\vert ccc}
a_{11}&a_{12}&\cdots&a_{1N} \\ \hline a_{21}& & & \\ \vdots & &{\bf A}^{(N-1)}& \\ a_{N1}& & & \end{array}\right] \\
&=&
\left[\begin{array}{c\vert c}
a_{11}&a_{12}\;\cdots\;a_{1N} \\ \hline
{\bf P}^{(N-1)}[a_{21},\cdots,a_{N1}]^T & {\bf P}^{(N-1)}{\bf A}^{(N-1)} \end{array}\right] \\
&=&
\left[\begin{array}{c\vert ccc}
a_{11}&a_{12}&\cdots&a_{1N} \\ \hline a'_{21}& & & \\ \vdots & &{\bf P}^{(N-1)}{\bf A}^{(N-1)}& \\ 0& & & \end{array}\right]
\end{eqnarray*}

where ${\bf A}^{(N-1)}$ is an ($N-1$)-dimensional matrix containing the lower-right $N-1$ by $N-1$ elements of ${\bf A}$, and the lower-left block of the resulting matrix has only one non-zero element, as the result of the application of the Householder matrix. We next post-multiply the result above by ${\bf P}_1$ to get
\begin{eqnarray*}
{\bf A}'={\bf P}_1{\bf A}{\bf P}_1 &=&
\left[\begin{array}{c\vert ccc}
a_{11}&a_{12}&\cdots&a_{1N} \\ \hline a'_{21}& & & \\ \vdots & &{\bf P}^{(N-1)}{\bf A}^{(N-1)}& \\ 0& & & \end{array}\right]
\left[\begin{array}{c\vert ccc}
1&0&\cdots&0 \\ \hline 0& & & \\ \vdots & &{\bf P}^{(N-1)}& \\ 0& & & \end{array}\right] \\
&=&
\left[\begin{array}{c\vert c}
a_{11}& [a_{12},\cdots,a_{1N}]\,{\bf P}^{(N-1)} \\ \hline
\begin{array}{c} a'_{21}\\ \vdots\\ 0 \end{array}
& {\bf P}^{(N-1)}{\bf A}^{(N-1)}{\bf P}^{(N-1)} \end{array}\right] \\
&=&
\left[\begin{array}{c\vert cccc}
a_{11}&a'_{12}&0&\cdots&0 \\ \hline a'_{21}& & & & \\ 0& & & & \\
\vdots & &{\bf P}^{(N-1)}{\bf A}^{(N-1)}{\bf P}^{(N-1)}& & \\ 0& & & & \end{array}\right]
\end{eqnarray*}

where the last equality holds because ${\bf A}$ is symmetric, so the first row transforms in the same way as the first column (and $a'_{12}=a'_{21}$).

We next pre- and post-multiply the result above by

\begin{displaymath}{\bf P}_2=\left[\begin{array}{cc\vert ccc}
1&0&0&\cdots&0 \\ 0&1&0&\cdots&0 \\ \hline
0&0& & & \\ \vdots &\vdots& &{\bf P}^{(N-2)} & \\ 0&0& & & \end{array}\right] \end{displaymath}

to get

\begin{displaymath}{\bf P}_2{\bf A}'{\bf P}_2={\bf P}_2{\bf P}_1{\bf A}{\bf P}_1{\bf P}_2=
\left[\begin{array}{cc\vert ccc}
a_{11}&a'_{12}&0&\cdots&0 \\
a'_{21}&a'_{22}&a''_{23}&\cdots&0 \\ \hline
0&a''_{32}& & & \\
\vdots &\vdots& &{\bf P}^{(N-2)}{\bf A}^{(N-2)}{\bf P}^{(N-2)}& \\
0&0& & & \end{array}\right] \end{displaymath}

This process is recursively repeated until after $N-2$ iterations the $N$ by $N$ matrix ${\bf A}$ is converted into a tridiagonal matrix.
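The tridiagonalization procedure can be sketched as follows (NumPy-based; the function name and the symmetric test matrix are illustrative, not from the text):

```python
import numpy as np

def tridiagonalize(A):
    """Householder tridiagonalization of a symmetric matrix A."""
    A = A.astype(float).copy()
    N = A.shape[0]
    for k in range(N - 2):
        c = A[k+1:, k]                    # entries below the diagonal in column k
        u = c.copy()
        u[0] -= np.linalg.norm(c)         # u = c - ||c|| e_1
        if np.allclose(u, 0):             # column already in desired form
            continue
        Pk = np.eye(N)                    # embed P^{(N-k-1)} in the lower-right block
        Pk[k+1:, k+1:] -= 2.0 * np.outer(u, u) / (u @ u)
        A = Pk @ A @ Pk                   # similarity transform preserves eigenvalues
    return A

A = np.array([[4.0, 1.0, 2.0, 2.0],
              [1.0, 3.0, 0.0, 1.0],
              [2.0, 0.0, 5.0, 1.0],
              [2.0, 1.0, 1.0, 2.0]])
T = tridiagonalize(A)
# T is tridiagonal and has the same eigenvalues as A
assert np.allclose(np.triu(T, 2), 0)
assert np.allclose(np.linalg.eigvalsh(T), np.linalg.eigvalsh(A))
```

Since each step is a similarity transform by an orthogonal ${\bf P}_k$, the eigenvalues of ${\bf A}$ are preserved, which is what makes this reduction useful as a preprocessing step for eigenvalue algorithms.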

Ruye Wang 2013-04-08