
Positive/Negative (semi)-definite matrices

Associated with a given symmetric matrix ${\bf A}$, we can construct the quadratic form ${\bf v}^T{\bf A}{\bf v}$, where ${\bf v}\ne{\bf0}$ is any non-zero vector. The matrix ${\bf A}$ is said to be
positive definite (denoted ${\bf A}>0$) if ${\bf v}^T{\bf A}{\bf v}>0$ for all ${\bf v}\ne{\bf0}$;
positive semi-definite (${\bf A}\ge 0$) if ${\bf v}^T{\bf A}{\bf v}\ge 0$ for all ${\bf v}\ne{\bf0}$;
negative definite (${\bf A}<0$) if ${\bf v}^T{\bf A}{\bf v}<0$ for all ${\bf v}\ne{\bf0}$;
negative semi-definite (${\bf A}\le 0$) if ${\bf v}^T{\bf A}{\bf v}\le 0$ for all ${\bf v}\ne{\bf0}$.

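As a quick numerical illustration (a minimal sketch using NumPy; the matrix ${\bf A}$ here is an arbitrary example, not one from the text), the quadratic form can be evaluated directly at a few non-zero vectors:

```python
import numpy as np

# An example symmetric matrix (values chosen for illustration only)
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

def quadratic_form(A, v):
    """Evaluate the quadratic form v^T A v for a non-zero vector v."""
    return v @ A @ v

# Sample the quadratic form at a few non-zero vectors;
# every value is positive, consistent with A being positive definite
for v in [np.array([1.0, 0.0]), np.array([1.0, 1.0]), np.array([1.0, -1.0])]:
    print(quadratic_form(A, v))
```

Sampling a few vectors does not prove definiteness, of course; the eigenvalue criterion discussed below gives a complete test.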
For example, consider the covariance matrix of a random vector ${\bf x}$

{\bf\Sigma}_x=E[ ({\bf x}-{\bf m}_x)({\bf x}-{\bf m}_x)^T ]

The corresponding quadratic form is
$\displaystyle {\bf v}^T{\bf\Sigma}_x{\bf v}
= {\bf v}^TE[ ({\bf x}-{\bf m}_x)({\bf x}-{\bf m}_x)^T ]{\bf v}
= E[ {\bf v}^T({\bf x}-{\bf m}_x)({\bf x}-{\bf m}_x)^T{\bf v} ]
= E(s^2)\ge 0$

where $s={\bf v}^T({\bf x}-{\bf m}_x)$ is a scalar. Therefore ${\bf\Sigma}_x$ is positive semi-definite.
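This can be verified numerically (a sketch assuming NumPy; the sample data and its dimension are arbitrary choices for illustration): estimate a covariance matrix from samples and check that all of its eigenvalues are non-negative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Draw 1000 samples of a 3-dimensional random vector x
# (a linear mixing is applied just to make the components correlated)
X = rng.normal(size=(1000, 3)) @ np.array([[1.0, 0.5, 0.0],
                                           [0.0, 1.0, 0.3],
                                           [0.0, 0.0, 1.0]])
Sigma = np.cov(X, rowvar=False)   # sample covariance matrix (3x3, symmetric)

# Since v^T Sigma v = E(s^2) >= 0 for every v, all eigenvalues of
# Sigma must be non-negative
eigvals = np.linalg.eigvalsh(Sigma)
print(np.all(eigvals >= 0))       # True: Sigma is positive semi-definite
```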

${\bf A}>0$ iff all of its eigenvalues are greater than zero. To see this, expand ${\bf v}=\sum_i c_i{\bf u}_i$ in terms of the orthonormal eigenvectors ${\bf u}_i$ of ${\bf A}$; then ${\bf v}^T{\bf A}{\bf v}=\sum_i \lambda_i c_i^2$, which is positive for every ${\bf v}\ne{\bf0}$ iff

\lambda_i > 0,\;\;\;\;i=1,\cdots,n

As the eigenvalues of ${\bf A}^{-1}$ are $1/\lambda_i,\;i=1,\cdots,n$, we have ${\bf A}>0$ iff ${\bf A}^{-1}>0$.
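Both facts can be checked numerically (a sketch assuming NumPy; the matrix is again an arbitrary symmetric example): the eigenvalues of the example matrix are all positive, and the eigenvalues of its inverse are their reciprocals.

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])        # symmetric; eigenvalues are 1 and 3

lam = np.linalg.eigvalsh(A)                      # eigenvalues of A
lam_inv = np.linalg.eigvalsh(np.linalg.inv(A))   # eigenvalues of A^{-1}

print(np.all(lam > 0))            # True: A > 0
# eigenvalues of A^{-1} are the reciprocals 1/lambda_i, so A^{-1} > 0 too
print(np.allclose(np.sort(lam_inv), np.sort(1.0 / lam)))   # True
```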

Ruye Wang 2014-08-15