Linear Methods of AI

Symmetric and Hermitian Matrices

Definitions of Symmetric and Hermitian

In linear algebra, we encounter two special types of matrices with particularly useful properties. Imagine a mirror that perfectly reflects objects: symmetric and Hermitian matrices have an analogous mathematical "mirror" property.

A real square matrix A \in \mathbb{R}^{n \times n} is called symmetric if it equals its transpose:

A^T = A

Similarly, a complex square matrix A \in \mathbb{C}^{n \times n} is called Hermitian if it equals its adjoint (conjugate transpose):

A^H = A

Let's look at an example to understand this concept more clearly:

A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \end{pmatrix}

Notice that the element at position (i,j) is the same as the element at position (j,i). For example, a_{12} = a_{21} = 2 and a_{13} = a_{31} = 3.
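This definition can be checked numerically. The following is a small sketch using NumPy with the example matrix above (the variable names are illustrative):

```python
import numpy as np

# The example matrix from above.
A = np.array([[1, 2, 3],
              [2, 4, 5],
              [3, 5, 6]])

# A matrix is symmetric when it equals its transpose: A^T = A.
is_symmetric = np.array_equal(A, A.T)
print(is_symmetric)  # True
```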

Relationship Between Symmetric and Hermitian

Every real symmetric matrix is actually also a complex Hermitian matrix. Why is that? When we view a real matrix as a complex matrix, the complex conjugate of each real entry is the entry itself, so the adjoint reduces to the transpose: A^H = A^T = A.

Real symmetric matrices are a special case of complex Hermitian matrices.

This means all properties that apply to Hermitian matrices also apply to symmetric matrices. However, symmetric matrices have the additional advantage that all their elements are real.
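This inclusion is easy to verify: for a real matrix, taking the conjugate transpose is the same as taking the transpose. A minimal NumPy sketch with a hypothetical 2×2 symmetric example:

```python
import numpy as np

# A real symmetric matrix (hypothetical example).
A = np.array([[1., 2.],
              [2., 5.]])

# Conjugation does nothing to real entries, so A^H = A^T = A:
# the matrix is Hermitian.
is_hermitian = np.array_equal(A.conj().T, A)
print(is_hermitian)  # True
```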

Diagonal of Hermitian Matrices

One interesting property of Hermitian matrices is that all their diagonal elements are always real numbers. Let's see why this happens.

For a Hermitian matrix A \in \mathbb{C}^{n \times n}, we have A^H = A. This means for every diagonal element:

a_{ii} = \overline{a_{ii}}

Because a_{ii} = \overline{a_{ii}}, it follows that a_{ii} \in \mathbb{R} for all i.

So, even though Hermitian matrices can have complex elements off the diagonal, their diagonal elements are definitely real. This is a direct consequence of the Hermitian definition.
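A small numerical illustration (the 2×2 Hermitian matrix below is a hypothetical example): the off-diagonal entries are genuinely complex, yet the diagonal is real.

```python
import numpy as np

# A Hermitian matrix: complex off the diagonal, but each diagonal
# entry must satisfy a_ii = conj(a_ii), i.e. be real.
H = np.array([[2.0 + 0j, 1.0 - 3j],
              [1.0 + 3j, 5.0 + 0j]])

assert np.allclose(H, H.conj().T)  # Hermitian: H^H = H
diag_is_real = np.allclose(np.diag(H).imag, 0.0)
print(diag_is_real)  # True
```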

Quadratic Forms

Symmetric and Hermitian matrices have a special feature in terms of quadratic forms. Let's see how they work with vectors.

If we have a symmetric matrix A \in \mathbb{R}^{n \times n} and a vector x \in \mathbb{R}^n, then we can form a quadratic function:

q : \mathbb{R}^n \to \mathbb{R}
q(x) = x^T A x

For a Hermitian matrix A \in \mathbb{C}^{n \times n}, the result x^H A x always produces a real number, even though A and x are complex.

Let's prove why this happens:

(x^H A x)^H = x^H A^H (x^H)^H
= x^H A^H x
= x^H A x

Because x^H A x equals its own adjoint, and the adjoint of a scalar is its complex conjugate, x^H A x is a real number.

So we get the quadratic form for the complex case:

q : \mathbb{C}^n \to \mathbb{R}
q(x) = x^H A x
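We can observe this numerically. In the sketch below (a hypothetical Hermitian matrix and vector), the quadratic form x^H A x comes out with zero imaginary part even though both inputs are complex:

```python
import numpy as np

H = np.array([[2.0 + 0j, 1.0 - 3j],
              [1.0 + 3j, 5.0 + 0j]])   # Hermitian (hypothetical example)
x = np.array([1.0 + 2j, -1.0 + 1j])

# q(x) = x^H H x; for Hermitian H this scalar is always real.
q = x.conj() @ H @ x
print(q.real)                    # 40.0
print(np.isclose(q.imag, 0.0))   # True
```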

Basic Vector Properties

Before discussing eigenvalues, let's understand the basic properties of vectors that we will use. For a vector x \in \mathbb{R}^n or x \in \mathbb{C}^n, we have:

x^T x \geq 0 \text{ and } x^H x \geq 0
x^T x = 0 \Leftrightarrow x = 0
x^H x = 0 \Leftrightarrow x = 0

This is because:

x^T x = \sum_{k=1}^n x_k^2
x^H x = \sum_{k=1}^n \overline{x_k} x_k = \sum_{k=1}^n |x_k|^2

Both forms are always non-negative and only equal to zero if all vector components are zero.
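The sum-of-squared-moduli formula is easy to check directly (a sketch with a hypothetical complex vector):

```python
import numpy as np

x = np.array([3.0 - 4j, 1.0 + 2j])

# x^H x is the sum of |x_k|^2: always real and non-negative,
# and zero only for the zero vector. Here |3-4j|^2 + |1+2j|^2 = 25 + 5.
inner = x.conj() @ x
print(inner.real)                   # 30.0
print(np.isclose(inner.imag, 0.0))  # True
```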

Eigenvalues Are Always Real

This is one of the most remarkable properties of symmetric and Hermitian matrices: all their eigenvalues are always real numbers.

Let's look at the proof. Suppose A \in \mathbb{C}^{n \times n} is a Hermitian matrix with A^H = A. If A v = \lambda v with v \neq 0, then:

\lambda \cdot v^H v = v^H (\lambda v)
= v^H (A v)
= v^H A v
= v^H A^H v
= (A v)^H v
= (\lambda v)^H v
= \overline{\lambda} \cdot v^H v

Because v^H v \neq 0, we can conclude that \lambda = \overline{\lambda}, so \lambda \in \mathbb{R}.

For real symmetric matrices, since they are also Hermitian matrices, their eigenvalues are also always real.
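This can be seen numerically as well. In the sketch below (same hypothetical Hermitian matrix as earlier), even a general complex eigensolver returns eigenvalues whose imaginary parts vanish; `np.linalg.eigvalsh` exploits the Hermitian structure and returns real values directly.

```python
import numpy as np

H = np.array([[2.0 + 0j, 1.0 - 3j],
              [1.0 + 3j, 5.0 + 0j]])   # Hermitian

# General eigensolver: the eigenvalues still come out (numerically) real.
eigvals = np.linalg.eigvals(H)
print(np.allclose(eigvals.imag, 0.0))  # True
```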

Orthogonality of Eigenvectors

Eigenvectors corresponding to different eigenvalues in symmetric or Hermitian matrices are always orthogonal to each other. This is a very useful property in various applications.

Let's prove this property. Suppose A \in \mathbb{C}^{n \times n} is a Hermitian matrix with:

A v = \lambda v \text{ with } v \neq 0
A w = \mu w \text{ with } w \neq 0
\lambda \neq \mu

We know that \overline{\mu} = \mu because eigenvalues of Hermitian matrices are real. Now:

\mu (w^H v) = \overline{\mu} (w^H v)
= (\mu w)^H v
= (A w)^H v
= w^H A^H v
= w^H A v
= w^H (\lambda v)
= \lambda (w^H v)

So (\mu - \lambda)(w^H v) = 0. Because \lambda \neq \mu, it follows that w^H v = 0, which means the eigenvectors are orthogonal.

For real symmetric matrices, we correspondingly have w^T v = 0.

This orthogonality property allows us to diagonalize symmetric and Hermitian matrices using orthogonal or unitary matrices.
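Both facts can be demonstrated at once with NumPy's symmetric eigensolver (a sketch with a hypothetical real symmetric matrix): `np.linalg.eigh` returns real eigenvalues and orthonormal eigenvectors, giving the orthogonal diagonalization A = Q diag(λ) Q^T.

```python
import numpy as np

A = np.array([[4., 1.],
              [1., 3.]])   # real symmetric (hypothetical example)

# eigh is specialized for symmetric/Hermitian input: real eigenvalues,
# orthonormal eigenvector columns in Q.
eigvals, Q = np.linalg.eigh(A)

# Eigenvectors for distinct eigenvalues are orthogonal: Q^T Q = I.
print(np.allclose(Q.T @ Q, np.eye(2)))             # True
# Orthogonal diagonalization: A = Q diag(lambda) Q^T.
print(np.allclose(Q @ np.diag(eigvals) @ Q.T, A))  # True
```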