Linear Methods of AI

Eigenvalues of Diagonal and Triangular Matrices

Diagonal Matrices and Their Special Properties

For diagonal matrices, the eigenvalues can be read off directly from the main diagonal entries. This is one of the most convenient facts in linear algebra.

The eigenvalues of a diagonal or triangular matrix $A \in \mathbb{K}^{n \times n}$,

$$A = \begin{pmatrix} a_{11} & & 0 \\ & \ddots & \\ 0 & & a_{nn} \end{pmatrix} \quad \text{or} \quad A = \begin{pmatrix} a_{11} & & * \\ & \ddots & \\ 0 & & a_{nn} \end{pmatrix} \quad \text{or} \quad A = \begin{pmatrix} a_{11} & & 0 \\ & \ddots & \\ * & & a_{nn} \end{pmatrix}$$

are its main diagonal entries:

$$\lambda_1 = a_{11}, \quad \ldots, \quad \lambda_n = a_{nn}$$

Why is this true? Because $A - tI$ is again diagonal or triangular, so its determinant is the product of its diagonal entries: $\chi_A(t) = \det(A - t I) = (a_{11} - t) \cdots (a_{nn} - t)$, which has roots $a_{11}, \ldots, a_{nn}$.

This property greatly simplifies our work because we don't need to calculate determinants or solve complex characteristic equations.
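This can be checked numerically; a minimal sketch with NumPy (the 3×3 diagonal matrix is a made-up example):

```python
import numpy as np

# A made-up 3x3 diagonal matrix
D = np.diag([2.0, -1.0, 5.0])

# Numerically computed eigenvalues (returned in no particular order)
eigenvalues = np.sort(np.linalg.eigvals(D))

# They coincide with the diagonal entries -1, 2, 5
print(eigenvalues)
```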

Upper and Lower Triangular Matrices

Triangular matrices have the same property as diagonal matrices. For both upper and lower triangular matrices, the eigenvalues are still the main diagonal entries.

This happens because when we calculate det(AtI)\det(A - tI), the entries above or below the main diagonal don't affect the determinant calculation. The triangular structure allows the determinant to be computed as the product of diagonal entries.
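The same check works for a triangular matrix; in the made-up example below, the entries above the diagonal are arbitrary and do not influence the result:

```python
import numpy as np

# A made-up upper triangular matrix; the entries 7, -2, and 4 above
# the diagonal can be anything without changing the eigenvalues
U = np.array([[3.0, 7.0, -2.0],
              [0.0, 1.0,  4.0],
              [0.0, 0.0, -5.0]])

# The eigenvalues are the diagonal entries 3, 1, -5
print(np.sort(np.linalg.eigvals(U)))
```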

Direct Calculation Examples

Let's look at some concrete examples to better understand this concept.

Complex Eigenvalues

Suppose $A = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}$. Its characteristic polynomial is:

$$\chi_A(t) = \det \begin{pmatrix} 1-t & -1 \\ 1 & 1-t \end{pmatrix} = (1-t)(1-t) - 1 \cdot (-1) = t^2 - 2t + 2$$

which has roots $\lambda_1 = 1 + i$ and $\lambda_2 = 1 - i$.

Zero Eigenvalues

For $A = \begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix}$, the characteristic polynomial is:

$$\chi_A(t) = \det \begin{pmatrix} 1-t & -i \\ i & 1-t \end{pmatrix} = (1-t)(1-t) - i \cdot (-i) = t^2 - 2t$$

with roots $\lambda_1 = 2$ and $\lambda_2 = 0$.

Characteristic Polynomial Factorization

If a matrix $A \in \mathbb{C}^{n \times n}$ has $n$ eigenvalues $\lambda_1, \ldots, \lambda_n \in \mathbb{C}$, not necessarily distinct, then the characteristic polynomial $\chi_A(t)$ can be factored as:

$$\chi_A(t) = (\lambda_1 - t) \cdots (\lambda_n - t)$$

The sum of algebraic multiplicities of all eigenvalues is nn:

$$\sum_{\lambda \in \mathbb{C}} \mu_A(\lambda) = n$$

In a more compact form:

$$\chi_A(t) = \prod_{\lambda \in \mathbb{C}} (\lambda - t)^{\mu_A(\lambda)}$$

This holds in particular for real matrices regarded as complex matrices: their eigenvalues are then either real numbers or occur in complex conjugate pairs.
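The factorization can be checked numerically. Note that NumPy's `np.poly` uses the monic convention $\det(tI - A)$, which differs from $\chi_A(t) = \det(A - tI)$ only by a factor $(-1)^n$; the sketch below (with a made-up random matrix) compares the coefficients obtained from the matrix with those rebuilt from its eigenvalues:

```python
import numpy as np

# A made-up random 4x4 matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Coefficients of the monic characteristic polynomial det(tI - A)
coeffs_from_matrix = np.poly(A)

# The polynomial prod_i (t - lambda_i) built from the eigenvalues
# has exactly the same coefficients
coeffs_from_roots = np.poly(np.linalg.eigvals(A))

print(np.allclose(coeffs_from_matrix, coeffs_from_roots))  # True
```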

Relationship Between Determinant and Trace

There is a fundamental relationship between the eigenvalues of a matrix and its determinant and trace. If the characteristic polynomial $\chi_A(t)$ factors into linear factors over $\mathbb{K}$, i.e. $A$ has $n$ eigenvalues $\lambda_1, \ldots, \lambda_n \in \mathbb{K}$ counted with multiplicity, then:

$$\det A = \prod_{i=1}^{n} \lambda_i, \qquad \text{tr}\, A = \sum_{i=1}^{n} \lambda_i$$

The determinant is the product of all eigenvalues, and the trace is the sum of all eigenvalues.
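A quick numerical sanity check of both identities, using a made-up random matrix:

```python
import numpy as np

# A made-up random 3x3 matrix
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

lam = np.linalg.eigvals(A)

# det A = product of eigenvalues, tr A = sum of eigenvalues
# (tiny imaginary parts from conjugate pairs cancel numerically)
print(np.isclose(np.linalg.det(A), np.prod(lam).real))  # True
print(np.isclose(np.trace(A), np.sum(lam).real))        # True
```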

Let's verify with our previous examples:

For $A = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}$ with $\lambda_1 = 1 + i$, $\lambda_2 = 1 - i$:

$$\det A = 1 \cdot 1 - 1 \cdot (-1) = 2 = \lambda_1 \cdot \lambda_2$$
$$\text{tr}\, A = 1 + 1 = 2 = \lambda_1 + \lambda_2$$

For $A = \begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix}$ with $\lambda_1 = 2$, $\lambda_2 = 0$:

$$\det A = 1 \cdot 1 - i \cdot (-i) = 0 = \lambda_1 \cdot \lambda_2$$
$$\text{tr}\, A = 1 + 1 = 2 = \lambda_1 + \lambda_2$$

This relationship is useful for verifying eigenvalue calculations and offers quick insight into the linear transformation represented by the matrix.