Linear Methods of AI

Eigenvalues, Eigenvectors, and Eigenspaces

Definition of Fundamental Concepts

In linear algebra, we are often interested in special vectors that behave in a particularly simple way under matrix multiplication: vectors that are only "stretched" or "shortened" by the matrix, while their direction remains unchanged.

Let $A \in \mathbb{K}^{n \times n}$ be a square matrix. An eigenvector $v \in \mathbb{K}^n$ for an eigenvalue $\lambda \in \mathbb{K}$ is a non-zero vector $v \neq 0$ that satisfies:

$$A \cdot v = \lambda \cdot v$$

This equation shows that when the matrix $A$ operates on the vector $v$, the result is a scalar multiple of the same vector.

Note that eigenvalues may equal $0$, but eigenvectors are by definition always non-zero.
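To make the definition concrete, here is a minimal numerical check (a sketch using NumPy; the matrix, vector, and eigenvalue below are illustrative choices, not taken from a specific example):

```python
import numpy as np

# Illustrative matrix, candidate eigenvector, and eigenvalue.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])
lam = 2.0

# v is an eigenvector for lam exactly when A·v equals lam·v.
print(np.allclose(A @ v, lam * v))  # True
```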

Basic Properties of Eigenvectors

Eigenvectors have fundamental properties that are very useful in various mathematical applications.

Scalar Multiplication: Let $A \in \mathbb{K}^{n \times n}$ and $v \in \mathbb{K}^n$ with $v \neq 0$ be an eigenvector of $A$ for eigenvalue $\lambda \in \mathbb{K}$. Then all multiples $t \cdot v$ with $t \neq 0$ are also eigenvectors of $A$ for the same eigenvalue $\lambda$.

Why is this true? $A \cdot (t \cdot v) = t \cdot (A \cdot v) = t \cdot (\lambda \cdot v) = \lambda \cdot (t \cdot v)$

This property shows that if we find one eigenvector, then all its non-zero multiples are also eigenvectors for the same eigenvalue.
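As a quick numerical illustration of this property (a sketch with NumPy; the matrix and vector are again illustrative):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])  # an eigenvector of A for lam = 2
lam = 2.0

# Every non-zero multiple t·v is again an eigenvector for the same lam.
for t in [-1.0, 0.5, 10.0]:
    print(np.allclose(A @ (t * v), lam * (t * v)))  # True for each t
```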

Examples of Eigenvector Calculations

Let's look at some concrete examples to better understand this concept:

Diagonal Matrix

For the matrix $A = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$:

$v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_1 = 1$, because $A \cdot v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 1 \cdot v_1$

$v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_2 = 2$, because $A \cdot v_2 = \begin{pmatrix} 0 \\ 2 \end{pmatrix} = 2 \cdot v_2$
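Both claims can be verified numerically; a sketch using NumPy's np.linalg.eig, which returns the eigenvalues together with unit-length eigenvectors as the columns of a matrix:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # [1. 2.]
print(eigenvectors)  # columns: (1, 0) and (0, 1)
```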

Symmetric Matrix

For the matrix $A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$:

$v_1 = \begin{pmatrix} -1 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_1 = 2$, because $A \cdot v_1 = \begin{pmatrix} -2 \\ 2 \end{pmatrix} = 2 \cdot v_1$

$v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_2 = 4$, because $A \cdot v_2 = \begin{pmatrix} 4 \\ 4 \end{pmatrix} = 4 \cdot v_2$
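The same check works here; note that np.linalg.eig normalizes eigenvectors to unit length, so it returns scalar multiples of the vectors above, consistent with the scalar multiplication property:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # [4. 2.] (order may vary)

# Verify A·v = lam·v for each returned eigenvector (columns of the matrix).
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))  # True
```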

Linear Independence of Eigenvectors

One important result in eigenvector theory is that eigenvectors corresponding to distinct eigenvalues are always linearly independent.

Let $A \in \mathbb{K}^{n \times n}$ and let $\lambda_1, \ldots, \lambda_k \in \mathbb{K}$ be pairwise distinct eigenvalues of $A$, that is, $\lambda_i \neq \lambda_j$ for $i \neq j$ with $i, j \in \{1, \ldots, k\}$. Then the corresponding eigenvectors $v_1, \ldots, v_k \in \mathbb{K}^n$ are linearly independent.

This theorem can be proven by mathematical induction, and it has the important consequence that an $n \times n$ matrix has at most $n$ distinct eigenvalues.
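For the symmetric matrix above, this can be observed numerically: stacking the eigenvectors for the two distinct eigenvalues as columns gives a matrix of full rank (a sketch with NumPy):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)

# Full column rank means the eigenvectors are linearly independent.
print(np.linalg.matrix_rank(eigenvectors))  # 2
```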

Eigenspaces and Geometric Multiplicity

For each eigenvalue, we can define a vector space consisting of all eigenvectors corresponding to that eigenvalue.

Let $A \in \mathbb{K}^{n \times n}$ and $\lambda \in \mathbb{K}$. The set:

$$\text{Eig}_A(\lambda) = \{v \in \mathbb{K}^n : A \cdot v = \lambda \cdot v\}$$

is called the eigenspace of $A$ for eigenvalue $\lambda$. The dimension of $\text{Eig}_A(\lambda)$:

$$\dim \text{Eig}_A(\lambda)$$

is called the geometric multiplicity of eigenvalue $\lambda$ of $A$.
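Since $\text{Eig}_A(\lambda) = \ker(A - \lambda \cdot I)$ (see property 6 below), the geometric multiplicity can be computed via the rank-nullity theorem as $n - \text{Rank}(A - \lambda \cdot I)$. A sketch with NumPy; the helper name geometric_multiplicity is our own choice:

```python
import numpy as np

def geometric_multiplicity(A, lam, tol=1e-10):
    """dim Eig_A(lam) = n - Rank(A - lam·I), by the rank-nullity theorem."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

# The 2x2 identity matrix has eigenvalue 1 with a 2-dimensional eigenspace.
print(geometric_multiplicity(np.eye(2), 1.0))  # 2
```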

Properties of Eigenspaces

Eigenspaces have several important properties:

  1. Zero vector is not an eigenvector: The zero vector is not an eigenvector, but it is an element of $\text{Eig}_A(\lambda)$

  2. Set of eigenvectors: $\text{Eig}_A(\lambda) \setminus \{0\}$ is the set of all eigenvectors of $A$ corresponding to $\lambda$

  3. Eigenvalue condition: $\lambda$ is an eigenvalue of $A$ if and only if $\text{Eig}_A(\lambda) \neq \{0\}$

  4. Dimension bound: $0 \leq \dim \text{Eig}_A(\lambda) \leq n$

  5. Relationship with kernel: $\text{Eig}_A(0) = \{v \in \mathbb{K}^n : A \cdot v = 0\} = \ker A$

  6. General eigenspace: $\text{Eig}_A(\lambda) = \{v \in \mathbb{K}^n : (A - \lambda \cdot I) \cdot v = 0\} = \ker(A - \lambda \cdot I)$ (see the sketch after this list)

  7. Intersection of eigenspaces: If $\lambda_1 \neq \lambda_2$, then $\text{Eig}_A(\lambda_1) \cap \text{Eig}_A(\lambda_2) = \{0\}$
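Property 6 can be checked directly for the symmetric example from earlier (a minimal sketch with NumPy):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
lam = 2.0
v = np.array([-1.0, 1.0])  # eigenvector of A for lam = 2

# v lies in Eig_A(lam) exactly when (A - lam·I)·v = 0.
print(np.allclose((A - lam * np.eye(2)) @ v, 0))  # True
```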

Relationship with Invertibility

Eigenvalues have a close relationship with the invertibility property of matrices.

Now, let's look at an interesting relationship between invertibility and eigenvalues. A matrix $A \in \mathbb{K}^{n \times n}$ is invertible if and only if all eigenvalues $\lambda \in \mathbb{K}$ of $A$ satisfy $\lambda \neq 0$.

Why is this true? $A$ is invertible if and only if $\text{Rank} A = n$, which holds exactly when $\text{Eig}_A(0) = \ker A = \{0\}$, i.e. when $\lambda = 0$ is not an eigenvalue of $A$.
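A small numerical illustration (a sketch with NumPy; the singular matrix is an illustrative choice):

```python
import numpy as np

# A singular matrix: the second row is a multiple of the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigenvalues, _ = np.linalg.eig(A)
print(eigenvalues)                       # contains 0 (here: [0. 5.])
print(np.isclose(np.linalg.det(A), 0))  # True: A is not invertible
```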

Eigenvalues of Inverse Matrix

If the matrix $A$ is invertible and $v \neq 0$ is an eigenvector of $A$ for eigenvalue $\lambda \in \mathbb{K}$, then $v$ is also an eigenvector of $A^{-1}$ for eigenvalue $\frac{1}{\lambda}$.

Why is this true? From $A \cdot v = \lambda \cdot v$, multiplying both sides by $A^{-1}$ and dividing by $\lambda$ (which is non-zero because $A$ is invertible) gives $\frac{1}{\lambda} \cdot v = A^{-1} \cdot v$.
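Numerically, for the symmetric example with eigenvalues 2 and 4 (a sketch with NumPy; np.linalg.eigvals returns only the eigenvalues):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
A_inv = np.linalg.inv(A)

# Eigenvalues of A are 2 and 4; those of the inverse should be 1/2 and 1/4.
print(np.sort(np.linalg.eigvals(A)))      # [2. 4.]
print(np.sort(np.linalg.eigvals(A_inv)))  # [0.25 0.5]
```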

Invertibility Criteria

Let's look at various ways to determine whether a matrix is invertible. For a square matrix $A \in \mathbb{K}^{n \times n}$, the following statements are equivalent:

  1. $A$ is invertible
  2. There exists a matrix $A^{-1} \in \mathbb{K}^{n \times n}$ with $A \cdot A^{-1} = I = A^{-1} \cdot A$
  3. $A$ has full rank: $\text{Rank} A = n$, or equivalently $\ker A = \{0\}$
  4. The columns of $A$ are linearly independent
  5. The rows of $A$ are linearly independent
  6. $\det A \neq 0$
  7. All eigenvalues of $A$ are non-zero

This theorem provides several equivalent ways to check whether a matrix is invertible; the eigenvalue criterion is often a particularly useful one.
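The equivalence can be exercised numerically by computing several of these criteria side by side (a sketch with NumPy; the helper name invertibility_checks and the tolerance are our own choices):

```python
import numpy as np

def invertibility_checks(A, tol=1e-10):
    """Evaluate three of the equivalent invertibility criteria numerically."""
    n = A.shape[0]
    return {
        "full_rank":       bool(np.linalg.matrix_rank(A) == n),              # criterion 3
        "nonzero_det":     bool(abs(np.linalg.det(A)) > tol),                # criterion 6
        "nonzero_eigvals": bool(np.all(np.abs(np.linalg.eigvals(A)) > tol)), # criterion 7
    }

print(invertibility_checks(np.array([[3.0, 1.0], [1.0, 3.0]])))  # all True
print(invertibility_checks(np.array([[1.0, 2.0], [2.0, 4.0]])))  # all False
```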