Linear Methods of AI

Eigenvalues, Eigenvectors, and Eigenspaces

Definition of Fundamental Concepts

In linear algebra, we are often interested in special vectors that have unique properties when multiplied by matrices. Imagine vectors that are only "stretched" or "shortened" by the matrix, but their direction remains unchanged.

Let $A \in \mathbb{K}^{n \times n}$ be a square matrix. An eigenvector $v \in \mathbb{K}^n$ for an eigenvalue $\lambda \in \mathbb{K}$ is a non-zero vector $v \neq 0$ that satisfies:

$$A \cdot v = \lambda \cdot v$$

This equation shows that when matrix $A$ operates on vector $v$, the result is a scalar multiple of the same vector.

By definition, eigenvalues can equal 0, but eigenvectors are always non-zero.
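The defining equation can be checked numerically. A minimal sketch using NumPy; the matrix and vector here are illustrative choices, not taken from the text:

```python
import numpy as np

# Illustrative matrix and candidate eigenvector (assumed for this sketch)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])   # a direction that A preserves

Av = A @ v                 # A acting on v
lam = Av[0] / v[0]         # the scaling factor lambda

# A*v equals lambda*v, so v is an eigenvector of A
assert np.allclose(Av, lam * v)
```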

Basic Properties of Eigenvectors

Eigenvectors have fundamental properties that are very useful in various mathematical applications.

Scalar Multiplication: Let $A \in \mathbb{K}^{n \times n}$ and let $v \in \mathbb{K}^n$ with $v \neq 0$ be an eigenvector of $A$ for eigenvalue $\lambda \in \mathbb{K}$. Then all multiples $t \cdot v$ with $t \neq 0$ are also eigenvectors of $A$ for the same eigenvalue $\lambda$.

Why is this true? $A \cdot (t \cdot v) = t \cdot (A \cdot v) = t \cdot (\lambda \cdot v) = \lambda \cdot (t \cdot v)$

This property shows that if we find one eigenvector, then all its non-zero multiples are also eigenvectors for the same eigenvalue.
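The scalar-multiplication property can be verified directly; this sketch reuses the symmetric matrix from the examples in this article, with an arbitrary non-zero scalar $t$:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
v = np.array([1.0, 1.0])   # eigenvector of A for lambda = 4
t = -2.5                   # any non-zero scalar

# t*v is still mapped to 4*(t*v): same eigenvalue, scaled eigenvector
assert np.allclose(A @ (t * v), 4.0 * (t * v))
```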

Examples of Eigenvector Calculations

Let's look at some concrete examples to better understand this concept:

Diagonal Matrix

For matrix $A = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$:

$v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_1 = 1$, because $A \cdot v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 1 \cdot v_1$

$v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_2 = 2$, because $A \cdot v_2 = \begin{pmatrix} 0 \\ 2 \end{pmatrix} = 2 \cdot v_2$
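These two checks can be reproduced with NumPy, which also confirms that for a diagonal matrix the eigenvalues are exactly the diagonal entries:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
eigenvalues, _ = np.linalg.eig(A)

# For a diagonal matrix the eigenvalues are the diagonal entries...
assert np.allclose(sorted(eigenvalues), [1.0, 2.0])

# ...and the standard basis vectors are eigenvectors
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
assert np.allclose(A @ v1, 1.0 * v1)
assert np.allclose(A @ v2, 2.0 * v2)
```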

Symmetric Matrix

For matrix $A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$:

$v_1 = \begin{pmatrix} -1 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_1 = 2$, because $A \cdot v_1 = \begin{pmatrix} -2 \\ 2 \end{pmatrix} = 2 \cdot v_1$

$v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_2 = 4$, because $A \cdot v_2 = \begin{pmatrix} 4 \\ 4 \end{pmatrix} = 4 \cdot v_2$
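The symmetric-matrix example verifies the same way:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
v1 = np.array([-1.0, 1.0])
v2 = np.array([1.0, 1.0])

assert np.allclose(A @ v1, 2.0 * v1)  # lambda_1 = 2
assert np.allclose(A @ v2, 4.0 * v2)  # lambda_2 = 4
```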

Linear Independence of Eigenvectors

One important result in eigenvector theory is that eigenvectors corresponding to different eigenvalues are always linearly independent.

Let $A \in \mathbb{K}^{n \times n}$ and let $\lambda_1, \ldots, \lambda_k \in \mathbb{K}$ be pairwise distinct eigenvalues of $A$, that is, $\lambda_i \neq \lambda_j$ for $i \neq j$ with $i, j \in \{1, \ldots, k\}$. Then the corresponding eigenvectors $v_1, \ldots, v_k \in \mathbb{K}^n$ are linearly independent.

This theorem can be proven using mathematical induction and has the important consequence that an n×nn \times nn×n matrix has at most nnn distinct eigenvalues.
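Linear independence of eigenvectors for distinct eigenvalues can be checked via the rank of the matrix whose columns are those eigenvectors; a sketch using the symmetric-matrix example above:

```python
import numpy as np

# Eigenvectors of [[3, 1], [1, 3]] for the distinct eigenvalues 2 and 4
V = np.column_stack([[-1.0, 1.0], [1.0, 1.0]])

# Linearly independent columns <=> the matrix has full column rank
assert np.linalg.matrix_rank(V) == 2
```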

Eigenspaces and Geometric Multiplicity

For each eigenvalue, we can define a vector space consisting of all eigenvectors corresponding to that eigenvalue.

Let $A \in \mathbb{K}^{n \times n}$ and $\lambda \in \mathbb{K}$. The set:

$$\text{Eig}_A(\lambda) = \{v \in \mathbb{K}^n : A \cdot v = \lambda \cdot v\}$$

is called the eigenspace of $A$ for eigenvalue $\lambda$. Its dimension, $\dim \text{Eig}_A(\lambda)$, is called the geometric multiplicity of eigenvalue $\lambda$ of $A$.

Properties of Eigenspaces

Eigenspaces have several important properties:

  1. Zero vector is not an eigenvector: The zero vector is not an eigenvector, but it is an element of $\text{Eig}_A(\lambda)$

  2. Set of eigenvectors: $\text{Eig}_A(\lambda) \setminus \{0\}$ is the set of all eigenvectors of $A$ corresponding to $\lambda$

  3. Eigenvalue condition: $\lambda$ is an eigenvalue of $A$ if and only if $\text{Eig}_A(\lambda) \neq \{0\}$

  4. Dimension bound: $0 \leq \dim \text{Eig}_A(\lambda) \leq n$

  5. Relationship with kernel: $\text{Eig}_A(0) = \{v \in \mathbb{K}^n : A \cdot v = 0\} = \ker A$

  6. General eigenspace: $\text{Eig}_A(\lambda) = \{v \in \mathbb{K}^n : (A - \lambda \cdot I) \cdot v = 0\} = \ker(A - \lambda \cdot I)$

  7. Intersection of eigenspaces: If $\lambda_1 \neq \lambda_2$, then $\text{Eig}_A(\lambda_1) \cap \text{Eig}_A(\lambda_2) = \{0\}$
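Property 6, $\text{Eig}_A(\lambda) = \ker(A - \lambda \cdot I)$, combined with rank-nullity gives a direct way to compute the geometric multiplicity. A sketch with two illustrative matrices (chosen for this example, not from the text):

```python
import numpy as np

def geometric_multiplicity(A, lam):
    """dim Eig_A(lam) = dim ker(A - lam*I) = n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

# Scaled identity: the eigenspace of 2 is the whole plane (multiplicity 2)
assert geometric_multiplicity(np.array([[2.0, 0.0], [0.0, 2.0]]), 2.0) == 2

# Jordan-type block: eigenvalue 2 has only a 1-dimensional eigenspace
assert geometric_multiplicity(np.array([[2.0, 1.0], [0.0, 2.0]]), 2.0) == 1

# 3 is not an eigenvalue of this matrix, so Eig_A(3) = {0}
assert geometric_multiplicity(np.array([[2.0, 0.0], [0.0, 2.0]]), 3.0) == 0
```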

Relationship with Invertibility

Eigenvalues have a close relationship with the invertibility property of matrices.

Matrix $A \in \mathbb{K}^{n \times n}$ is invertible if and only if all eigenvalues $\lambda \in \mathbb{K}$ of $A$ satisfy $\lambda \neq 0$.

Why is this true? $A$ is invertible if and only if $\text{Rank}\, A = n$, which is equivalent to $\ker A = \{0\}$. Since $\text{Eig}_A(0) = \ker A$, this holds exactly when $\lambda = 0$ is not an eigenvalue of $A$.

Eigenvalues of Inverse Matrix

If matrix $A$ is invertible and $v \neq 0$ is an eigenvector of $A$ for eigenvalue $\lambda \in \mathbb{K}$, then $v$ is also an eigenvector of $A^{-1}$ for eigenvalue $\frac{1}{\lambda}$.

Why is this true? From $A \cdot v = \lambda \cdot v$, multiplying both sides by $A^{-1}$ and by $\frac{1}{\lambda}$ (possible because $\lambda \neq 0$ for an invertible matrix) gives $\frac{1}{\lambda} \cdot v = A^{-1} \cdot v$.
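This relationship can be seen numerically with the symmetric matrix from the earlier example, whose eigenvalues 2 and 4 are both non-zero:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])   # invertible: eigenvalues 2 and 4
A_inv = np.linalg.inv(A)
v = np.array([1.0, 1.0])     # eigenvector of A for lambda = 4

# v is also an eigenvector of A^{-1}, now for eigenvalue 1/4
assert np.allclose(A_inv @ v, 0.25 * v)
```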

Invertibility Criteria

Let's look at various ways to determine whether a matrix is invertible. For a square matrix $A \in \mathbb{K}^{n \times n}$, the following statements are equivalent:

  1. $A$ is invertible
  2. There exists a matrix $A^{-1} \in \mathbb{K}^{n \times n}$ with $A \cdot A^{-1} = I = A^{-1} \cdot A$
  3. $A$ has full rank: $\text{Rank}\, A = n$, equivalently $\ker A = \{0\}$
  4. The columns of $A$ are linearly independent
  5. The rows of $A$ are linearly independent
  6. $\det A \neq 0$
  7. All eigenvalues of $A$ are non-zero

This theorem provides various equivalent ways to check whether a matrix is invertible, with eigenvalues being one of the very useful criteria.
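The equivalence of criteria 3, 6, and 7 can be cross-checked numerically; the singular matrix below is an illustrative choice:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Criteria 3, 6 and 7 all agree for this invertible matrix:
assert np.linalg.matrix_rank(A) == A.shape[0]                # full rank
assert not np.isclose(np.linalg.det(A), 0.0)                 # det A != 0
assert all(not np.isclose(l, 0.0) for l in np.linalg.eigvals(A))  # no zero eigenvalue

# A singular matrix fails all three at once:
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row is twice the first
assert np.linalg.matrix_rank(B) < B.shape[0]
assert np.isclose(np.linalg.det(B), 0.0)
assert any(np.isclose(l, 0.0) for l in np.linalg.eigvals(B))
```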
