Definition of Fundamental Concepts
In linear algebra, we are often interested in special vectors that behave in a distinctive way when multiplied by a matrix: the matrix only "stretches" or "shortens" them, while their direction (up to sign) remains unchanged.
Let $A \in \mathbb{R}^{n \times n}$ be a square matrix. An eigenvector of $A$ for an eigenvalue $\lambda \in \mathbb{R}$ is a non-zero vector $x \neq 0$ that satisfies:

$$Ax = \lambda x$$

This equation shows that when the matrix $A$ operates on the vector $x$, the result $\lambda x$ is a scalar multiple of the same vector.
By definition, an eigenvalue may equal $0$, but an eigenvector must always be non-zero ($x \neq 0$).
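The defining equation can be checked numerically with NumPy. The following sketch uses a small illustrative matrix (the specific values are an assumption, chosen only for the example):

```python
import numpy as np

# Illustrative 2x2 matrix (example values).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]
    # Check the defining equation A x = lambda x.
    assert np.allclose(A @ x, lam * x)
```

Note that `np.linalg.eig` normalizes each eigenvector to unit length; any non-zero multiple would satisfy the same equation.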
Basic Properties of Eigenvectors
Eigenvectors have fundamental properties that are very useful in various mathematical applications.
Scalar Multiplication: Let $A \in \mathbb{R}^{n \times n}$ and let $x \in \mathbb{R}^n$ with $x \neq 0$ be an eigenvector of $A$ for the eigenvalue $\lambda$. Then all multiples $cx$ with $c \neq 0$ are also eigenvectors of $A$ for the same eigenvalue $\lambda$.
Why is this true? Because $A(cx) = c(Ax) = c(\lambda x) = \lambda(cx)$.
This property shows that if we find one eigenvector, then all its non-zero multiples are also eigenvectors for the same eigenvalue.
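This scaling property is easy to verify numerically. The matrix and eigenvector below are illustrative assumptions:

```python
import numpy as np

# Illustrative diagonal matrix with a known eigenvector (example values).
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
x = np.array([1.0, 0.0])   # eigenvector for eigenvalue 2
lam = 2.0

for c in [0.5, -1.0, 10.0]:   # any non-zero scalar
    y = c * x
    # c*x is still an eigenvector for the same eigenvalue lambda.
    assert np.allclose(A @ y, lam * y)
```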
Examples of Eigenvector Calculations
Let's look at some concrete examples to better understand this concept:
Diagonal Matrix
Consider, for example, the diagonal matrix

$$D = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$$

$e_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_1 = 2$, because $De_1 = \begin{pmatrix} 2 \\ 0 \end{pmatrix} = 2e_1$.
$e_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_2 = 3$, because $De_2 = \begin{pmatrix} 0 \\ 3 \end{pmatrix} = 3e_2$.
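For a diagonal matrix, the standard basis vectors are eigenvectors and the diagonal entries are the eigenvalues. A minimal check, assuming the illustrative diagonal matrix $\operatorname{diag}(2, 3)$:

```python
import numpy as np

# Illustrative diagonal matrix (example values).
D = np.array([[2.0, 0.0],
              [0.0, 3.0]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# The standard basis vectors are eigenvectors; the diagonal
# entries are the corresponding eigenvalues.
assert np.allclose(D @ e1, 2.0 * e1)
assert np.allclose(D @ e2, 3.0 * e2)
```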
Symmetric Matrix
Consider, for example, the symmetric matrix

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$$

$v_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_1 = 3$, because $Av_1 = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3v_1$.
$v_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_2 = 1$, because $Av_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix} = 1 \cdot v_2$.
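A quick numerical check, assuming the illustrative symmetric matrix with entries 2 on the diagonal and 1 off the diagonal; it also confirms that for a symmetric matrix, eigenvectors for distinct eigenvalues are orthogonal:

```python
import numpy as np

# Illustrative symmetric matrix (example values).
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v1 = np.array([1.0, 1.0])    # eigenvector for eigenvalue 3
v2 = np.array([1.0, -1.0])   # eigenvector for eigenvalue 1

assert np.allclose(S @ v1, 3.0 * v1)
assert np.allclose(S @ v2, 1.0 * v2)

# For a symmetric matrix, eigenvectors for distinct
# eigenvalues are orthogonal.
assert np.isclose(v1 @ v2, 0.0)
```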
Linear Independence of Eigenvectors
One very important result in eigenvector theory is that eigenvectors corresponding to distinct eigenvalues are always linearly independent. Precisely: let $\lambda_1, \dots, \lambda_k$ be pairwise distinct eigenvalues of $A$, that is, $\lambda_i \neq \lambda_j$ for $i, j \in \{1, \dots, k\}$ with $i \neq j$, and let $x_1, \dots, x_k$ be corresponding eigenvectors. Then $x_1, \dots, x_k$ are linearly independent.
This theorem can be proven by mathematical induction and has the important consequence that an $n \times n$ matrix has at most $n$ distinct eigenvalues.
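Linear independence of the eigenvectors can be observed numerically: if the eigenvalues are distinct, the matrix whose columns are the eigenvectors has full rank. The triangular matrix below is an illustrative assumption:

```python
import numpy as np

# Illustrative matrix with two distinct eigenvalues, 2 and 5
# (eigenvalues of a triangular matrix are its diagonal entries).
A = np.array([[2.0, 1.0],
              [0.0, 5.0]])

eigenvalues, V = np.linalg.eig(A)   # columns of V are eigenvectors

# Distinct eigenvalues -> linearly independent eigenvectors,
# so the eigenvector matrix V has full rank.
assert len(set(np.round(eigenvalues, 8))) == 2
assert np.linalg.matrix_rank(V) == 2
```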
Eigenspaces and Geometric Multiplicity
For each eigenvalue, we can define a vector space consisting of all eigenvectors corresponding to that eigenvalue.
Let $A \in \mathbb{R}^{n \times n}$ and $\lambda \in \mathbb{R}$. The set:

$$\operatorname{Eig}_A(\lambda) = \{ x \in \mathbb{R}^n : Ax = \lambda x \}$$

is called the eigenspace of $A$ for the eigenvalue $\lambda$. The dimension of $\operatorname{Eig}_A(\lambda)$:

$$\gamma_A(\lambda) = \dim \operatorname{Eig}_A(\lambda)$$

is called the geometric multiplicity of the eigenvalue $\lambda$ of $A$.
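Since the eigenspace is the kernel of $A - \lambda I_n$, the geometric multiplicity can be computed by rank-nullity as $n - \operatorname{rank}(A - \lambda I_n)$. A sketch using the identity matrix as an illustrative example (its only eigenvalue, $1$, has a two-dimensional eigenspace):

```python
import numpy as np

# Illustrative example: the 2x2 identity matrix has eigenvalue 1
# with geometric multiplicity 2 (every non-zero vector is an eigenvector).
A = np.eye(2)
lam = 1.0
n = A.shape[0]

# Geometric multiplicity = dim ker(A - lam*I) = n - rank(A - lam*I).
gamma = n - np.linalg.matrix_rank(A - lam * np.eye(n))
assert gamma == 2
```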
Properties of Eigenspaces
Eigenspaces have several important properties:
- Zero vector is not an eigenvector: The zero vector $0$ is not an eigenvector, but it is an element of $\operatorname{Eig}_A(\lambda)$
- Set of eigenvectors: $\operatorname{Eig}_A(\lambda) \setminus \{0\}$ is the set of all eigenvectors of $A$ corresponding to $\lambda$
- Eigenvalue condition: $\lambda$ is an eigenvalue of $A$ if and only if $\operatorname{Eig}_A(\lambda) \neq \{0\}$
- Dimension bound: If $\lambda$ is an eigenvalue of $A$, then $1 \leq \gamma_A(\lambda) \leq n$
- Relationship with kernel: $\operatorname{Eig}_A(0) = \ker(A)$
- General eigenspace: $\operatorname{Eig}_A(\lambda) = \ker(A - \lambda I_n)$ for every $\lambda \in \mathbb{R}$
- Intersection of eigenspaces: If $\lambda_1 \neq \lambda_2$, then $\operatorname{Eig}_A(\lambda_1) \cap \operatorname{Eig}_A(\lambda_2) = \{0\}$
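The eigenvalue condition above can be tested numerically: $\lambda$ is an eigenvalue exactly when $A - \lambda I_n$ has non-trivial kernel, i.e. less than full rank. A sketch with an illustrative symmetric matrix whose eigenvalues are $1$ and $3$:

```python
import numpy as np

# Illustrative matrix with eigenvalues 1 and 3 (example values).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def is_eigenvalue(A, lam):
    # lam is an eigenvalue iff ker(A - lam*I) != {0},
    # i.e. A - lam*I does not have full rank.
    n = A.shape[0]
    return np.linalg.matrix_rank(A - lam * np.eye(n)) < n

assert is_eigenvalue(A, 1.0)
assert is_eigenvalue(A, 3.0)
assert not is_eigenvalue(A, 2.0)
```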
Relationship with Invertibility
Eigenvalues have a close relationship with the invertibility property of matrices.
Now, let's look at an interesting relationship between invertibility and eigenvalues. A matrix $A \in \mathbb{R}^{n \times n}$ is invertible if and only if every eigenvalue $\lambda$ of $A$ satisfies $\lambda \neq 0$.
Why is this true? $A$ is invertible if and only if $\ker(A) = \{0\}$, which means $\operatorname{Eig}_A(0) = \ker(A) = \{0\}$, so $0$ is not an eigenvalue of $A$.
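A numerical illustration with a singular matrix (the values are an illustrative assumption; the second row is twice the first, so the determinant is zero):

```python
import numpy as np

# Singular illustrative matrix: second row is 2x the first row.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigenvalues, _ = np.linalg.eig(A)

# A has 0 as an eigenvalue, consistent with A being non-invertible.
assert np.any(np.isclose(eigenvalues, 0.0))
assert np.isclose(np.linalg.det(A), 0.0)
```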
Eigenvalues of Inverse Matrix
If the matrix $A$ is invertible and $x$ is an eigenvector of $A$ for the eigenvalue $\lambda$, then $x$ is also an eigenvector of $A^{-1}$, for the eigenvalue $\lambda^{-1}$.
Why is this true? From $Ax = \lambda x$, multiplying both sides by $A^{-1}$ and then by $\lambda^{-1}$ (which exists, since $\lambda \neq 0$ for an invertible $A$), we get $A^{-1}x = \lambda^{-1}x$.
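This relationship is easy to confirm numerically; the invertible matrix below is an illustrative assumption:

```python
import numpy as np

# Invertible illustrative matrix (example values, det = 10).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
A_inv = np.linalg.inv(A)

eigenvalues, V = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    x = V[:, i]
    # x is also an eigenvector of A^{-1}, for the eigenvalue 1/lam.
    assert np.allclose(A_inv @ x, (1.0 / lam) * x)
```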
Invertibility Criteria
Let's look at various ways to determine whether a matrix is invertible. For a square matrix $A \in \mathbb{R}^{n \times n}$, the following statements are equivalent:
- $A$ is invertible
- There exists a matrix $B \in \mathbb{R}^{n \times n}$ with $AB = BA = I_n$
- $A$ has full rank, or $\operatorname{rank}(A) = n$
- The columns of $A$ are linearly independent
- The rows of $A$ are linearly independent
- All eigenvalues of $A$ are not equal to $0$
This theorem provides various equivalent ways to check whether a matrix is invertible, with eigenvalues being one of the very useful criteria.
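Several of these equivalent criteria can be checked side by side in NumPy. The matrix below is an illustrative assumption (a symmetric matrix with determinant 5, hence invertible):

```python
import numpy as np

# Invertible illustrative matrix (example values).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
n = A.shape[0]

# Equivalent invertibility criteria, checked numerically:
assert not np.isclose(np.linalg.det(A), 0.0)       # non-zero determinant
assert np.linalg.matrix_rank(A) == n               # full rank

eigenvalues, _ = np.linalg.eig(A)
assert not np.any(np.isclose(eigenvalues, 0.0))    # no zero eigenvalue

B = np.linalg.inv(A)
assert np.allclose(A @ B, np.eye(n))               # A B = I_n
assert np.allclose(B @ A, np.eye(n))               # B A = I_n
```

All the checks agree, as the theorem guarantees: each criterion holds for this matrix precisely because the others do.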