# Nakafa Framework: LLM
URL: /en/subject/university/bachelor/ai-ds/linear-methods/eigenvalue-eigenvector-eigenspace
Source: https://raw.githubusercontent.com/nakafaai/nakafa.com/refs/heads/main/packages/contents/subject/university/bachelor/ai-ds/linear-methods/eigenvalue-eigenvector-eigenspace/en.mdx
Output docs content for large language models.
---
export const metadata = {
    title: "Eigenvalues, Eigenvectors, and Eigenspaces",
    description: "Master fundamental eigenvalue concepts: understand how matrices transform vectors, calculate eigenvectors, and explore eigenspace properties.",
    authors: [{ name: "Nabil Akbarazzima Fatih" }],
    date: "07/12/2025",
    subject: "Linear Methods of AI",
};
## Definition of Fundamental Concepts
In linear algebra, we are often interested in special vectors that have unique properties when multiplied by matrices. Imagine vectors that are only "stretched" or "shortened" by the matrix, but their direction remains unchanged.
Let $A \in \mathbb{R}^{n \times n}$ be a square matrix. An **eigenvector** $v \in \mathbb{R}^n$ for an **eigenvalue** $\lambda \in \mathbb{R}$ is a non-zero vector $v \neq 0$ that satisfies:

$$Av = \lambda v$$

This equation shows that when the matrix $A$ operates on the vector $v$, the result is a scalar multiple $\lambda v$ of the same vector.
> By definition, eigenvalues can equal 0, but eigenvectors are always non-zero.
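To make the definition tangible, here is a minimal NumPy sketch that checks the defining equation $Av = \lambda v$ numerically. The matrix, vector, and eigenvalue below are illustrative choices for this sketch, not values from the text:

```python
import numpy as np

# Illustrative 2x2 matrix with a known eigenpair (assumed for this sketch).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 1.0])  # candidate eigenvector
lam = 3.0                 # candidate eigenvalue

# The defining equation: A v must equal lambda * v.
print(np.allclose(A @ v, lam * v))  # True
```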
## Basic Properties of Eigenvectors
Eigenvectors have fundamental properties that are very useful in various mathematical applications.
**Scalar Multiplication**: Let $A \in \mathbb{R}^{n \times n}$ and let $v \in \mathbb{R}^n$ with $v \neq 0$ be an eigenvector of $A$ for eigenvalue $\lambda$. Then all multiples $\mu v$ with $\mu \neq 0$ are also eigenvectors of $A$ for the same eigenvalue $\lambda$.

Why is this true?

$$A(\mu v) = \mu (Av) = \mu (\lambda v) = \lambda (\mu v)$$
This property shows that if we find one eigenvector, then all its non-zero multiples are also eigenvectors for the same eigenvalue.
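We can confirm this numerically. Continuing with the illustrative eigenpair from the sketch above, every non-zero multiple $\mu v$ passes the same check:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 1.0])  # eigenvector for eigenvalue 3 (illustrative)
lam = 3.0

# Every non-zero multiple of v satisfies A(mu*v) = lambda*(mu*v).
for mu in (2.0, -0.5, 10.0):
    w = mu * v
    print(np.allclose(A @ w, lam * w))  # True for each mu
```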
## Examples of Eigenvector Calculations
Let's look at some concrete examples to better understand this concept:
### Diagonal Matrix
For the diagonal matrix $A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$:

$v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_1 = 2$, because $Av_1 = \begin{pmatrix} 2 \\ 0 \end{pmatrix} = 2v_1$

$v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_2 = 3$, because $Av_2 = \begin{pmatrix} 0 \\ 3 \end{pmatrix} = 3v_2$
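A short sketch verifying these two identities directly, using the illustrative diagonal matrix above:

```python
import numpy as np

A = np.diag([2.0, 3.0])       # the diagonal matrix from the example
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# For a diagonal matrix, the standard basis vectors are eigenvectors
# and the diagonal entries are the eigenvalues.
print(np.allclose(A @ e1, 2.0 * e1))  # True: eigenvalue 2
print(np.allclose(A @ e2, 3.0 * e2))  # True: eigenvalue 3
```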
### Symmetric Matrix
For the symmetric matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$:

$v_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_1 = 3$, because $Av_1 = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3v_1$

$v_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ is an eigenvector for eigenvalue $\lambda_2 = -1$, because $Av_2 = \begin{pmatrix} -1 \\ 1 \end{pmatrix} = -v_2$
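Instead of checking candidate vectors by hand, we can let a solver find them. A minimal sketch with `np.linalg.eig`, which returns unit-length eigenvectors as columns, possibly in a different order and scaling than the hand calculation:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # approximately [ 3. -1.] (order may vary)
print(eigenvectors)  # columns are multiples of (1, 1) and (1, -1)
```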
## Linear Independence of Eigenvectors
One important result in eigenvector theory is that eigenvectors corresponding to different eigenvalues are always linearly independent. More precisely, let $\lambda_1, \dots, \lambda_k$ be pairwise distinct eigenvalues of $A \in \mathbb{R}^{n \times n}$, that is, $\lambda_i \neq \lambda_j$ for $i, j \in \{1, \dots, k\}$ with $i \neq j$. Then corresponding eigenvectors $v_1, \dots, v_k$ are linearly independent.
This theorem can be proven using mathematical induction and has the important consequence that an $n \times n$ matrix has at most $n$ distinct eigenvalues.
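A quick numerical way to see the independence: stack eigenvectors for distinct eigenvalues as columns and check that the resulting matrix has full column rank. This sketch reuses the illustrative symmetric example above:

```python
import numpy as np

v1 = np.array([1.0, 1.0])    # eigenvector for eigenvalue 3
v2 = np.array([1.0, -1.0])   # eigenvector for eigenvalue -1

# Full column rank of the stacked matrix means the vectors are
# linearly independent.
V = np.column_stack([v1, v2])
print(np.linalg.matrix_rank(V) == V.shape[1])  # True
```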
## Eigenspaces and Geometric Multiplicity
For each eigenvalue, we can define a vector space consisting of all eigenvectors corresponding to that eigenvalue.
Let $A \in \mathbb{R}^{n \times n}$ and $\lambda \in \mathbb{R}$. The set

$$\text{Eig}_A(\lambda) = \{v \in \mathbb{R}^n : Av = \lambda v\}$$

is called the **eigenspace** of $A$ for eigenvalue $\lambda$. Its dimension

$$\dim \text{Eig}_A(\lambda)$$

is called the **geometric multiplicity** of eigenvalue $\lambda$ of $A$.
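In computations, the eigenspace can be obtained as a null space. A sketch, assuming SciPy is available for `scipy.linalg.null_space`; the scaled identity matrix here is an illustrative choice whose eigenspace is all of $\mathbb{R}^2$:

```python
import numpy as np
from scipy.linalg import null_space

A = 2.0 * np.eye(2)   # every non-zero vector is an eigenvector for lambda = 2
lam = 2.0

# Eig_A(lambda) = ker(A - lambda*I); the number of basis columns is the
# geometric multiplicity of lambda.
basis = null_space(A - lam * np.eye(2))
print(basis.shape[1])  # 2
```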
### Properties of Eigenspaces
Eigenspaces have several important properties:
1. **Zero vector is not an eigenvector**: The zero vector is not an eigenvector, but it is an element of $\text{Eig}_A(\lambda)$
2. **Set of eigenvectors**: $\text{Eig}_A(\lambda) \setminus \{0\}$ is the set of all eigenvectors of $A$ corresponding to $\lambda$
3. **Eigenvalue condition**: $\lambda$ is an eigenvalue of $A$ if and only if $\text{Eig}_A(\lambda) \neq \{0\}$
4. **Dimension bound**: $0 \leq \dim \text{Eig}_A(\lambda) \leq n$
5. **Relationship with kernel**: $\text{Eig}_A(\lambda) = \ker(A - \lambda I_n)$
6. **General eigenspace**: $\text{Eig}_A(\lambda)$ is a subspace of $\mathbb{R}^n$ for every $\lambda \in \mathbb{R}$, even when $\lambda$ is not an eigenvalue
7. **Intersection of eigenspaces**: If $\lambda_1 \neq \lambda_2$, then $\text{Eig}_A(\lambda_1) \cap \text{Eig}_A(\lambda_2) = \{0\}$ (see the sketch after this list)
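Here is the sketch referenced in property 7. It computes bases of two eigenspaces via property 5 and confirms numerically that eigenspaces for distinct eigenvalues meet only in the zero vector, again using the illustrative symmetric matrix from earlier (SciPy assumed available):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# Property 5: Eig_A(lambda) = ker(A - lambda*I).
B1 = null_space(A - 3.0 * np.eye(2))  # basis of Eig_A(3)
B2 = null_space(A + 1.0 * np.eye(2))  # basis of Eig_A(-1), since A-(-1)I = A+I

# Property 7: the combined basis is linearly independent, so the two
# eigenspaces intersect only in {0}.
combined = np.hstack([B1, B2])
print(np.linalg.matrix_rank(combined) == B1.shape[1] + B2.shape[1])  # True
```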
## Relationship with Invertibility
Eigenvalues have a close relationship with the invertibility of matrices: a matrix $A \in \mathbb{R}^{n \times n}$ is invertible if and only if all eigenvalues $\lambda$ of $A$ satisfy $\lambda \neq 0$.

Why is this true? $A$ is invertible if and only if $\ker(A) = \{0\}$, which means $\text{Eig}_A(0) = \{0\}$, so $0$ is not an eigenvalue of $A$.
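A numerical illustration with a hypothetical singular matrix (its second row is twice the first, so it cannot be invertible):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # singular: rows are linearly dependent

eigenvalues = np.linalg.eigvals(A)
print(np.isclose(eigenvalues, 0.0).any())  # True: 0 is an eigenvalue
print(np.isclose(np.linalg.det(A), 0.0))   # True: A is not invertible
```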
### Eigenvalues of Inverse Matrix
If matrix $A$ is invertible and $v$ is an eigenvector of $A$ for eigenvalue $\lambda$, then $v$ is also an eigenvector of $A^{-1}$ for eigenvalue $\lambda^{-1}$.

Why is this true? From $Av = \lambda v$, multiplying both sides by $A^{-1}$ and by $\lambda^{-1}$ (which exists because $\lambda \neq 0$ for an invertible $A$), we get $A^{-1}v = \lambda^{-1}v$.
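A minimal check of this fact, reusing the illustrative symmetric matrix with eigenvector $(1, 1)^T$ for eigenvalue $3$:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])   # invertible, eigenvalues 3 and -1
v = np.array([1.0, 1.0])     # eigenvector of A for eigenvalue 3

A_inv = np.linalg.inv(A)
# The same vector is an eigenvector of A^{-1} for the reciprocal eigenvalue.
print(np.allclose(A_inv @ v, (1.0 / 3.0) * v))  # True
```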
## Invertibility Criteria
Let's look at various ways to determine whether a matrix is invertible. For a square matrix $A \in \mathbb{R}^{n \times n}$, the following statements are equivalent:
1. $A$ is invertible
2. There exists a matrix $A^{-1}$ with $AA^{-1} = A^{-1}A = I_n$
3. $A$ has full rank: $\operatorname{rank}(A) = n$, or equivalently $\ker(A) = \{0\}$
4. The columns of $A$ are linearly independent
5. The rows of $A$ are linearly independent
6. $\det(A) \neq 0$
7. All eigenvalues of $A$ are non-zero
This theorem provides several equivalent ways to check whether a matrix is invertible; the eigenvalue criterion is often especially convenient, as the sketch below illustrates.
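The sketch bundles a few of these criteria into one hypothetical helper (the function name and structure are illustrative, not part of the text); in floating-point arithmetic the checks use tolerances rather than exact comparisons:

```python
import numpy as np

def invertibility_report(A: np.ndarray) -> dict:
    """Evaluate some of the equivalent invertibility criteria numerically."""
    n = A.shape[0]
    return {
        "full_rank": np.linalg.matrix_rank(A) == n,               # criterion 3
        "nonzero_det": not np.isclose(np.linalg.det(A), 0.0),     # criterion 6
        "nonzero_eigenvalues":
            not np.isclose(np.linalg.eigvals(A), 0.0).any(),      # criterion 7
    }

print(invertibility_report(np.array([[1.0, 2.0], [2.0, 1.0]])))  # all True
print(invertibility_report(np.array([[1.0, 2.0], [2.0, 4.0]])))  # all False
```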