Definition of Identifiability
Identifiability concerns whether all parameters in a model can be uniquely determined from the available data. Imagine a detective trying to identify suspects from the available clues: if the clues are sufficient and not contradictory, the suspects can be identified with certainty.
For a matrix $A \in \mathbb{R}^{m \times n}$ and a vector $b \in \mathbb{R}^m$ with $m \geq n$, the least squares problem
$$\min_{x \in \mathbb{R}^n} \|Ax - b\|_2^2$$
aims to estimate the parameter vector $x$ through the corresponding normal equation system
$$A^\top A\, x = A^\top b.$$
When this system has a unique solution, all parameters can be identified.
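The following is a minimal sketch in Python with NumPy (the matrix $A$ and vector $b$ are made up for illustration) showing how the normal equations can be solved directly, alongside a library least squares solver for comparison:

```python
import numpy as np

# Illustrative design matrix A (m >= n) and observation vector b.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.9])

# Solve the normal equation system A^T A x = A^T b directly.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# For comparison: NumPy's least squares solver, which avoids forming A^T A.
x_lstsq, residuals, rank, sing_vals = np.linalg.lstsq(A, b, rcond=None)

print(x_normal)
print(x_lstsq)
```

Because $A$ has full rank in this example, both approaches return the same estimate up to rounding error.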
Full Identifiability Condition
All parameters can be identified precisely when the matrix $A$ has full rank $n$.
Mathematically, this condition can be written as
$$\operatorname{rank}(A) = n.$$
The full rank condition is like ensuring that each parameter contributes truly new, non-overlapping information, much like a detective case with enough independent clues to identify each suspect without confusion. Each parameter provides information that cannot be obtained from the other parameters, making the estimate unique and stable.
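As a small sketch (assuming NumPy; the matrix is illustrative), the full-rank condition can be checked by comparing the computed rank of $A$ with the number of parameters $n$:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
m, n = A.shape

# All parameters are identifiable exactly when rank(A) = n.
fully_identifiable = np.linalg.matrix_rank(A) == n
print(fully_identifiable)  # True for this example
```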
Matrix Rank Determination
The rank of the matrix $A$ can be obtained as a by-product of a QR decomposition or an LU decomposition of $A$. However, a more computationally expensive but numerically more stable approach is to determine the rank using the singular value decomposition of $A$.
The difference between these two approaches is like comparing a measurement made with a regular ruler to one made with a high-precision instrument. The singular value decomposition provides more detailed and robust information about the numerical structure of a matrix, especially in cases approaching singularity.
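The sketch below (assuming NumPy and SciPy; the matrix and tolerance are illustrative) contrasts the two approaches: a rank estimate from a column-pivoted QR factorization versus one from the singular values:

```python
import numpy as np
from scipy.linalg import qr  # column-pivoted QR factorization

# Nearly rank-deficient matrix: the third column is almost the sum of the first two.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0 + 1e-13]])
tol = 1e-10

# Rank estimate from column-pivoted QR: count the "large" diagonal entries of R.
Q, R, piv = qr(A, pivoting=True)
rank_qr = np.sum(np.abs(np.diag(R)) > tol)

# Rank estimate from the SVD: count the singular values above the tolerance.
sigma = np.linalg.svd(A, compute_uv=False)
rank_svd = np.sum(sigma > tol)

print(rank_qr, rank_svd)
```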
Singular Value Decomposition
For a matrix $A \in \mathbb{R}^{m \times n}$ with $m \geq n$, there exist orthogonal matrices $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ as well as a matrix $\Sigma \in \mathbb{R}^{m \times n}$ with $\Sigma_{ij} = 0$ for all $i \neq j$ and non-negative diagonal entries $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_n \geq 0$, such that
$$A = U \Sigma V^\top.$$
This representation is called the singular value decomposition of $A$. The values $\sigma_i$ are called the singular values of $A$. The matrices $U$ and $V$ are not uniquely determined.
This decomposition is like dismantling a complex machine into its basic components. We can see how the matrix transforms the vector space, including the main directions of the transformation and how much scaling occurs in each direction.
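A brief sketch (NumPy assumed; the matrix is illustrative) that computes the decomposition and verifies the factorization $A = U \Sigma V^\top$:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Full SVD: U is m x m, Vt is n x n, s holds the singular values in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n matrix Sigma with the singular values on its main diagonal.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

# Check the reconstruction A = U Sigma V^T up to rounding error.
print(np.allclose(A, U @ Sigma @ Vt))  # True
```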
Relationship Between Singular Values and Rank
The number of non-zero singular values of the matrix $A$ equals $\operatorname{rank}(A)$.
Mathematically, this means
$$\operatorname{rank}(A) = \#\{\, i \in \{1, \dots, n\} : \sigma_i \neq 0 \,\},$$
where $\#$ denotes the number of elements in the set.
This fundamental property provides a numerically stable way to determine matrix rank. Very small singular values are like weak radio signals, still present but barely detectable.
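A minimal sketch (NumPy assumed; the matrix and tolerance are illustrative) of determining the rank by counting the non-zero singular values:

```python
import numpy as np

# Rank-2 matrix: the third column is the sum of the first two.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])

sigma = np.linalg.svd(A, compute_uv=False)

# rank(A) = number of non-zero singular values (non-zero up to rounding error).
rank = np.sum(sigma > 1e-12)
print(sigma)  # the last singular value is essentially zero
print(rank)   # 2
```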
Rank-Deficient Condition
The term "rank-deficient" refers to the condition when a matrix does not have full rank. That is, . In this context, some rows or columns of the matrix are linearly dependent.
When a matrix is rank-deficient, some singular values become zero or very close to zero
This condition indicates that the equation system has more than one solution or may not even have a unique solution. In numerical practice, we often use a threshold to determine whether a singular value is considered zero
where typically ranges between to depending on computational precision.
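The following sketch (NumPy assumed; the matrix and threshold are illustrative) detects rank deficiency with a relative threshold on the singular values:

```python
import numpy as np

# Nearly rank-deficient matrix: the second column barely differs from the first.
A = np.array([[1.0, 1.0 + 1e-14],
              [1.0, 1.0],
              [1.0, 1.0]])

sigma = np.linalg.svd(A, compute_uv=False)

# Relative threshold: treat sigma_i as zero when it is tiny compared with sigma_1.
tau = 1e-10 * sigma[0]
numerical_rank = np.sum(sigma > tau)

print(sigma)                        # one large and one nearly zero singular value
print(numerical_rank < A.shape[1])  # True: A is numerically rank-deficient
```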
Singular Value Decomposition Computation
The singular value decomposition can be computed using the eigenvalues and eigenvectors of $A^\top A$. The mathematical relationship is
$$A^\top A = V \left(\Sigma^\top \Sigma\right) V^\top,$$
where $\Sigma^\top \Sigma$ is a diagonal matrix that has the values $\sigma_i^2$ on the main diagonal and zeros elsewhere.
Numerical libraries provide dedicated routines for this computation, usually called SVD (singular value decomposition) routines. SVD implementations in modern numerical libraries use highly efficient and stable algorithms, making them reliable tools for a wide range of applications in matrix analysis and scientific computing.
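As a concluding sketch (NumPy assumed; the matrix is illustrative), the singular values obtained from the eigenvalues of $A^\top A$ can be compared against those returned by a library SVD routine:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Eigen-decomposition of the symmetric matrix A^T A = V (Sigma^T Sigma) V^T.
eigvals, V = np.linalg.eigh(A.T @ A)

# The eigenvalues of A^T A are the squared singular values; sort in decreasing order.
order = np.argsort(eigvals)[::-1]
sigma_from_eig = np.sqrt(np.clip(eigvals[order], 0.0, None))

# Compare with the singular values from the library SVD routine.
sigma_svd = np.linalg.svd(A, compute_uv=False)
print(np.allclose(sigma_from_eig, sigma_svd))  # True
```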