Ch05. Eigenvalues and Eigenvectors
5.3 Diagonalization
- In many cases, the eigenvalue–eigenvector information contained within a matrix $A$ can be displayed in a useful factorization of the form $A = PDP^{-1}$, where $D$ is a diagonal matrix.
- In this section, the factorization enables us to compute $A^k$ quickly for large values of $k$, a fundamental idea in several applications of linear algebra (see the sketch below).
- Later, in Sections 5.6 and 5.7, the factorization will be used to analyze (and decouple) dynamical systems.
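- As a quick illustration (a minimal numpy sketch, not from the text, using an arbitrary diagonalizable matrix):

```python
import numpy as np

# Sketch: if A = P D P^{-1} with D diagonal, then A^k = P D^k P^{-1},
# and D^k only needs k-th powers of the diagonal entries.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # illustrative matrix (eigenvalues 5 and 2)
evals, P = np.linalg.eig(A)           # columns of P are eigenvectors

k = 8
A_k_fast = P @ np.diag(evals**k) @ np.linalg.inv(P)   # via the factorization
A_k_slow = np.linalg.matrix_power(A, k)               # repeated multiplication
print(np.allclose(A_k_fast, A_k_slow))                # True
```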
Example 2:
Let $A = \begin{bmatrix} 7 & 2 \\ -4 & 1 \end{bmatrix}$. Find a formula for $A^k$, given that $A = PDP^{-1}$, where
$$P = \begin{bmatrix} 1 & 1 \\ -1 & -2 \end{bmatrix} \quad \text{and} \quad D = \begin{bmatrix} 5 & 0 \\ 0 & 3 \end{bmatrix}$$
Solution:
The standard formula for the inverse of a $2 \times 2$ matrix yields $P^{-1} = \begin{bmatrix} 2 & 1 \\ -1 & -1 \end{bmatrix}$.
- Then, by associativity of matrix multiplication, $A^2 = (PDP^{-1})(PDP^{-1}) = PD(P^{-1}P)DP^{-1} = PD^2P^{-1}$.
- Again, $A^3 = (PDP^{-1})A^2 = (PDP^{-1})(PD^2P^{-1}) = PD^3P^{-1}$.
- In general, for $k \ge 1$,
$$A^k = PD^kP^{-1} = \begin{bmatrix} 1 & 1 \\ -1 & -2 \end{bmatrix}\begin{bmatrix} 5^k & 0 \\ 0 & 3^k \end{bmatrix}\begin{bmatrix} 2 & 1 \\ -1 & -1 \end{bmatrix} = \begin{bmatrix} 2 \cdot 5^k - 3^k & 5^k - 3^k \\ 2 \cdot 3^k - 2 \cdot 5^k & 2 \cdot 3^k - 5^k \end{bmatrix}$$
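- A short numpy check (not part of the text) that this closed form agrees with direct matrix powers:

```python
import numpy as np

# Compare the closed form for A^k with repeated multiplication of A.
A = np.array([[7.0, 2.0],
              [-4.0, 1.0]])

def A_power_formula(k):
    # Entries of A^k = P D^k P^{-1}, written out explicitly.
    return np.array([[2 * 5**k - 3**k,     5**k - 3**k],
                     [2 * 3**k - 2 * 5**k, 2 * 3**k - 5**k]], dtype=float)

for k in range(1, 6):
    assert np.allclose(A_power_formula(k), np.linalg.matrix_power(A, k))
print("closed form matches direct powers")
```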
A square matrix $A$ is said to be diagonalizable
- if $A$ is similar to a diagonal matrix, that is,
- if $A = PDP^{-1}$ for some invertible matrix $P$ and some diagonal matrix $D$.
- Example 2 shows the usefulness of diagonalization (i.e., of the factorization).
- Theorem 5 describes the properties of diagonalizable matrices and tells us how to carry out the factorization.
Theorem 5: The Diagonalization Theorem
An $n \times n$ matrix $A$ is diagonalizable if and only if $A$ has $n$ linearly independent eigenvectors.
- In fact, $A = PDP^{-1}$, with $D$ a diagonal matrix, if and only if
- the columns of $P$ are $n$ linearly independent eigenvectors of $A$.
- In this case, the diagonal entries of $D$ are eigenvalues of $A$ that correspond, respectively, to the eigenvectors in $P$.
- In other words, $A$ is diagonalizable if and only if
- there are enough eigenvectors to form a basis of $\mathbb{R}^n$.
- We call such a basis an eigenvector basis of $\mathbb{R}^n$.
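- A small numerical illustration of the criterion (a sketch with an illustrative matrix of my own, not from the text):

```python
import numpy as np

# Build P from eigenvectors and D from eigenvalues, then check whether A has
# n linearly independent eigenvectors (equivalently, whether P is invertible).
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])       # illustrative matrix; eigenvalue 1 is repeated

evals, P = np.linalg.eig(A)           # columns of P are eigenvectors
D = np.diag(evals)
n = A.shape[0]

if np.linalg.matrix_rank(P) == n:     # n independent eigenvectors -> eigenvector basis
    print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True: A = P D P^{-1}
else:
    print("fewer than n independent eigenvectors; A is not diagonalizable")
```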
Proof:
First, observe that if $P$ is any $n \times n$ matrix with columns $\mathbf{v}_1, \dots, \mathbf{v}_n$, and if $D$ is any diagonal matrix with diagonal entries $\lambda_1, \dots, \lambda_n$, then
$$AP = A[\mathbf{v}_1 \ \mathbf{v}_2 \ \cdots \ \mathbf{v}_n] = [A\mathbf{v}_1 \ A\mathbf{v}_2 \ \cdots \ A\mathbf{v}_n] \tag{1}$$
while
$$PD = P\begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} = [\lambda_1\mathbf{v}_1 \ \lambda_2\mathbf{v}_2 \ \cdots \ \lambda_n\mathbf{v}_n] \tag{2}$$
Now suppose $A$ is diagonalizable and $A = PDP^{-1}$. Then right-multiplying this relation by $P$, we have
$$AP = PD \tag{3}$$
- In this case, equations (1) and (2) imply that
$$[A\mathbf{v}_1 \ A\mathbf{v}_2 \ \cdots \ A\mathbf{v}_n] = [\lambda_1\mathbf{v}_1 \ \lambda_2\mathbf{v}_2 \ \cdots \ \lambda_n\mathbf{v}_n]$$
- Equating columns, we find that
$$A\mathbf{v}_1 = \lambda_1\mathbf{v}_1, \quad A\mathbf{v}_2 = \lambda_2\mathbf{v}_2, \quad \dots, \quad A\mathbf{v}_n = \lambda_n\mathbf{v}_n \tag{4}$$
- Since $P$ is invertible, its columns $\mathbf{v}_1, \dots, \mathbf{v}_n$ must be linearly independent.
- Also, since these columns are nonzero, the equations in (4) show that $\lambda_1, \dots, \lambda_n$ are eigenvalues and $\mathbf{v}_1, \dots, \mathbf{v}_n$ are corresponding eigenvectors.
- This argument proves the “only if” parts of the first and second statements, along with the third statement, of the theorem.
- Finally, given any $n$ eigenvectors $\mathbf{v}_1, \dots, \mathbf{v}_n$, use them to construct the columns of $P$ and use corresponding eigenvalues $\lambda_1, \dots, \lambda_n$ to construct $D$.
- By equations (1)–(3), $AP = PD$.
- This is true without any condition on the eigenvectors.
- If, in fact, the eigenvectors are linearly independent, then $P$ is invertible (by the Invertible Matrix Theorem), and $AP = PD$ implies that $A = PDP^{-1}$.
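- A small sketch of this last point (matrices of my own choosing, not from the text): $AP = PD$ holds whenever the columns of $P$ are eigenvectors, but $A = PDP^{-1}$ additionally needs $P$ to be invertible.

```python
import numpy as np

# A is defective: its eigenspace for lambda = 5 is only 1-dimensional.
A = np.array([[5.0, 1.0],
              [0.0, 5.0]])

# Both columns of P are eigenvectors for lambda = 5, but they are dependent.
P = np.array([[1.0, 2.0],
              [0.0, 0.0]])
D = np.diag([5.0, 5.0])

print(np.allclose(A @ P, P @ D))      # True: AP = PD holds regardless
print(np.linalg.det(P))               # 0.0: P is not invertible, so A != P D P^{-1}
```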
Diagonalizing Matrices
Example 3:
Diagonalize the following matrix, if possible.
$$A = \begin{bmatrix} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{bmatrix}$$
That is, find an invertible matrix $P$ and a diagonal matrix $D$ such that $A = PDP^{-1}$.
Solution:
There are four steps to implement the description in Theorem 5.
- Step 1. Find the eigenvalues of $A$.
- Here, the characteristic equation turns out to involve a cubic polynomial that can be factored:
$$0 = \det(A - \lambda I) = -\lambda^3 - 3\lambda^2 + 4 = -(\lambda - 1)(\lambda + 2)^2$$
- The eigenvalues are $\lambda = 1$ and $\lambda = -2$.
- Step 2. Find three linearly independent eigenvectors of $A$.
- Three vectors are needed because $A$ is a $3 \times 3$ matrix.
- This is a critical step.
- If it fails, then Theorem 5 says that $A$ cannot be diagonalized.
- Basis for $\lambda = 1$: $\mathbf{v}_1 = \begin{bmatrix} 1 \\ -1 \\ 1 \end{bmatrix}$
- Basis for $\lambda = -2$: $\mathbf{v}_2 = \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix}$, $\mathbf{v}_3 = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}$
- You can check that $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ is a linearly independent set.
- Step 3. Construct $P$ from the vectors in step 2.
- The order of the vectors is unimportant.
- Using the order chosen in step 2, form
$$P = [\mathbf{v}_1 \ \mathbf{v}_2 \ \mathbf{v}_3] = \begin{bmatrix} 1 & -1 & -1 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix}$$
- Step 4. Construct D from the corresponding eigenvalues.
- In this step, it is essential that the order of the eigenvalues matches the order chosen for the columns of $P$.
- Use the eigenvalue $-2$ twice, once for each of the eigenvectors corresponding to $\lambda = -2$:
$$D = \begin{bmatrix} 1 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & -2 \end{bmatrix}$$
- To avoid computing $P^{-1}$, simply verify that $AP = PD$.
- Compute
$$AP = \begin{bmatrix} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{bmatrix}\begin{bmatrix} 1 & -1 & -1 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 2 & 2 \\ -1 & -2 & 0 \\ 1 & 0 & -2 \end{bmatrix}$$
$$PD = \begin{bmatrix} 1 & -1 & -1 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix}\begin{bmatrix} 1 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & -2 \end{bmatrix} = \begin{bmatrix} 1 & 2 & 2 \\ -1 & -2 & 0 \\ 1 & 0 & -2 \end{bmatrix}$$
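- The same four steps can be run numerically; a quick numpy sketch using the matrix above (not the hand computation from the text):

```python
import numpy as np

# Steps 1-2: eigenvalues and eigenvectors of the Example 3 matrix.
A = np.array([[1.0, 3.0, 3.0],
              [-3.0, -5.0, -3.0],
              [3.0, 3.0, 1.0]])

evals, P = np.linalg.eig(A)           # evals ~ [1, -2, -2] up to ordering/rounding

# Steps 3-4: the columns of P are the eigenvectors; D holds the matching eigenvalues.
D = np.diag(evals)

# Verification: AP = PD, and since P is invertible, A = P D P^{-1}.
print(np.allclose(A @ P, P @ D))                     # True
print(np.allclose(A, P @ D @ np.linalg.inv(P)))      # True
```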
Theorem 6:
An $n \times n$ matrix with $n$ distinct eigenvalues is diagonalizable.
Proof:
Let $\mathbf{v}_1, \dots, \mathbf{v}_n$ be eigenvectors corresponding to the $n$ distinct eigenvalues of an $n \times n$ matrix $A$. Then $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ is linearly independent, by Theorem 2, so $A$ is diagonalizable by Theorem 5.
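- A quick numerical illustration of Theorem 6 (an illustrative matrix of my own, not from the text):

```python
import numpy as np

# A triangular matrix has its eigenvalues on the diagonal; here they are
# 2, 3, 5, which are distinct, so Theorem 6 applies.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

evals, P = np.linalg.eig(A)
print(len(set(np.round(evals, 9))) == A.shape[0])              # True: distinct
print(np.allclose(A, P @ np.diag(evals) @ np.linalg.inv(P)))   # True: diagonalizable
```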
Matrices Whose Eigenvalues Are Not Distinct
- It is not necessary for an $n \times n$ matrix to have $n$ distinct eigenvalues in order to be diagonalizable.
- The matrix in Example 3 is diagonalizable even though it has only two distinct eigenvalues.
- If an $n \times n$ matrix $A$ has $n$ distinct eigenvalues, with corresponding eigenvectors $\mathbf{v}_1, \dots, \mathbf{v}_n$, and if $P = [\mathbf{v}_1 \ \mathbf{v}_2 \ \cdots \ \mathbf{v}_n]$, then $P$ is automatically invertible because its columns are linearly independent, by Theorem 2.
- When $A$ is diagonalizable but has fewer than $n$ distinct eigenvalues, it is still possible to build $P$ in a way that makes $P$ automatically invertible, as the next theorem shows.
Theorem 7:
Let $A$ be an $n \times n$ matrix whose distinct eigenvalues are $\lambda_1, \dots, \lambda_p$.
- For $1 \le k \le p$, the dimension of the eigenspace for $\lambda_k$ is less than or equal to the multiplicity of the eigenvalue $\lambda_k$.
- The matrix $A$ is diagonalizable if and only if the sum of the dimensions of the eigenspaces equals $n$, and this happens if and only if (i) the characteristic polynomial factors completely into linear factors and (ii) the dimension of the eigenspace for each $\lambda_k$ equals the multiplicity of $\lambda_k$.
- If $A$ is diagonalizable and $\mathcal{B}_k$ is a basis for the eigenspace corresponding to $\lambda_k$ for each $k$, then the total collection of vectors in the sets $\mathcal{B}_1, \dots, \mathcal{B}_p$ forms an eigenvector basis for $\mathbb{R}^n$.
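- A sketch of the multiplicity check in Theorem 7 (using the Example 3 matrix from above; the comparison logic is my own):

```python
import numpy as np
from collections import Counter

# Compare algebraic multiplicity (how often a root of the characteristic
# polynomial repeats) with geometric multiplicity (eigenspace dimension,
# i.e. the nullity of A - lambda*I).
A = np.array([[1.0, 3.0, 3.0],
              [-3.0, -5.0, -3.0],
              [3.0, 3.0, 1.0]])

n = A.shape[0]
alg_mult = Counter(np.round(np.linalg.eigvals(A), 8))   # group numerically equal roots

total_dim = 0
for lam, m in alg_mult.items():
    geo = n - np.linalg.matrix_rank(A - lam * np.eye(n))   # eigenspace dimension
    print(f"lambda = {lam}: algebraic multiplicity {m}, eigenspace dimension {geo}")
    total_dim += geo

print("diagonalizable:", total_dim == n)   # True for this matrix
```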