We have seen that a linear map $L$ of a finite-dimensional real or complex vector space $V$ to itself is not always diagonalizable, even if $V$ is complex. The problem is that the dimension of the eigenspace of $L$ with respect to a root $\lambda$ of the characteristic polynomial $p_L(x)$ can be smaller than the multiplicity of that root in $p_L(x)$. Consider for example $V = \mathbb{R}^2$ and $L = L_A$ for
$$A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$$
with characteristic polynomial $x^2$. The root $0$ has multiplicity $2$, and the dimension of the eigenspace of $L$ at eigenvalue $0$ is $1$.
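As a quick check of this phenomenon, the following sketch (sympy assumed available) uses the standard nilpotent $2 \times 2$ matrix $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, whose characteristic polynomial is $x^2$:

```python
import sympy as sp

x = sp.symbols('x')
# Standard nilpotent 2x2 matrix with characteristic polynomial x^2.
A = sp.Matrix([[0, 1], [0, 0]])

char_poly = A.charpoly(x).as_expr()            # x**2: the root 0 has multiplicity 2
eigenspace_dim = len(A.nullspace())            # dim ker(A - 0*I) = 1
gen_eigenspace_dim = len((A**2).nullspace())   # dim ker((A - 0*I)^2) = 2
print(char_poly, eigenspace_dim, gen_eigenspace_dim)
```

The eigenspace at $0$ is one-dimensional even though the root has multiplicity two; the kernel of $A^2$ already recovers the missing dimension.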
We will treat this second problem in two steps. First we point out an invariant subspace of $V$ that can be bigger than the eigenspace at a given root $\lambda$ of the characteristic polynomial, but on which the restriction of $L$ has a characteristic polynomial that is a power of $x - \lambda$. Later we will indicate a basis for that subspace with respect to which the restriction of $L$ approximates a diagonal form.
Let $V$ be a vector space, $L : V \to V$ a linear map, and $\lambda$ an eigenvalue of $L$. The generalized eigenspace of $L$ with respect to $\lambda$ is the subspace $G_\lambda$ consisting of all vectors $v$ of $V$ for which there exists a natural number $k$ that satisfies $(L - \lambda I)^k v = 0$. This subspace is invariant under $L$.
To prove that the generalized eigenspace $G_\lambda$ is indeed a linear subspace of $V$, we first notice that $0$ belongs to $G_\lambda$. Next, if $v$ lies in $G_\lambda$, then according to the definition there is an integer $k$ that satisfies $(L - \lambda I)^k v = 0$. If $c$ is a scalar, then we have $(L - \lambda I)^k (c v) = c\, (L - \lambda I)^k v = 0$, showing that $c v$ belongs to $G_\lambda$. Finally, let $w$ also be a vector in $G_\lambda$. Then there must be a natural number $l$ satisfying $(L - \lambda I)^l w = 0$. We now have
$$(L - \lambda I)^m (v + w) = (L - \lambda I)^m v + (L - \lambda I)^m w = 0 \quad \text{for } m = \max(k, l),$$
from which it follows that $v + w$ belongs to $G_\lambda$.
With this we have proven that $G_\lambda$ is a linear subspace of $V$.
To prove the invariance of the generalized eigenspace $G_\lambda$ under $L$, we let $v$ be an arbitrary vector in $G_\lambda$, and prove that $L(v)$ also belongs to $G_\lambda$. From the definition of $G_\lambda$ it follows that there exists a natural number $k$ such that $(L - \lambda I)^k v = 0$. Hence, $v$ belongs to $\ker\!\left((L - \lambda I)^k\right)$. Because $(L - \lambda I)^k$ commutes with $L$, we can apply the theorem Invariance of kernel and image under commuting linear maps to see that $\ker\!\left((L - \lambda I)^k\right)$ is invariant under $L$, showing that $L(v)$ belongs to $\ker\!\left((L - \lambda I)^k\right) \subseteq G_\lambda$, which is what we needed to prove.
If $n \geq 2$ and $L = L_A$ for the $n \times n$ matrix $A$ with $1$s on the superdiagonal and $0$s elsewhere, then $0$ is the only root of the characteristic polynomial $p_A(x) = x^n$. The eigenspace of $L$ for eigenvalue $0$ is spanned by the standard basis vector $e_1$ and is therefore a proper subspace of $\mathbb{R}^n$, but the generalized eigenspace of $L$ at $0$ coincides with $\mathbb{R}^n$.
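The growth of these kernels can be illustrated concretely; the sketch below (a hypothetical instance with $n = 4$, sympy assumed) shows the kernels of the powers of such a matrix gaining one dimension at each step:

```python
import sympy as sp

n = 4
# n x n matrix with 1s on the superdiagonal and 0s elsewhere.
A = sp.Matrix(n, n, lambda i, j: 1 if j == i + 1 else 0)

# dim ker(A^j) for j = 1, ..., n: each power gains one dimension,
# until ker(A^n) is the whole space R^n.
dims = [len((A**j).nullspace()) for j in range(1, n + 1)]
print(dims)  # [1, 2, 3, 4]
```

The eigenspace (the $j = 1$ entry) is one-dimensional, while the generalized eigenspace at $0$ is all of $\mathbb{R}^4$.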
In the finite-dimensional case, the dimension of the generalized eigenspace at $\lambda$ is equal to the multiplicity of $\lambda$ in the characteristic polynomial of $L$:
Assume that $V$ is a finite-dimensional vector space and that $L : V \to V$ is a linear map such that $\lambda$ is a root of the characteristic polynomial $p_L(x)$.
- If $\lambda$ has multiplicity $k$ in the minimal polynomial $m_L(x)$, then $G_\lambda = \ker\!\left((L - \lambda I)^k\right)$.
- If $\lambda$ has multiplicity $m$ in $p_L(x)$, then $G_\lambda = \ker\!\left((L - \lambda I)^m\right)$ and we have $\dim(G_\lambda) = m$.
- Write $d_j = \dim\!\left(\ker\!\left((L - \lambda I)^j\right)\right)$, so that $d_0 = 0$. The sequence $d_0, d_1, \ldots, d_k$ is strictly increasing.
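All three statements can be observed on a small example. The sketch below (sympy assumed) uses a hypothetical $4 \times 4$ matrix whose eigenvalue $3$ has multiplicity $k = 2$ in the minimal polynomial and $m = 3$ in the characteristic polynomial:

```python
import sympy as sp

# Hypothetical matrix: Jordan blocks of sizes 2 and 1 at eigenvalue 3,
# plus a simple eigenvalue 5. For lam = 3: k = 2, m = 3.
A = sp.Matrix([
    [3, 1, 0, 0],
    [0, 3, 0, 0],
    [0, 0, 3, 0],
    [0, 0, 0, 5],
])
lam = 3
I4 = sp.eye(4)

# d_j = dim ker((A - lam*I)^j): strictly increasing until j = k, then constant.
d = [len(((A - lam * I4) ** j).nullspace()) for j in range(5)]
print(d)  # [0, 2, 3, 3, 3]
```

The sequence stabilizes at $j = 2$ (the multiplicity in the minimal polynomial) with value $3$, which is both the dimension of the generalized eigenspace and the multiplicity of $3$ in the characteristic polynomial.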
Let $k$ be the multiplicity of $\lambda$ in the minimal polynomial $m_L(x)$. From the definition of generalized eigenspace it is clear that $\ker\!\left((L - \lambda I)^k\right)$ is contained in $G_\lambda$. Suppose that $v$ is a vector in $G_\lambda$. Then there exists a natural number $l$ such that $v$ belongs to $\ker\!\left((L - \lambda I)^l\right)$. We will show that we can choose $l = k$. Assume $l > k$. Because $k$ is the multiplicity of $\lambda$ in $m_L(x)$, we can factor the minimal polynomial as
$$m_L(x) = (x - \lambda)^k \cdot q(x),$$
where $q(x)$ is a polynomial with $q(\lambda) \neq 0$. In particular we have
$$\gcd\!\left((x - \lambda)^l, q(x)\right) = 1.$$
The extended Euclidean algorithm gives polynomials $a(x)$ and $b(x)$ that satisfy
$$a(x)\,(x - \lambda)^l + b(x)\,q(x) = 1.$$
If we substitute $L$ in all polynomials of this equality, we find, thanks to the fact that substitution of $L$ respects sums and products of polynomials,
$$a(L)\,(L - \lambda I)^l + b(L)\,q(L) = I,$$
so that from $(L - \lambda I)^l v = 0$ follows
$$(L - \lambda I)^k v = a(L)\,(L - \lambda I)^k (L - \lambda I)^l v + b(L)\,q(L)\,(L - \lambda I)^k v = b(L)\,m_L(L)\,v = 0,$$
showing that $v$ belongs to $\ker\!\left((L - \lambda I)^k\right)$, so that $G_\lambda$ coincides with $\ker\!\left((L - \lambda I)^k\right)$. This proves the first statement.
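The extended Euclidean step can be carried out explicitly; the sketch below (sympy assumed) uses hypothetical data $\lambda = 2$, $l = 3$, $q(x) = x - 5$:

```python
import sympy as sp

x = sp.symbols('x')
f = (x - 2)**3   # (x - lambda)^l with lambda = 2, l = 3
q = x - 5        # q(x) with q(2) = -3 != 0

# Extended Euclidean algorithm: a*f + b*q = gcd(f, q) = 1.
a, b, g = sp.gcdex(f, q, x)
print(g)                      # 1, since f and q are coprime
print(sp.expand(a*f + b*q))   # 1
```

Substituting a matrix (or a linear map) for $x$ in this identity is exactly the step used in the proof.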
Since $k \leq m$, the equality $G_\lambda = \ker\!\left((L - \lambda I)^m\right)$ in the second statement follows directly from the first. We will also show that $\dim(G_\lambda) = m$. Reasoning for $(x - \lambda)^k$ and $q(x)$ along the same lines as above for $(x - \lambda)^l$ and $q(x)$, we find polynomials $a(x)$ and $b(x)$ in such a way that
$$a(x)\,(x - \lambda)^k + b(x)\,q(x) = 1.$$
According to Invariant direct sum we have the following direct sum decomposition of $V$ into subspaces that are invariant under $L$:
$$V = \ker\!\left((L - \lambda I)^k\right) \oplus \ker\!\left(q(L)\right).$$
We have already seen that the first summand is equal to $G_\lambda$. According to the theorem determinants of some special matrices, the characteristic polynomial $p_L(x)$ is the product of the characteristic polynomials of $L$ restricted to each summand. The characteristic polynomial of $L$ restricted to $G_\lambda$ is of the form $(x - \lambda)^d$, where $d = \dim(G_\lambda)$, because the minimal polynomial of this restriction is a divisor of $(x - \lambda)^k$. On the other hand, $x - \lambda$ is no divisor of the characteristic polynomial of $L$ restricted to $\ker\!\left(q(L)\right)$, since the corresponding minimal polynomial divides $q(x)$ and $q(\lambda) \neq 0$. We conclude that $d = m$, so that $\dim(G_\lambda) = m$.
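The decomposition can be verified numerically; the following sketch (sympy assumed) uses a hypothetical $4 \times 4$ matrix with minimal polynomial $(x - 3)^2 (x - 5)$, so $k = 2$ and $q(x) = x - 5$:

```python
import sympy as sp

# Hypothetical matrix with minimal polynomial (x - 3)^2 (x - 5).
A = sp.Matrix([
    [3, 1, 0, 0],
    [0, 3, 0, 0],
    [0, 0, 3, 0],
    [0, 0, 0, 5],
])
I4 = sp.eye(4)

K1 = ((A - 3 * I4) ** 2).nullspace()  # ker((A - 3I)^2): generalized eigenspace at 3
K2 = (A - 5 * I4).nullspace()         # ker(q(A)) with q(x) = x - 5

# Together the two invariant summands give a basis of the whole space.
basis = sp.Matrix.hstack(*(K1 + K2))
print(len(K1), len(K2), basis.rank())  # 3 1 4
```

The full rank confirms that the two kernels intersect trivially and span $\mathbb{R}^4$, and the dimension of the first summand equals the algebraic multiplicity $3$.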
Concerning the third statement: it is obvious that $d_j \leq d_{j+1}$, since $\ker\!\left((L - \lambda I)^j\right)$ is contained in $\ker\!\left((L - \lambda I)^{j+1}\right)$. To prove that $d_j < d_{j+1}$ if $j < k$, we assume that $d_j = d_{j+1}$. We claim that from this it follows that
$$\ker\!\left((L - \lambda I)^{j+i}\right) = \ker\!\left((L - \lambda I)^{j}\right) \quad \text{for each natural number } i.$$
For $i = 1$ this follows from the assumption, since the first kernel contains the second and both have the same dimension. We use full induction to prove this for all $i$. For that, assume that $\ker\!\left((L - \lambda I)^{j+i}\right) = \ker\!\left((L - \lambda I)^{j}\right)$ (this is the induction hypothesis). If $v$ lies in $\ker\!\left((L - \lambda I)^{j+i+1}\right)$, then $(L - \lambda I)^{j+1}\!\left((L - \lambda I)^{i} v\right) = 0$, so that $(L - \lambda I)^{i} v$ lies in $\ker\!\left((L - \lambda I)^{j+1}\right)$. Because $d_{j+1} = d_j$ and $\ker\!\left((L - \lambda I)^{j}\right)$ is included in $\ker\!\left((L - \lambda I)^{j+1}\right)$, we have $\ker\!\left((L - \lambda I)^{j+1}\right) = \ker\!\left((L - \lambda I)^{j}\right)$. This means that $(L - \lambda I)^{i} v$ lies in $\ker\!\left((L - \lambda I)^{j}\right)$, so that $v$ lies in $\ker\!\left((L - \lambda I)^{j+i}\right)$. But $\ker\!\left((L - \lambda I)^{j+i}\right) = \ker\!\left((L - \lambda I)^{j}\right)$, hence $v$ even lies in $\ker\!\left((L - \lambda I)^{j}\right)$. Thus, we have deduced that $\ker\!\left((L - \lambda I)^{j+i+1}\right) = \ker\!\left((L - \lambda I)^{j}\right)$, hence $d_{j+i+1} = d_j$.
From the statement we have just proven it follows that if $d_j = d_{j+1}$, then also $\ker\!\left((L - \lambda I)^{j+i}\right) = \ker\!\left((L - \lambda I)^{j}\right)$ for all $i$, so $G_\lambda = \ker\!\left((L - \lambda I)^{j}\right)$. Since $k$ is the smallest natural number that satisfies $G_\lambda = \ker\!\left((L - \lambda I)^{k}\right)$ (a smaller exponent $j < k$ would make $(x - \lambda)^j q(x)$ annihilate $L$, contradicting the minimality of $m_L(x)$), we see that $d_j = d_{j+1}$ only holds for $j \geq k$, so that $d_0 < d_1 < \cdots < d_k$, which concludes the proof of the theorem.
The strictly increasing sequence $d_0 < d_1 < \cdots < d_k$ shows that
$$\ker\left(L - \lambda I\right) \subset \ker\!\left((L - \lambda I)^{2}\right) \subset \cdots \subset \ker\!\left((L - \lambda I)^{k}\right)$$
is a sequence of ever increasing invariant subspaces. The first subspace is the eigenspace of $L$ with respect to $\lambda$; the last subspace is the generalized eigenspace $G_\lambda$. In particular, $d_k = m$, where $m$ is the exponent of $x - \lambda$ in the characteristic polynomial of $L$. The inequalities suggest a method for calculating the exponent of $x - \lambda$ in the minimal polynomial: consecutively calculate the numbers $d_j$ for $j = 1, 2, \ldots$ until $d_{j+1} = d_j$. We then have $k = j$.
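This stabilization procedure is easy to implement; a minimal sketch (sympy assumed, the function name `minpoly_exponent` is ours):

```python
import sympy as sp

def minpoly_exponent(A, lam):
    """Smallest k with dim ker((A - lam I)^(k+1)) = dim ker((A - lam I)^k),
    i.e. the multiplicity of lam in the minimal polynomial of A."""
    n = A.rows
    N = A - lam * sp.eye(n)
    d_prev, j, P = 0, 0, sp.eye(n)
    while True:
        P = P * N
        j += 1
        d_j = n - P.rank()   # d_j = dim ker(N^j) by rank-nullity
        if d_j == d_prev:    # the sequence has stabilized: k = j - 1
            return j - 1
        d_prev = d_j

# Hypothetical example: eigenvalue 3 with Jordan blocks of sizes 2 and 1,
# and a simple eigenvalue 5.
A = sp.Matrix([[3, 1, 0, 0], [0, 3, 0, 0], [0, 0, 3, 0], [0, 0, 0, 5]])
print(minpoly_exponent(A, 3), minpoly_exponent(A, 5))  # 2 1
```

Exact rational arithmetic (rather than floating point) keeps the rank computations reliable here.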
The numbers $d_j$ do not depend on the choice of a basis for $V$. In other words, for conjugate $n \times n$ matrices $A$ and $B$ the numbers $\dim\!\left(\ker\!\left((A - \lambda I)^j\right)\right)$ are equal to $\dim\!\left(\ker\!\left((B - \lambda I)^j\right)\right)$. Later we will see that this information uniquely determines the conjugation class of $A$; meaning: if for an $n \times n$ matrix $B$ the values $\dim\!\left(\ker\!\left((B - \lambda I)^j\right)\right)$ are equal to $\dim\!\left(\ker\!\left((A - \lambda I)^j\right)\right)$ for all eigenvalues $\lambda$ and all $j = 1, \ldots, n$, then $A$ and $B$ are conjugate.
The multiplicity of an eigenvalue $\lambda$ as a zero of the characteristic polynomial, that is, the above number $m$, is called the algebraic multiplicity of $\lambda$ in $p_L(x)$. The dimension of the eigenspace $\ker\left(L - \lambda I\right)$ is often called the geometric multiplicity of $\lambda$ in $L$.
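Both multiplicities are directly computable; a short sketch (sympy assumed, hypothetical matrix):

```python
import sympy as sp

x = sp.symbols('x')
# Hypothetical matrix with characteristic polynomial (x - 3)^3 (x - 5).
A = sp.Matrix([[3, 1, 0, 0], [0, 3, 0, 0], [0, 0, 3, 0], [0, 0, 0, 5]])

# Algebraic multiplicity of 3: its exponent in the characteristic polynomial.
alg = sp.roots(A.charpoly(x).as_expr(), x)[3]
# Geometric multiplicity of 3: dimension of the eigenspace ker(A - 3I).
geo = len((A - 3 * sp.eye(4)).nullspace())
print(alg, geo)  # 3 2
```

Here the geometric multiplicity is strictly smaller than the algebraic one, exactly the situation that generalized eigenspaces are designed to repair.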
Consider the matrix $A$. The characteristic polynomial of $A$ is equal to $p_A(x)$. Therefore, the eigenvalues of $A$ are $\lambda_1$ and $\lambda_2$.
Determine a basis for the generalized eigenspace of $A$ corresponding to the eigenvalue $\lambda_1$.
Since the multiplicity of $\lambda_1$ in the characteristic polynomial is equal to $2$, the generalized eigenspace coincides with $\ker\!\left((A - \lambda_1 I)^2\right)$.
The obvious method of determining $\ker\!\left((A - \lambda_1 I)^2\right)$ is as follows. By squaring $A - \lambda_1 I$ we find the matrix $(A - \lambda_1 I)^2$.
Next, we compute the kernel of this linear map by solving the system of equations $(A - \lambda_1 I)^2 v = 0$. This leads to a basis for $\ker\!\left((A - \lambda_1 I)^2\right)$.
Alternatively, we may use the theorem Invariant direct sum, according to which the requested basis is also a basis of the image of $(A - \lambda_2 I)^{m_2}$, where $m_2$ is the multiplicity of $\lambda_2$ in the characteristic polynomial. This subspace is spanned by the columns of the matrix $(A - \lambda_2 I)^{m_2}$. By thinning this spanning set we find columns that form a basis for $\ker\!\left((A - \lambda_1 I)^2\right)$.
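The two methods can be compared on a small hypothetical stand-in, here a $3 \times 3$ matrix with eigenvalue $\lambda_1 = 1$ of multiplicity $2$ and $\lambda_2 = 2$ of multiplicity $1$ (sympy assumed):

```python
import sympy as sp

# Hypothetical matrix: eigenvalue 1 with multiplicity 2, eigenvalue 2 with multiplicity 1.
A = sp.Matrix([
    [1, 1, 0],
    [0, 1, 0],
    [0, 0, 2],
])
I3 = sp.eye(3)

# Method 1: solve ((A - 1*I)^2) v = 0 directly.
basis1 = ((A - 1 * I3) ** 2).nullspace()

# Method 2 (Invariant direct sum): the same subspace is the column space
# of (A - 2*I)^1; columnspace() performs the "thinning" to independent columns.
basis2 = (A - 2 * I3).columnspace()

print(len(basis1), len(basis2))  # 2 2
```

Both computations produce a basis of the same two-dimensional generalized eigenspace at $\lambda_1 = 1$.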