Previously we saw that, if an orthogonal map leaves a subspace invariant, it also leaves a complementary subspace invariant. The same is true of unitary maps.
Let $W$ be a linear subspace of a complex finite-dimensional inner product space $V$, and let $L : V \to V$ be a unitary map that leaves $W$ invariant. Then the orthogonal complement $W^\perp$ is also invariant under $L$.
The proof of this statement is very similar to the real case.
Earlier we also saw that orthogonal maps are complex diagonalizable. This also holds for unitary maps.
Let $L$ be a unitary map on a complex finite-dimensional inner product space $V$. Then there is an orthonormal basis $\alpha$ for $V$ such that the matrix $L_\alpha$ is a diagonal matrix with only complex numbers of absolute value $1$ on the diagonal.
Suppose that $\lambda$ is a complex eigenvalue of $L$. We show that the generalized eigenspace for $\lambda$ coincides with the eigenspace $\ker(L - \lambda\,I)$. For this purpose, it is sufficient to show that $\ker\big((L - \lambda\,I)^2\big)$ is equal to $\ker(L - \lambda\,I)$. Let $v$ be a vector in $\ker\big((L - \lambda\,I)^2\big)$ and write $w = (L - \lambda\,I)v$. Then $w$ belongs to $\ker(L - \lambda\,I)$. If $w = 0$, then $v$ already lies in the eigenspace, so assume $w \ne 0$; then $w$ is an eigenvector of $L$ with eigenvalue $\lambda$, and because $\|w\| = \|Lw\| = |\lambda|\,\|w\|$, we have $|\lambda| = 1$.
We examine what it means for $L$ to preserve the inner product of $v$ and $w$: since $Lv = \lambda v + w$ and $Lw = \lambda w$,
$$\langle v, w\rangle = \langle Lv, Lw\rangle = \langle \lambda v + w, \lambda w\rangle = |\lambda|^2\,\langle v, w\rangle + \bar{\lambda}\,\langle w, w\rangle = \langle v, w\rangle + \bar{\lambda}\,\langle w, w\rangle .$$
By subtracting $\langle v, w\rangle$ from both sides, we see that $\bar{\lambda}\,\langle w, w\rangle = 0$. Because $|\lambda| = 1$, we have $\bar{\lambda} \ne 0$, so $\langle w, w\rangle = 0$. Because of the positive definiteness of the inner product, this implies $w = 0$, contradicting the assumption $w \ne 0$. We conclude that $v$ belongs to $\ker(L - \lambda\,I)$. This shows that $\ker\big((L - \lambda\,I)^2\big) = \ker(L - \lambda\,I)$. By induction with respect to the exponent $k$ we derive immediately from this that $\ker\big((L - \lambda\,I)^k\big) = \ker(L - \lambda\,I)$ for all natural numbers $k$, so the generalized eigenspace of $L$ for $\lambda$ is equal to its eigenspace.
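The absence of nontrivial Jordan blocks can be checked numerically. The following sketch (the QR-based construction of a sample unitary matrix, the seed, and the rank tolerance are our own choices, not part of the text) verifies that $\ker\big((A - \lambda I)^2\big)$ and $\ker(A - \lambda I)$ have the same dimension for every eigenvalue $\lambda$:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A, _ = np.linalg.qr(z)  # QR of a complex Gaussian matrix yields a unitary A

n = A.shape[0]
assert np.allclose(A.conj().T @ A, np.eye(n))  # sanity check: A is unitary

# For every eigenvalue lam of A, the null spaces of (A - lam*I) and
# (A - lam*I)^2 have the same dimension: there are no nontrivial
# Jordan blocks, so generalized eigenspaces equal eigenspaces.
dims = []
for lam in np.linalg.eigvals(A):
    B = A - lam * np.eye(n)
    d1 = n - np.linalg.matrix_rank(B, tol=1e-8)
    d2 = n - np.linalg.matrix_rank(B @ B, tol=1e-8)
    dims.append((d1, d2))

assert all(d1 == d2 for d1, d2 in dims)
```

The ranks are computed from singular values, so the tolerance `1e-8` is needed to treat near-zero singular values as exact zeros.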
According to the theorems Direct sum decomposition and Generalized eigenspaces, $V$ is the direct sum of the generalized eigenspaces of $L$. Because of the above, $V$ is the direct sum of the eigenspaces of $L$. The eigenspaces are perpendicular to each other: if $\lambda$ and $\mu$ are eigenvalues with eigenvectors $v$ and $w$, respectively, then we have
$$\langle v, w\rangle = \langle Lv, Lw\rangle = \langle \lambda v, \mu w\rangle = \lambda\,\bar{\mu}\,\langle v, w\rangle .$$
After multiplication by $\mu$ (which has absolute value $1$ because it is an eigenvalue of a unitary map), this leads to
$$\mu\,\langle v, w\rangle = \lambda\,\langle v, w\rangle .$$
Therefore, if $\lambda \ne \mu$, then $\langle v, w\rangle = 0$, that is, $v$ is perpendicular to $w$.
So, if we choose orthonormal bases for each of the eigenspaces, then $\alpha$, the concatenation of these orthonormal bases, is an orthonormal basis of $V$ and the matrix $L_\alpha$ is diagonal.
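The content of the theorem can be illustrated with NumPy. For a sample unitary matrix (generated here by a QR decomposition of a random complex matrix, an assumption of this sketch), the eigenvalues lie on the unit circle and the normalized eigenvectors form an orthonormal basis, so the eigenvector matrix is itself unitary:

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A, _ = np.linalg.qr(z)  # a sample 3x3 unitary matrix

lams, T = np.linalg.eig(A)

# Every eigenvalue lies on the unit circle ...
assert np.allclose(np.abs(lams), 1.0)
# ... and the normalized eigenvectors are orthonormal (the eigenvalues
# are distinct here), so T is itself unitary and diagonalizes A.
assert np.allclose(T.conj().T @ T, np.eye(3))
assert np.allclose(np.linalg.inv(T) @ A @ T, np.diag(lams))
```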
According to the theorem, eigenvectors of a unitary map with different eigenvalues are perpendicular to each other. This makes it easy to find an orthonormal basis of eigenvectors.
Whereas the orthogonal Jordan normal form may still have $2\times 2$ submatrices corresponding to rotations, the theorem tells us that the unitary Jordan (normal) form of a unitary map is always diagonal. On the diagonal are the eigenvalues, which all have absolute value $1$.
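The contrast with the real case can be made concrete: a real $2\times 2$ rotation block has no real eigenvalues (for a rotation angle that is not a multiple of $\pi$), but over $\mathbb{C}$ it diagonalizes with eigenvalues $e^{\pm i\theta}$ of absolute value $1$. A small sketch (the angle $0.7$ is an arbitrary choice):

```python
import numpy as np

theta = 0.7  # arbitrary sample angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # real 2x2 rotation block

# Over the real numbers R has no eigenvalues, but over C it is
# diagonalizable with eigenvalues exp(+i*theta) and exp(-i*theta),
# both of absolute value 1.
lams = np.linalg.eigvals(R)
assert np.allclose(np.abs(lams), 1.0)
assert np.allclose(sorted(lams, key=lambda l: l.imag),
                   [np.exp(-1j * theta), np.exp(1j * theta)])
```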
The converse is also true: suppose that $L$ is a linear map on a complex finite-dimensional inner product space $V$. If there is an orthonormal basis $\alpha$ for $V$ such that $L_\alpha$ is a diagonal matrix with only complex numbers of absolute value $1$ on the diagonal, then $L$ is unitary.
The absolute value of the determinant of a unitary matrix is equal to $1$. Indeed, the determinant of a matrix does not depend on the basis chosen for the vector space of the map determined by that matrix. Thanks to the above theorem, we may choose the basis $\alpha$ such that the matrix $L_\alpha$ of a unitary map $L$ with respect to $\alpha$ is diagonal with only numbers of absolute value $1$ on the diagonal. The matrix $L_\alpha$ is conjugate to the original matrix and therefore has the same determinant. The determinant of a diagonal matrix is equal to the product of the diagonal elements, so the absolute value of the determinant of $L$ equals $1$.
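A quick numerical illustration of this determinant fact (the sample matrix and seed are our own choices):

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
U, _ = np.linalg.qr(z)  # a sample 5x5 unitary matrix

# det(U) is the product of the eigenvalues; each has absolute value 1,
# so |det(U)| = 1 (det itself may be any point on the unit circle).
d = np.linalg.det(U)
assert np.isclose(abs(d), 1.0)
assert np.isclose(d, np.prod(np.linalg.eigvals(U)))
```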
The statements about classifications of orthogonal maps have their unitary analogues.
Let $V$ be a complex inner product space of finite dimension and suppose that $L$ and $M$ are unitary maps $V \to V$. Then the following statements about $L$ and $M$ are equivalent:
- There is a unitary map $T : V \to V$ such that $M = T\,L\,T^{-1}$.
- There is an invertible linear map $T : V \to V$ such that $M = T\,L\,T^{-1}$.
- The characteristic polynomials of $L$ and $M$ are equal.
- The eigenvalues, with multiplicities, of $L$ are the same as those of $M$.
We prove the implications according to the scheme $1 \Rightarrow 2 \Rightarrow 3 \Rightarrow 4 \Rightarrow 1$. This suffices for the proof of all equivalences.
$1 \Rightarrow 2$ is trivial: a unitary map is in particular an invertible linear map.
$2 \Rightarrow 3$ holds because it is known from The characteristic polynomial of a linear mapping that the characteristic polynomial does not depend on the choice of a basis.
Statements 3 and 4 are equivalent because the characteristic polynomial is the product of the linear factors $x - \lambda$, where $\lambda$ runs over all eigenvalues (with the appropriate multiplicities).
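This relation between the characteristic polynomial and the eigenvalues can be checked numerically: expanding the product of the factors $x - \lambda$ from the computed eigenvalues reproduces the coefficients of the characteristic polynomial. The sketch below uses NumPy's `np.poly`, which accepts either a square matrix or a sequence of roots; the sample matrix and seed are our own:

```python
import numpy as np

rng = np.random.default_rng(5)
z = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(z)  # a sample unitary matrix

# The characteristic polynomial is the product of the factors (x - lambda):
# expanding that product from the eigenvalues gives the same coefficients
# as computing the characteristic polynomial of U directly.
coeffs_from_matrix = np.poly(U)
coeffs_from_roots = np.poly(np.linalg.eigvals(U))
assert np.allclose(coeffs_from_matrix, coeffs_from_roots)
```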
Suppose that statement 4 holds. From the above theorem Unitary maps are diagonalizable, it follows that there are orthonormal bases $\alpha$ and $\beta$ such that $L_\alpha$ and $M_\beta$ are diagonal matrices. These diagonal matrices for $L$ and $M$ have the same diagonal entries, namely the eigenvalues. We may assume that the bases are ordered so that $L_\alpha = M_\beta$. Put $T = \beta \circ \alpha^{-1}$, where we view $\alpha$ and $\beta$ as the invertible linear maps $\mathbb{C}^n \to V$ (with $n = \dim V$) sending the standard basis vectors to the corresponding basis vectors. Because $\alpha$ and $\beta$ are orthonormal bases, the corresponding linear transformations are unitary. Because of Properties of isometries, also $T$ is unitary. Finally, we have
$$T\,L\,T^{-1} = \beta\,\alpha^{-1}\,L\,\alpha\,\beta^{-1} = \beta\,L_\alpha\,\beta^{-1} = \beta\,M_\beta\,\beta^{-1} = M .$$
Thus statement 1 is derived from statement 4.
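The construction in this last step can be mirrored numerically: if $L$ and $M$ have the same diagonal form $D$ with respect to orthonormal eigenbases given as column matrices $P$ and $Q$, then $T = Q\,P^{*}$ is unitary and conjugates $L$ into $M$. A sketch with sample data (the matrix size, eigenvalues, helper, and seed are our own choices):

```python
import numpy as np

rng = np.random.default_rng(3)

def random_unitary(n, rng):
    # hypothetical helper: QR of a complex Gaussian matrix is unitary
    z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    q, _ = np.linalg.qr(z)
    return q

# L and M share the diagonal form D (unimodular entries) with respect
# to the orthonormal eigenbases P and Q, given as column matrices.
D = np.diag(np.exp(1j * np.array([0.3, 1.1, 2.5])))
P = random_unitary(3, rng)
Q = random_unitary(3, rng)
L = P @ D @ P.conj().T
M = Q @ D @ Q.conj().T

# T maps the eigenbasis of L onto that of M; it is unitary and M = T L T^{-1}.
T = Q @ P.conj().T
assert np.allclose(T.conj().T @ T, np.eye(3))
assert np.allclose(T @ L @ np.linalg.inv(T), M)
```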
We say that two square matrices $A$ and $B$ are unitary conjugate if there is a unitary matrix $T$ with $B = T\,A\,T^{-1}$. This defines an equivalence relation. The proof of transitivity is similar to the proof of the implication $4 \Rightarrow 1$ of the theorem.
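Transitivity can be illustrated directly: if $B = S\,A\,S^{-1}$ and $C = T\,B\,T^{-1}$ with $S$ and $T$ unitary, then $C = (TS)\,A\,(TS)^{-1}$, and the product $TS$ is again unitary. A numerical sketch (sample matrices via a QR-based helper and a fixed seed, both our own choices):

```python
import numpy as np

rng = np.random.default_rng(4)

def random_unitary(n, rng):
    # hypothetical helper: QR of a complex Gaussian matrix is unitary
    z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    q, _ = np.linalg.qr(z)
    return q

A = random_unitary(3, rng)
S = random_unitary(3, rng)
T = random_unitary(3, rng)
B = S @ A @ S.conj().T  # B is unitary conjugate to A
C = T @ B @ T.conj().T  # C is unitary conjugate to B

# Transitivity: C = (T S) A (T S)^{-1}, and the product T S is unitary.
TS = T @ S
assert np.allclose(TS.conj().T @ TS, np.eye(3))
assert np.allclose(TS @ A @ TS.conj().T, C)
```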
The matrix is unitary. Give a unitary matrix such that is diagonal.
The characteristic polynomial of is Therefore, the eigenvalues of are and . Because is unitary, it is diagonalizable, so is conjugate to In order to find the requested unitary matrix , we first determine a basis of consisting of eigenvectors of .
The eigenspace of corresponding to is found by calculating the nullspace of . A spanning vector is . Likewise, the eigenspace of corresponding to is found by calculating the nullspace of . A spanning vector is .
The two spanning vectors of the eigenspaces found are perpendicular to each other. We find an orthonormal basis of by dividing these two vectors by their lengths:
Thus, the basis is given by the following matrix whose columns are the two vectors found: The answer is not unique: the columns of the matrix may be swapped, because the order of the diagonal elements in the diagonal matrix is irrelevant; furthermore, the columns may be multiplied by $-1$ independently of each other, because the signs of the basis vectors are irrelevant.
The answer can be verified by conjugating the original matrix by the inverse of the answer and checking whether the result is the diagonal matrix. We carry out this check for the answer :
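Since the specific matrix of this example is not reproduced above, the same verification can be illustrated with an analogous sample. Here we take the unitary rotation matrix over $\pi/4$ (our own stand-in, not the matrix of the example), diagonalize it, and check that conjugating by the inverse of the eigenvector matrix yields the diagonal matrix of eigenvalues:

```python
import numpy as np

# Stand-in unitary matrix: rotation over pi/4 (the matrix of the
# example itself is not reproduced here).
A = np.array([[1, -1],
              [1,  1]]) / np.sqrt(2)
assert np.allclose(A.conj().T @ A, np.eye(2))  # A is unitary

lams, T = np.linalg.eig(A)
D = np.diag(lams)

# T has orthonormal eigenvector columns, hence is unitary, and
# conjugating A by the inverse of T yields the diagonal matrix D.
assert np.allclose(T.conj().T @ T, np.eye(2))
assert np.allclose(np.linalg.inv(T) @ A @ T, D)
assert np.allclose(np.abs(lams), 1.0)
```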