Let $V$ be a finite-dimensional vector space and $L : V \to V$ a linear map. According to *The matrix of a linear map*, for each choice of a basis $\alpha$ for $V$, the map $L$ is determined by its matrix $L_\alpha$.
We will look for a basis $\alpha$ such that the matrix $L_\alpha$ has a computationally simple structure. The simplest structure is that of a diagonal matrix. As we will soon see, it is not always possible to find such a basis.
The diagonal of a matrix is the sequence of its $(i, i)$-entries; that is, those entries whose column numbers equal their row numbers.
A square matrix has diagonal form (or: is a diagonal matrix) if all of its $(i, j)$-entries with $i \neq j$ are equal to zero.
The matrix of multiplication by a scalar $\lambda$ on a finite-dimensional vector space is diagonal with respect to any basis. All of its diagonal entries are equal to $\lambda$.
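As a sanity check on these definitions, here is a small Python sketch; the helper `is_diagonal` is our own illustration, not part of the text:

```python
import numpy as np

# Multiplication by the scalar 3 on a 3-dimensional space has matrix 3*I
# with respect to any basis: every diagonal entry equals 3.
A = 3 * np.eye(3)

def is_diagonal(M):
    # A square matrix is diagonal when every (i, j)-entry with i != j is zero.
    off_diagonal = M[~np.eye(M.shape[0], dtype=bool)]
    return bool(np.all(off_diagonal == 0))

print(is_diagonal(A))    # scalar matrices are diagonal
print(np.diag(A))        # the diagonal entries: all equal to 3
```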
A matrix $A$ itself need not be diagonal even when the linear map $L : x \mapsto A\,x$ has a diagonal matrix $D$ with respect to a suitable basis $\alpha$. Indeed, the coordinate map corresponding to $\alpha$ is $T^{-1}$, the inverse of the matrix $T$ whose columns are the vectors of the basis $\alpha$, so $L_\alpha = T^{-1} A\, T = D$.
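Since the concrete matrices of this example are not specified here, the conjugation step $T^{-1} A\, T$ can be sketched with a stand-in matrix $A$ whose eigenvectors are known:

```python
import numpy as np

# Stand-in example (our own numbers): A has eigenvectors (1, 1) and (1, -1)
# with eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
T = np.array([[1.0,  1.0],     # columns: the basis of eigenvectors
              [1.0, -1.0]])

# The coordinate map of this basis is T^{-1}; the matrix of x -> A x with
# respect to the new basis is the conjugate T^{-1} A T, which is diagonal.
D = np.linalg.inv(T) @ A @ T
print(np.round(D, 10))
```

The diagonal entries of `D` are exactly the eigenvalues $3$ and $1$, in the order of the basis columns of `T`.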
The following statement is easy to understand but of crucial importance:
Let $L : V \to V$ be a linear map and $\alpha = \{a_1, \ldots, a_n\}$ a basis for the vector space $V$.
The matrix $L_\alpha$ has the diagonal form
$$L_\alpha = \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}$$
if and only if
$$L(a_i) = \lambda_i\, a_i \quad \text{for } i = 1, \ldots, n.$$
The matrix $L_\alpha$ has the above diagonal form if and only if for every $i$ the $\alpha$-coordinates of $L(a_i)$ are equal to $\lambda_i\, e_i$, so if and only if $L(a_i) = \lambda_i\, a_i$.
Later we will study in greater detail when a linear map has a basis with respect to which its matrix is diagonal. Here, it makes a difference whether we work with a complex or a real vector space. In order to see this, consider the scalar multiplication by $\mathrm{i}$ on the $1$-dimensional complex vector space $\mathbb{C}$. As we discussed above for multiplication by a scalar, the complex $(1 \times 1)$-matrix of this linear map with respect to the basis $\{1\}$ is diagonal. We can also view $\mathbb{C}$ as the $2$-dimensional real vector space with basis $\{1, \mathrm{i}\}$. In this case multiplication by $\mathrm{i}$ is still a linear map. It has matrix
$$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$
with respect to the given basis. We will see below (in the example of a two-dimensional rotation) that there is no basis of the corresponding $2$-dimensional real vector space with respect to which the matrix of this linear map is a diagonal matrix.
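The action of multiplication by $\mathrm{i}$ in the basis $\{1, \mathrm{i}\}$, where $x + y\,\mathrm{i}$ has coordinates $(x, y)$, can be checked numerically: the matrix sends $(x, y)$ to $(-y, x)$, exactly as multiplication by $\mathrm{i}$ does.

```python
import numpy as np

# Multiplication by i on C, viewed as the real plane with basis {1, i}:
# the complex number x + y*i corresponds to the real vector (x, y).
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

x, y = 2.0, 5.0
z = complex(x, y)

w = 1j * z                   # multiply by i in C
v = J @ np.array([x, y])     # apply the matrix to the coordinates

print(w, v)                  # same number, written in two notations
```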
The diagonal form is related to the following notions.
Let $L : V \to V$ be a linear map. A vector $v \neq 0$ in $V$ is called an eigenvector of $L$ with eigenvalue $\lambda$ if $L(v) = \lambda\, v$.
If $A$ is an $(n \times n)$-matrix, then a vector $v$ of $\mathbb{R}^n$ (or $\mathbb{C}^n$) is called an eigenvector of $A$ if it is an eigenvector of the linear map $x \mapsto A\,x$, and a number $\lambda$ is called an eigenvalue of $A$ if it is an eigenvalue of that linear map.
Thus, an eigenvector is a vector, distinct from the zero vector, that is mapped by the linear map onto a scalar multiple of itself; the scalar involved is the corresponding eigenvalue.
Consider a matrix $A$ for which the linear map $x \mapsto A\,x$ has an eigenvector $v$ with some eigenvalue $\lambda$, while every other eigenvector of $A$ lies in the span of $v$. In particular, there is no basis of eigenvectors.
If $v$ is an eigenvector of a linear map, then so is every multiple of $v$ by a nonzero scalar. Thus, eigenvectors corresponding to a fixed linear map and a fixed eigenvalue are never unique.
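A small numerical illustration of this scaling property, with a stand-in diagonal matrix of our own choosing:

```python
import numpy as np

# A nonzero scalar multiple of an eigenvector is again an eigenvector
# for the same eigenvalue.  Stand-in matrix with eigenvalue 2 on the x-axis.
A = np.array([[2.0, 0.0],
              [0.0, 5.0]])
v = np.array([1.0, 0.0])     # eigenvector of A with eigenvalue 2

# Check A(c v) = 2 (c v) for several nonzero scalars c.
checks = [bool(np.allclose(A @ (c * v), 2.0 * (c * v))) for c in (1.0, -3.0, 0.5)]
print(checks)
```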
The above theorem can also be formulated as follows:
Let $L : V \to V$ be a linear map, where $V$ is a vector space of finite dimension, and fix a basis $\alpha$ for $V$.
The matrix $L_\alpha$ is diagonal if and only if $\alpha$ is a basis of eigenvectors of $L$. In the latter case, the eigenvalues appear on the diagonal.
Consider $\mathbb{R}^2$ with the standard dot product and the orthogonal projection onto a line through the origin spanned by a vector $a$. If $b$ is a vector perpendicular to $a$, then the vectors $a$ and $b$ are eigenvectors of the projection with eigenvalues $1$ and $0$, respectively. The matrix with respect to the basis $\{a, b\}$ is
$$\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix},$$
which is a diagonal matrix with the eigenvalues on the diagonal. Note the order of $1$ and $0$ on the diagonal: it corresponds to the order of the eigenvectors.
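A numerical sketch of this projection example, with an assumed direction vector $a = (3, 4)$ of our own choosing:

```python
import numpy as np

# Orthogonal projection of the plane onto the line spanned by a.
a = np.array([3.0, 4.0])
b = np.array([-4.0, 3.0])        # perpendicular to a

P = np.outer(a, a) / (a @ a)     # standard-basis matrix of the projection

print(np.allclose(P @ a, 1.0 * a))   # eigenvector a, eigenvalue 1
print(np.allclose(P @ b, 0.0 * b))   # eigenvector b, eigenvalue 0

# With respect to the basis {a, b}, the matrix becomes diag(1, 0).
T = np.column_stack([a, b])
print(np.round(np.linalg.inv(T) @ P @ T, 10))
```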
In $\mathbb{R}^2$, consider the rotation about the origin over an angle of $\frac{\pi}{2}$ (anti-clockwise, so $-\frac{\pi}{2}$ clockwise). The corresponding matrix with respect to the standard basis is
$$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}.$$
Not a single vector distinct from $\begin{pmatrix} 0 \\ 0 \end{pmatrix}$ is mapped to a scalar multiple of itself. After all, if $\begin{pmatrix} x \\ y \end{pmatrix}$ were an eigenvector with eigenvalue $\lambda$, then
$$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \lambda \begin{pmatrix} x \\ y \end{pmatrix}.$$
This gives the two linear equations $-y = \lambda\, x$ (for the first coordinate) and $x = \lambda\, y$ (for the second coordinate), leading to $-y = \lambda^2\, y$. If $y = 0$, then also $x = \lambda\, y = 0$, such that $\begin{pmatrix} x \\ y \end{pmatrix}$ is not an eigenvector. We can therefore assume $y \neq 0$. Now $\lambda^2 = -1$; but this is impossible for a real number $\lambda$.
This linear map has no eigenvectors, and there certainly is no basis of eigenvectors. For no choice of basis whatsoever will the matrix of this rotation be diagonal.
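The absence of real eigenvalues of the quarter-turn rotation, whose standard matrix has columns $(0, 1)$ and $(-1, 0)$, can also be confirmed numerically: the computed eigenvalues are $\pm\mathrm{i}$, both non-real.

```python
import numpy as np

# Quarter-turn rotation of the plane; its eigenvalue equation is
# lambda^2 = -1, which has no real solutions.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals = np.linalg.eigvals(R)       # complex array: i and -i
print(np.sort_complex(eigvals))
print(bool(np.all(np.abs(eigvals.imag) > 0)))   # no real eigenvalue
```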
This is immediate from the above theorem Recognition of diagonal form.
Consider the linear map $L$ on the vector space $P$ of polynomials in $x$ with degree at most $n$, given by $L(p) = \cdots$. The basis $\{1, x, x^2, \ldots, x^n\}$ for $P$ consists of eigenvectors of $L$. What are the eigenvalues of $L$ corresponding to these eigenvectors? Give your answer in the form of a list of length $n + 1$.
This follows from the following calculation, where $k = 0, 1, \ldots, n$: apply $L$ to the basis vector $x^k$ and read off the scalar $\lambda_k$ in $L(x^k) = \lambda_k\, x^k$; the eigenvalue corresponding to $x^k$ is this $\lambda_k$.
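The defining formula of the map is not stated here; as an illustration only, we take the hypothetical stand-in map $L(p) = x\,p'(x)$, for which each monomial $x^k$ is indeed an eigenvector (with eigenvalue $k$), and represent polynomials by coefficient lists:

```python
# Represent a polynomial of degree <= n by its coefficient list
# [c0, c1, ..., cn], where c_k is the coefficient of x^k.

def L(coeffs):
    # Hypothetical stand-in map p -> x * p'(x) (our assumption, not the
    # text's): the derivative sends c_k x^k to k c_k x^(k-1), and
    # multiplying by x shifts back, so x^k keeps its slot with factor k.
    return [k * c for k, c in enumerate(coeffs)]

n = 3
eigenvalues = []
for k in range(n + 1):
    monomial = [0] * (n + 1)
    monomial[k] = 1              # the basis vector x^k
    image = L(monomial)          # equals k * x^k
    eigenvalues.append(image[k]) # read off the scalar factor

print(eigenvalues)    # [0, 1, 2, 3]
```

For this stand-in map, the requested list of length $n + 1$ would be $[0, 1, \ldots, n]$.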