### Invariant subspaces of linear maps: Eigenvalues and eigenvectors

### Diagonal form

Let #V# be a finite-dimensional vector space and #L :V\rightarrow V# a linear map. According to *The matrix of a linear map*, for each choice of a basis #\alpha# for #V#, the map #L# is determined by the matrix #L_\alpha#.

We will look for a basis #\alpha# such that the matrix #L_\alpha# has a computationally simple structure. The simplest such structure is that of a diagonal matrix. As we will see, it is not always possible to find such a basis.

**Diagonal form**

The **diagonal** of a matrix is the sequence of its #(i,i)#-entries; that is, those entries whose column numbers equal their row numbers.

A square matrix #A# has **diagonal form** (or: is a **diagonal matrix**) if all of its #(i,j)#-entries with #i\neq j# are equal to zero.
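The defining property can be checked mechanically; a minimal NumPy sketch (the matrices are chosen for illustration, not taken from the text):

```python
import numpy as np

# A matrix is diagonal exactly when every (i, j)-entry with i != j is zero,
# i.e. when zeroing out its diagonal leaves only zeros behind.
def is_diagonal(M):
    return np.all(M - np.diag(np.diag(M)) == 0)

A = np.diag([2.0, -1.0, 5.0])   # diagonal by construction
B = np.array([[1, 2],
              [0, 3]])          # upper triangular, but not diagonal

print(is_diagonal(A))  # True
print(is_diagonal(B))  # False
```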

The following statement is easy to understand but of crucial importance:

**Recognition of diagonal form**

Let # L :V\rightarrow V# be a linear map and #\alpha =\basis{\vec{a}_1,\ldots ,\vec{a}_n}# a basis for the vector space #V#.

The matrix #L_\alpha# has the diagonal form

\[
L_\alpha =\left(\,\begin{array}{cccc}
\lambda_1 & 0 & \ldots & 0\\
0 & \lambda_2 & \ddots & \vdots\\
\vdots & \ddots & \ddots & 0\\
0 & \ldots & 0 & \lambda_n
\end{array}\,\right)
\]

if and only if # L( \vec{a}_i)=\lambda_i\vec{a}_i# for #i=1,\ldots ,n#.

The diagonal form is related to the following notions.

**Eigenvector and eigenvalue**

Let # L:V\rightarrow V# be a linear map. A vector #\vec{v}\neq\vec{0}# is called an **eigenvector** of # L # with **eigenvalue** #\lambda# if # L (\vec{v}) = \lambda\vec{v}#.

If #A# is an #(n\times n)#-matrix, then a vector of #\mathbb{R}^n# is called an **eigenvector** of #A# if it is an eigenvector of #L_A#, and a number is called an **eigenvalue** of #A# if it is an eigenvalue of #L_A#.

Thus, an eigenvector is a vector distinct from the zero vector that is mapped by # L # onto a scalar multiple of itself; the scalar involved is the corresponding eigenvalue.
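The defining relation #L(\vec{v})=\lambda\vec{v}# can be verified numerically; a sketch using NumPy's `numpy.linalg.eig`, where the matrix #A# is an assumed example and not taken from the text:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # example matrix; its eigenvalues are 3 and 1

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column v of `eigenvectors` is an eigenvector of A:
# it satisfies the defining relation A v = lambda v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigenvalues))
```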

The above theorem can also be formulated as follows:

**Recognition of diagonal form in terms of eigenvectors**

Let #L :V\rightarrow V# be a linear map on a vector space #V# of finite dimension #n#, and let #\alpha# be a basis for #V#.

The matrix #L_\alpha# is diagonal if and only if #\alpha# consists of eigenvectors of #L#. In that case, the eigenvalues appear on the diagonal.
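For a matrix #A#, this statement can be made concrete by a change of basis: if the columns of a matrix #P# form a basis of eigenvectors, then the matrix of #L_A# with respect to that basis is #P^{-1}AP#, a diagonal matrix. A minimal sketch, with the matrix chosen for illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])    # example matrix with eigenvalues 3 and 1

lams, P = np.linalg.eig(A)    # the columns of P form a basis of eigenvectors

# In the eigenbasis, the matrix of L_A becomes P^{-1} A P,
# which is diagonal with the eigenvalues on its diagonal.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(lams))
```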

For example, let #L# be the linear map on the vector space of polynomials of degree at most #2# defined by #L(p) = x\cdot \dfrac{\dd p}{\dd x}#. Then #\basis{1,x,x^2}# is a basis of eigenvectors of #L# with eigenvalues #0#, #1#, and #2#.

This follows from the following calculation, where #i=0,1,2#: \[L(x^i) =x\cdot \dfrac{\dd}{\dd x}\left(x^i\right) = x\cdot i\cdot x^{i-1} = i\cdot x^i\]
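The matrix of the map #L(p)=x\cdot p'(x)# with respect to the basis #\basis{1,x,x^2}# can also be built column by column; a sketch using NumPy's polynomial helpers `polyder` and `polymulx`:

```python
import numpy as np
from numpy.polynomial import polynomial as P

# L(p) = x * p'(x) on polynomials of degree at most 2,
# expressed in coordinates with respect to the basis (1, x, x^2).
def L(coeffs):
    return P.polymulx(P.polyder(coeffs))

columns = []
for i in range(3):
    e = np.zeros(3)
    e[i] = 1.0                      # coordinates of the basis vector x^i
    img = L(e)                      # coordinates of L(x^i) = i * x^i
    columns.append(np.pad(img, (0, 3 - len(img))))

L_alpha = np.column_stack(columns)
print(L_alpha)   # diagonal matrix with 0, 1, 2 on the diagonal
```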
