The following property is similar to what we have seen for orthogonal maps. It is one of the two pillars on which the diagonalizability of symmetric matrices is based.
Let #V# be a real inner product space and #L:V\rightarrow V# a symmetric linear map. If #W# is an #L#-invariant linear subspace of #V#, then #W^\perp# is also invariant under #L#.
Take an arbitrary vector #\vec{x}\in W^\perp#. We prove that #L(\vec{x})\in W^\perp# by showing that #\dotprod{L(\vec{x})}{\vec{y}}=0# for all #\vec{y}\in W#: \[\begin{array}{rcl}\dotprod{L(\vec{x})}{\vec{y}}&=&\dotprod{\vec{x}}{L(\vec{y})}\\&&\phantom{xx}\color{blue}{\text{symmetry of }L}\\&=&0\\&&\phantom{xx}\color{blue}{\vec{x}\in W^\perp\text{ and }L(\vec{y})\in W\text{ because of invariance of }W}\end{array}\]
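As a small illustration (with a matrix chosen purely as an example), take #V=\mathbb{R}^2# with the standard inner product and let #L# be the linear map given by the symmetric matrix \[B=\matrix{1 & 2 \\ 2 & 1 \\ }\] The line #W=\linspan{\left[ 1 , 1 \right] }# is invariant under #L#, since #B\,\left[ 1 , 1 \right] =\left[ 3 , 3 \right] =3\left[ 1 , 1 \right] #. In accordance with the theorem, the orthogonal complement #W^\perp=\linspan{\left[ 1 , -1 \right] }# is invariant as well: #B\,\left[ 1 , -1 \right] =\left[ -1 , 1 \right] =-\left[ 1 , -1 \right] #.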
Suppose that #V# is a finite-dimensional inner product space and # L:V\rightarrow V# is a symmetric linear map with a real number #\lambda# as its only complex eigenvalue. Then #L# is equal to the scalar multiplication #\lambda\, I_V#.
To see this, we use some results on eigenspaces. We write #W# for the kernel of #L-\lambda I_V#. This linear subspace of #V# is invariant under #L#. Assume that #W# is a proper subspace of #V#, so #W^\perp# is not the trivial subspace. By the theorem, #W^\perp# is invariant under #L#. The complex eigenvalues of the restriction of #L# to #W^\perp# are among those of #L#, so this restriction has the real number #\lambda# as an eigenvalue. In particular, #L# has an eigenvector with eigenvalue #\lambda# lying in #W^\perp#. This eigenvector then belongs not only to #W^\perp# but also to #W#, the kernel of #L-\lambda I_V#. This contradicts #W\cap W^\perp = \{\vec{0}\}# (one of the properties of the orthogonal complement). We conclude that #W=V#, so #L-\lambda I_V# is the zero map, which proves that #L =\lambda\,I_V#.
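The symmetry of #L# is essential here. For example, the non-symmetric matrix \[N=\matrix{\lambda & 1 \\ 0 & \lambda \\ }\] also has the real number #\lambda# as its only complex eigenvalue, but the corresponding linear map #\mathbb{R}^2\rightarrow\mathbb{R}^2# is not the scalar multiplication #\lambda\, I#.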
In the setting of the theorem, both the restriction of #L# to #W# and the restriction of #L# to #W^\perp# are again symmetric. Because #L# is completely determined by these two restricted maps (bases of #W# and #W^\perp# together form a basis of #V# due to properties of the orthogonal complement), we can reduce the study of symmetric linear maps to the study of such maps on generalized eigenspaces. The comment Eigenspaces shows that the generalized eigenspaces are in fact eigenspaces, so #L# is complex diagonalizable. Below we will see that #L# is even real diagonalizable.
The other pillar on which the diagonalizability of symmetric maps rests is the fact that all the complex eigenvalues of such a map are real.
Let #V# be a real inner product space with #\dim {V}\lt\infty# and let #L:V\rightarrow V# be a symmetric linear map. Then all roots of the characteristic equation of #L# are real.
Suppose that #\mu# is a non-real root of the characteristic equation. Due to the comment about 2D invariant subspaces for real linear maps, there is a two-dimensional invariant linear subspace #U# such that #\left.L\right|_U#, the restriction of #L# to #U#, is a map #U\rightarrow U# with characteristic polynomial #(x-\mu)\cdot (x-\overline{\mu})#. Choose an orthonormal basis #\alpha# for #U#. Then \[
\left(\left.L\right|_U\right)_\alpha=\matrix{
a & b\\
b & c}
\] is a symmetric matrix due to the theorem Symmetric maps and matrices. Its characteristic polynomial is equal to
\[
\left|\,\begin{array}{cc}
a-\lambda & b\\
b & c-\lambda
\end{array}\,\right|\ =\ \lambda^2-(a+c)\lambda+ac-b^2
\] The discriminant of this quadratic polynomial is #(a+c)^2-4ac+4b^2=(a-c)^2+4b^2\geq 0#. Therefore, the two roots are real. This contradicts the fact that the characteristic polynomial equals #(x-\mu)\cdot (x-\overline{\mu})#, whose roots #\mu# and #\overline{\mu}# are not real. We conclude that all roots of the characteristic equation of #L :V\rightarrow V# are real.
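By way of example, for the symmetric matrix \[\matrix{1 & 2 \\ 2 & 4 \\ }\] we have #a=1#, #b=2#, #c=4#, so the characteristic polynomial is #\lambda^2-5\lambda# and the discriminant is #(1-4)^2+4\cdot 2^2=25\gt 0#; the roots #0# and #5# are indeed real.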
Let #V# be a finite-dimensional inner product space and # L:V\rightarrow V# a symmetric linear map. The comment on eigenspaces of the previous theorem taught us that if # L:V\rightarrow V# has a real number #\lambda# as its only complex eigenvalue, then it is equal to the scalar multiplication #\lambda\, I_V#. The current theorem tells us that #L# has real eigenvalues only. Applying the former observation to the restriction of #L# to each generalized eigenspace, we find that a symmetric linear map #L# is diagonalizable. Later we will give a proof of this fact in which it turns out that the coordinate transformation conjugating #L# to a diagonal matrix can be chosen to be orthogonal.
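To illustrate what such an orthogonal diagonalization looks like, consider again the example matrix #B=\matrix{1 & 2 \\ 2 & 1 \\ }# from above, with eigenvalues #3# and #-1#. The matrix \[Q=\frac{1}{\sqrt{2}}\matrix{1 & 1 \\ 1 & -1 \\ }\] whose columns are normalized eigenvectors of #B#, is orthogonal (so #Q^{-1}=Q^\top#) and satisfies \[Q^{-1} B\, Q=\matrix{3 & 0 \\ 0 & -1 \\ }\]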
The vector #\left[ -1 , -1 \right] # is an eigenvector of the symmetric matrix \[A=\matrix{-1 & 3 \\ 3 & -1 \\ }\] with eigenvalue #2#. Thus, the span of the vector #\left[ -1 , -1 \right] # is invariant under #A#.
The second eigenvalue of #A# is distinct from #2#.
Determine an eigenvector of #A# corresponding to this eigenvalue.
#\rv{a,b}=\left[ -1 , 1 \right] #
The orthogonal complement of #\linspan{\left[ -1 , -1 \right] }# in #\mathbb{R}^2# is #1#-dimensional and, by the theorem above, invariant under #A#, so any spanning vector of it is an eigenvector. Such a spanning vector #\rv{a,b}# can be found by solving the equation
\[{\dotprod{\left[ -1 , -1 \right] }{\rv{a,b}} =0}\] This leads to the equation #-a-b=0#. A solution is #\rv{a,b} = \left[ -1 , 1 \right] #.
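A quick check confirms the answer: \[A\,\left[ -1 , 1 \right] =\left[ 4 , -4 \right] =-4\left[ -1 , 1 \right] \] so #\left[ -1 , 1 \right] # is indeed an eigenvector of #A#, with eigenvalue #-4#.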