Orthogonal and symmetric maps: Orthogonal maps
Some properties of orthogonal maps
Here are some general properties of orthogonal maps.
Properties of orthogonal maps
Let #V# be a real inner product space and #L:V\to V# an orthogonal map. Then:
- If also #M :V\rightarrow V# is an orthogonal map, then the composition #L\,M# is orthogonal too.
- The map #L# is injective.
- If #V# is finite dimensional, then #L# is invertible and #L^{-1}# is orthogonal.
- Each real eigenvalue of #L# equals #1# or #-1# (a short verification follows this list).
- If #W# is a finite-dimensional linear subspace of #V# which is invariant under #L#, then also the orthogonal complement #W^\perp# is invariant under #L#.
- If #L# fixes the orthogonal complement of a nonzero vector #\vec{v}# of #V#, then #L# is either the identity or the orthogonal reflection #S_{\vec{v}}#.
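To see where the statement about real eigenvalues comes from, here is a short verification; it uses only the fact that an orthogonal map preserves the inner product (written here as #\left\langle \cdot ,\cdot\right\rangle#). If #L\vec{v} = \lambda\,\vec{v}# for a real number #\lambda# and a nonzero vector #\vec{v}#, then
\[\left\langle \vec{v},\vec{v}\right\rangle = \left\langle L\vec{v},L\vec{v}\right\rangle = \lambda^2\left\langle \vec{v},\vec{v}\right\rangle\]
Since #\left\langle \vec{v},\vec{v}\right\rangle \neq 0#, this forces #\lambda^2 = 1#, so #\lambda = 1# or #\lambda = -1#.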
\[A = \matrix{-{{1}\over{3}} & {{2}\over{3}} & -{{2}\over{3}} \\ {{2}\over{3}} & -{{1}\over{3}} & -{{2}\over{3}} \\ -{{2}\over{3}} & -{{2}\over{3}} & -{{1}\over{3}} \\ }\]
The vector #\rv{1,1,-1}# is fixed by #L#, so it lies in the eigenspace for the eigenvalue #1#. Because #L# is orthogonal, its only possible real eigenvalues are #1# and #-1#. The eigenspace for the eigenvalue #-1# is either #1#-dimensional or #2#-dimensional. In the first case #L# would have a non-real eigenvalue #\lambda#, and then the complex conjugate #\overline{\lambda}# would be an eigenvalue as well; together with #1# and #-1# this would give four eigenvalues (counted with multiplicity), which is impossible because the characteristic polynomial of #L# has degree #3#. Therefore, the eigenspace of #L# for the eigenvalue #-1# has dimension #2#.
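As a quick check of the first claim, a direct computation with the matrix #A# displayed above gives
\[\matrix{-{{1}\over{3}} & {{2}\over{3}} & -{{2}\over{3}} \\ {{2}\over{3}} & -{{1}\over{3}} & -{{2}\over{3}} \\ -{{2}\over{3}} & -{{2}\over{3}} & -{{1}\over{3}} \\ }\matrix{1 \\ 1 \\ -1 \\ } = \matrix{-{{1}\over{3}}+{{2}\over{3}}+{{2}\over{3}} \\ {{2}\over{3}}-{{1}\over{3}}+{{2}\over{3}} \\ -{{2}\over{3}}-{{2}\over{3}}+{{1}\over{3}} \\ } = \matrix{1 \\ 1 \\ -1 \\ }\]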
Next we need a vector perpendicular to both \( \rv{1,1,-1}\) and \(\rv{-1,0,-1}\). Such a vector can be found by solving a system of linear equations; a faster method uses the cross product:
\[\rv{1,1,-1}\times \rv{-1,0,-1} = \rv{-1,2,1}\] This is an eigenvector of #L# with eigenvalue #-1#.
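Indeed, a direct computation with the matrix #A# above confirms this:
\[\matrix{-{{1}\over{3}} & {{2}\over{3}} & -{{2}\over{3}} \\ {{2}\over{3}} & -{{1}\over{3}} & -{{2}\over{3}} \\ -{{2}\over{3}} & -{{2}\over{3}} & -{{1}\over{3}} \\ }\matrix{-1 \\ 2 \\ 1 \\ } = \matrix{{{1}\over{3}}+{{4}\over{3}}-{{2}\over{3}} \\ -{{2}\over{3}}-{{2}\over{3}}-{{2}\over{3}} \\ {{2}\over{3}}-{{4}\over{3}}-{{1}\over{3}} \\ } = \matrix{1 \\ -2 \\ -1 \\ } = -\matrix{-1 \\ 2 \\ 1 \\ }\]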
Thus the matrix #L_\beta# of #L# relative to the basis
\[\beta = \basis{\rv{1,1,-1},\rv{-1,0,-1},\rv{-1,2,1}}\] is the diagonal matrix with diagonal entries #1#, #-1#, #-1#. We conclude that the matrix of #L# (relative to the standard basis #\varepsilon#) is equal to
\[\begin{array}{rcl} L_{\varepsilon} &=& {}_\varepsilon I_\beta \,L_\beta \, {}_\beta I_\varepsilon\\
&=&\matrix{1 & -1 & -1 \\ 1 & 0 & 2 \\ -1 & -1 & 1 \\ }\,\matrix{1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \\ } \,\matrix{1 & -1 & -1 \\ 1 & 0 & 2 \\ -1 & -1 & 1 \\ }^{-1} \\
&=&\matrix{1 & 1 & 1 \\ 1 & 0 & -2 \\ -1 & 1 & -1 \\ }\, \matrix{{{1}\over{3}} & {{1}\over{3}} & -{{1}\over{3}} \\ -{{1}\over{2}} & 0 & -{{1}\over{2}} \\ -{{1}\over{6}} & {{1}\over{3}} & {{1}\over{6}} \\ } \\
&=& \matrix{-{{1}\over{3}} & {{2}\over{3}} & -{{2}\over{3}} \\ {{2}\over{3}} & -{{1}\over{3}} & -{{2}\over{3}} \\ -{{2}\over{3}} & -{{2}\over{3}} & -{{1}\over{3}} \\ }
\end{array}\]
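As a final check, the columns of this matrix have length #1# and are pairwise orthogonal, as must be the case for the matrix of an orthogonal map. For instance, for the first column,
\[\left(-{{1}\over{3}}\right)^2+\left({{2}\over{3}}\right)^2+\left(-{{2}\over{3}}\right)^2 = {{1}\over{9}}+{{4}\over{9}}+{{4}\over{9}} = 1\]
and the inner product of the first two columns is #-{{2}\over{9}}-{{2}\over{9}}+{{4}\over{9}} = 0#.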