Inner Product Spaces: Complex inner product spaces
Orthogonal complements in complex inner product spaces
The theory of orthogonality in the complex case has many similarities with the real case.
Perpendicularity
Let #V# be a complex inner product space.
- Suppose that #W# is a linear subspace of #V#. The orthogonal complement of #W# in #V# is the set
\[
W^\perp =\left\{\vec{x}\in V\mid \dotprod{\vec{x}}{\vec{w}}=0\ \text{ for all }\ \vec{w}\in W\right\}
\] This is a linear subspace of #V# which has only #\vec{0}# in common with #W#.
- If #\vec{x}# is a vector of #V# and #W# is a finite-dimensional linear subspace of #V#, then there is a unique vector #\vec{y}# in #W# such that #\vec{x}-\vec{y}# is perpendicular to #W#. We call this vector the orthogonal projection of #\vec{x}# onto #W# and often denote it by #P_W(\vec{x})#.
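As a numerical illustration of the orthogonal projection and of the fact that #\vec{x}-P_W(\vec{x})# is perpendicular to #W#, here is a minimal sketch with NumPy, using the standard inner product on #\mathbb{C}^3#; the subspace and the vectors are illustrative choices, not taken from the text:

```python
import numpy as np

def inner(x, y):
    # Complex inner product <x, y> = sum_i x_i * conj(y_i),
    # linear in the first argument (the convention used in this text).
    return np.sum(np.asarray(x) * np.conj(y))

def project(x, basis):
    # Orthogonal projection of x onto W = span(basis),
    # assuming `basis` is an orthonormal list of vectors.
    return sum(inner(x, a) * a for a in basis)

# An orthonormal basis of a subspace W of C^3 (illustrative choice)
a1 = np.array([1, 0, 0], dtype=complex)
a2 = np.array([0, 1, 1j]) / np.sqrt(2)
x = np.array([1, 2, 3j])

p = project(x, [a1, a2])
# x - P_W(x) is perpendicular to every basis vector of W:
print(np.isclose(inner(x - p, a1), 0), np.isclose(inner(x - p, a2), 0))
# → True True
```

Note the conjugation on the second argument: without it, the check would fail for complex vectors.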
We briefly mention some of the results and properties regarding the orthogonal projection and perpendicular vectors. The definition of an orthonormal basis is similar to that in the real case; later we will go into this in greater detail.
Let #V# be a complex inner product space and let #W# be a linear subspace of #V#. Suppose that #\basis{\vec{a}_1, \ldots ,\vec{a}_k}# is an orthonormal basis of #W#, where #k# is a natural number. Then the following statements hold.
- If two vectors #\vec{a}# and #\vec{b}# are perpendicular to one another, then \[\norm{\vec{a}+\vec{b}}^2 = \norm{\vec{a}}^2+\norm{\vec{b}}^2\] Unlike in the real case, this equality does not imply perpendicularity: it is equivalent to the real part of #\dotprod{\vec{a}}{\vec{b}}# being zero, as the example #\vec{a}=1#, #\vec{b}=\complexi# in #\mathbb{C}# shows.
- If the vectors #\vec{a}_1, \ldots ,\vec{a}_k# are mutually perpendicular, that is to say, #\dotprod{\vec{a}_i}{\vec{a}_j }=0# if #i\neq j#, then
\[\norm{\vec{a}_1+\cdots + \vec{a}_k }^2 =\norm{\vec{a}_1}^2 +
\cdots + \norm{\vec{a}_k}^2\]
- #\vec{x}-P_W(\vec{x})# is perpendicular to each vector from #W#.
- The orthogonal projection #P_W(\vec{x})# is given by #(\dotprod{\vec{x}}{\vec{a}_1})\vec{a}_1 + \cdots +(\dotprod{\vec{x}}{\vec{a}_k})\vec{a}_k#.
- #\norm{\vec{x}-P_W(\vec{x})}=\min_{\vec{z}\in W} \norm{\vec{x}-\vec{z}}#, that is to say, the distance from #\vec{x}# to a vector from #W# is minimal for #P_W(\vec{x})#.
- The orthogonal projection #P_W(\vec{x})# is the unique vector of #W# for which this minimum is attained.
- #\norm{P_W(\vec{x})}\leq\norm{\vec{x}}# with equality if and only if #\vec{x}=P_W(\vec{x})#.
- We have #P_W(\vec{x})=\vec{x}# if and only if #\vec{x}# belongs to #W#.
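Some of these properties can be checked numerically. The sketch below (with NumPy; the subspace and vectors are illustrative assumptions) verifies #\norm{P_W(\vec{x})}\leq\norm{\vec{x}}# and compares #\norm{\vec{x}-P_W(\vec{x})}# with the distance from #\vec{x}# to randomly sampled vectors of #W#:

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(x, y):
    # <x, y> = sum_i x_i * conj(y_i)
    return np.sum(x * np.conj(y))

# Orthonormal basis of a subspace W of C^3 (illustrative choice)
a1 = np.array([1, 0, 0], dtype=complex)
a2 = np.array([0, 1, 1j]) / np.sqrt(2)
x = np.array([1, 2, 3j])

p = inner(x, a1) * a1 + inner(x, a2) * a2   # P_W(x)

# ||P_W(x)|| <= ||x||
assert np.linalg.norm(p) <= np.linalg.norm(x)

# ||x - P_W(x)|| <= ||x - z|| for (sampled) z in W
for _ in range(1000):
    c = rng.normal(size=2) + 1j * rng.normal(size=2)
    z = c[0] * a1 + c[1] * a2
    assert np.linalg.norm(x - p) <= np.linalg.norm(x - z) + 1e-12
print("all checks passed")
```

Sampling of course only supports, and cannot prove, the minimality statement; the proof is the Pythagorean decomposition #\norm{\vec{x}-\vec{z}}^2=\norm{\vec{x}-P_W(\vec{x})}^2+\norm{P_W(\vec{x})-\vec{z}}^2#.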
Finally, we give a thinned-out version of the dimension formula as discussed previously for the real case.
Dimension formula
Let #V# be a finite-dimensional complex inner product space. For each linear subspace #W# of #V# we have \[ \dim{V}=\dim{W}+\dim{W^{\perp}}\]
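For #V=\mathbb{C}^n# the dimension formula reduces to rank-nullity, which can be checked numerically. In the sketch below (an illustrative example, not from the text), the rows of a matrix span #W#; a vector #\vec{x}# lies in #W^\perp# exactly when #\dotprod{\vec{x}}{\vec{w}}=0# for every row #\vec{w}#, i.e. when #\overline{A}\vec{x}=\vec{0}#:

```python
import numpy as np

# Rows of A span a subspace W of C^3 (illustrative vectors).
A = np.array([[1, 1j, 0],
              [0, 1, 1 - 1j]])

n = A.shape[1]                      # dim V = 3
dim_W = np.linalg.matrix_rank(A)    # dim W = rank of A
# x in W^perp  iff  conj(A) @ x = 0, so dim W^perp is the
# nullity of conj(A), which equals n - rank(A).
dim_W_perp = n - np.linalg.matrix_rank(np.conj(A))
print(dim_W + dim_W_perp == n)      # → True
```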
As an example, take #V=\mathbb{C}^2# with the standard inner product, let #W# be the linear subspace spanned by #\rv{ \complexi+1 , 1-\complexi }#, and let #\vec{x}=\rv{ 3 , -\complexi }#. In order to obtain an orthonormal basis for #W#, we normalize the vector # \rv{ \complexi+1 , 1-\complexi } #; since #\norm{\rv{ \complexi+1 , 1-\complexi }}=\sqrt{|\complexi+1|^2+|1-\complexi|^2}=\sqrt{2+2}=2#, this gives \[\frac{1}{\norm{\rv{ \complexi+1 , 1-\complexi } }} \cdot \rv{ \complexi+1 , 1-\complexi } = \frac{1}{2}\cdot \rv{ \complexi+1 , 1-\complexi } =\rv{ {{\complexi+1}\over{2}} , {{1-\complexi}\over{2}} } \]
Next, we calculate the inner product of #\vec{x} # with this vector:\[ \begin{array}{rcl}\displaystyle\dotprod{\vec{x}}{\rv{ {{\complexi+1}\over{2}} , {{1-\complexi}\over{2}} } }&=&\displaystyle\dotprod{\left[ 3 , -\complexi \right] }{ \rv{ {{\complexi+1}\over{2}} , {{1-\complexi}\over{2}} } }\\
&&\phantom{xx}\color{blue}{\text{vector }\vec{x}\text{ substituted}}\\
&=&\displaystyle (3)\cdot\overline{{{\complexi+1}\over{2}}}+(-\complexi)\cdot\overline{{{1-\complexi}\over{2}}}\\
&&\phantom{xx}\color{blue}{\text{definition of complex inner product}}\\
&=&\displaystyle 2-2\, \complexi\\
&&\phantom{xx}\color{blue}{\text{simplified}}
\end{array}\]
We now obtain the orthogonal projection by taking this inner product as the coefficient of the normalized basis vector of #W#: \[P_W(\vec{x})=(2-2\, \complexi) \cdot {\rv{ {{\complexi+1}\over{2}} , {{1-\complexi}\over{2}} } }=\left[ 2 , -2\, \complexi \right] \]
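The computation above can be verified numerically; the following sketch with NumPy uses the same vectors and the same inner-product convention:

```python
import numpy as np

def inner(x, y):
    # <x, y> = sum_i x_i * conj(y_i), as in the computation above
    return np.sum(x * np.conj(y))

v = np.array([1j + 1, 1 - 1j])   # spanning vector of W
x = np.array([3, -1j])

u = v / np.linalg.norm(v)        # normalized: norm(v) = 2
c = inner(x, u)                  # coefficient <x, u> = 2 - 2i
p = c * u                        # orthogonal projection P_W(x)
print(np.isclose(c, 2 - 2j), np.allclose(p, [2, -2j]))
# → True True
```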