An important concept in the theory of orthogonality is the orthogonal complement. We will discuss what it is and how we can construct the orthogonal complement of a linear subspace in a vector space.
Let #V# be an inner product space and let #W# be a linear subspace of #V#. The orthogonal complement of #W# is the set
\[
W^\perp =\left\{\vec{x}\in V\mid \dotprod{\vec{x}}{\vec{w}}=0\ \text{ for all }\ \vec{w}\in W\right\}
\]
Thus, the orthogonal complement of #W# is the set of all vectors in #V# that are perpendicular to every vector of #W#.
The definition of #W^\perp# also makes sense, and is also used, when #W# is merely a subset of #V# rather than a linear subspace.
In the literature, the orthogonal complement is also abbreviated to orthoplement.
Two basic examples are \[\{\vec{0}\}^\perp = V\ \text{ and }\ V^\perp =\{\vec{0}\}\]
The second equality can be useful if we want to prove that two vectors, say #\vec{x}# and #\vec{y}#, of #V# are equal. For example, if we know that #\dotprod{\vec{x}}{\vec{z}} = \dotprod{\vec{y}}{\vec{z}}# for each vector #\vec{z}# of #V#, then, moving all terms to the left and using bilinearity, we find:
\[ \dotprod{(\vec{x}-\vec{y})}{\vec{z}} =0\phantom{xx}\text{for all }\vec{z}\text{ in } V\]
This means that #\vec{x}-\vec{y}# belongs to #V^\perp = \{\vec{0}\}#, so #\vec{x}-\vec{y} = \vec{0}#. We conclude that #\vec{x}= \vec{y}#.
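This argument can be sketched numerically (the vectors below are our own illustration): if #\vec{x}\neq\vec{y}#, then #\vec{z}=\vec{x}-\vec{y}# itself is a vector whose inner products with #\vec{x}# and #\vec{y}# differ.

```python
import numpy as np

# Sketch with made-up vectors: if x != y, then z = x - y witnesses
# that the inner products with x and y cannot all agree, because
# <x - y, x - y> > 0 for a nonzero vector.
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 4.0])
z = x - y

assert not np.isclose(z @ z, 0.0)    # <x - y, x - y> > 0
assert not np.isclose(x @ z, y @ z)  # so <x, z> != <y, z>
```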
The orthogonal complement of #W# is often denoted by #W^\perp#.
Here, the symbol #\perp# depends on the definition of the inner product of #V#. If there are several inner products, we also write #W^{\perp_f}# for the orthogonal complement of #W# with respect to the inner product #f#.
The #2#-dimensional linear subspace #W# of #\mathbb{R}^3# given by the equation \[2x-3y+5z =0\]
is the orthogonal complement of #\linspan{\rv{2,-3,5}}#. The normal vector #\rv{2,-3,5}# of this plane is a spanning vector of the orthogonal complement of #W#.
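This can be checked numerically; the two spanning vectors of #W# below are one possible choice (ours), obtained by solving #2x-3y+5z=0# for #x#.

```python
import numpy as np

# Spanning vectors of the plane W: 2x - 3y + 5z = 0 (one choice of basis):
u = np.array([3.0, 2.0, 0.0])    # y = 2, z = 0 gives x = 3
v = np.array([-5.0, 0.0, 2.0])   # y = 0, z = 2 gives x = -5
n = np.array([2.0, -3.0, 5.0])   # the normal vector, spanning W-perp

assert np.isclose(n @ u, 0.0)    # n is perpendicular to u
assert np.isclose(n @ v, 0.0)    # n is perpendicular to v
```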
Here are some key properties of the orthogonal complement.
Let #W# be a linear subspace of the vector space #V#.
- #W^\perp# is a linear subspace of #V#.
- #W\cap W^\perp =\{\vec{0}\}#; that is, a linear subspace #W# and its orthogonal complement #W^\perp# only have the zero vector in common.
- If #W=\linspan{\vec{a}_1,\ldots ,\vec{a}_n} #, then
\[
W^\perp=\left\{\vec{x}\in V\mid \dotprod{\vec{a}_i}{\vec{x}}=0\ \text{ for }\ i=1,\ldots ,n\right\}
\]
1. First, #\vec{0}# belongs to #W^\perp#, since #\dotprod{\vec{0}}{\vec{w}}=0# for every #\vec{w}# in #W#.
Let #\vec{x}# and #\vec{y}# be two vectors in the orthogonal complement of #W#, let #\lambda# and #\mu# be scalars, and let #\vec{w}# be a vector of #W#. The inner product of #\lambda \vec{x} + \mu \vec{y}# and #\vec{w}# satisfies
\[\begin{array}{rcl}\dotprod{(\lambda \vec{x} + \mu \vec{y})}{\vec{w}}
&=&\lambda\cdot(\dotprod{\vec{x}}{\vec{w}})+\mu\cdot(\dotprod{\vec{y}}{\vec{w}})\\ &&\phantom{xxx}\color{blue}{\text{linearity of inner product}}\\
&=&\lambda\cdot 0 + \mu\cdot 0\\ &&\phantom{xxx}\color{blue}{\vec{x}\text{ and }\vec{y} \text{ belong to }W^\perp}\\
&=&0\end{array}\]
We conclude that #W^{\perp}# satisfies the requirements of the definition of a linear subspace of #V#.
2. Suppose that #\vec{w}# is a vector in both #W# and #W^{\perp}#. Then the inner product #\dotprod{\vec{w}}{\vec{w}}# is equal to #0#. By positive definiteness of the inner product, this implies that #\vec{w}# is the zero vector. Therefore, the second property is proved.
3. We first prove that \(W^\perp\) is contained in #\left\{\vec{x}\in V\mid \dotprod{\vec{a}_i}{\vec{x}}=0\ \text{ for }i=1,\ldots,n\right\}#. Let #\vec{x}# be a vector of the orthogonal complement of #W#. Then #\vec{x}# is perpendicular to each vector from #W#, and in particular perpendicular to each vector #\vec{a}_i# with #i=1,\ldots, n#. This shows that #\vec{x}# is a member of #\left\{\vec{x}\in V\mid \dotprod{\vec{a}_i}{\vec{x}}=0\ \text{ for }\ i=1,\ldots ,n\right\}#.
Next we prove the other inclusion. Let #\vec{x}# be a vector which is perpendicular to each individual vector #\vec{a}_i#. An arbitrary vector #\vec{w}# in #W# is of the form \(\sum_{i=1}^n\lambda_i \vec{a}_i\) for scalars #\lambda_1,\ldots,\lambda_n#. We show that #\vec{x}# is perpendicular to #\vec{w}#:
\[\begin{array}{rcl}\dotprod{\vec{x}}{\vec{w}}&=&\dotprod{\vec{x}}{(\sum_{i=1}^n\lambda_i \vec{a}_i)}\\
&&\phantom{xxx}\color{blue}{\text{expression for }\vec{w}\text{ used}}\\ &=&\sum_{i=1}^n\lambda_i\cdot (\dotprod{\vec{x}}{\vec{a}_i})\\ &&\phantom{xxx}\color{blue}{\text{linearity of inner product}}\\ &=&\sum_{i=1}^n\lambda_i\cdot 0\\ &&\phantom{xxx}\color{blue}{\vec{x}\text{ is perpendicular to each }\vec{a}_i}\\ &=&0\end{array}\] We conclude that also the third property holds.
For calculating the orthogonal complement of a span #\linspan{ \vec{a}_1, \ldots ,\vec{a}_n }#, we only need to find the vectors that are perpendicular to each of the #n# vectors #\vec{a}_1, \ldots ,\vec{a}_n#.
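In coordinates this amounts to computing a null space: collect the vectors #\vec{a}_1,\ldots,\vec{a}_n# as the rows of a matrix; the vectors perpendicular to all of them form the null space of that matrix. A minimal numerical sketch via the SVD (the helper name is ours):

```python
import numpy as np

def orthogonal_complement(A, tol=1e-10):
    # Rows of A span W; the rows of the result form an orthonormal basis
    # of W-perp, read off from the SVD as the null space of A.
    _, s, Vt = np.linalg.svd(np.atleast_2d(A))
    rank = int(np.sum(s > tol))
    return Vt[rank:]

# W = span{(1,2,-1)} in R^3; its complement is 2-dimensional:
a = np.array([1.0, 2.0, -1.0])
comp = orthogonal_complement(a)

assert comp.shape == (2, 3)
assert np.allclose(comp @ a, 0.0)   # every row is perpendicular to a
```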
The vectors in #\mathbb{R}^3# which are perpendicular to #\rv{1,2,-1}# form the orthogonal complement of the line #\ell=\linspan{\rv{1,2,-1}}#. Such a vector #\vec{v}=\rv{x,y,z}# must satisfy #\dotprod{\rv{1,2,-1}}{\vec{v}}=0# and hence #x+2y-z=0#. In other words, \[\ell^\perp=\left\{\rv{x,y,z}\mid x+2y-z=0\right\}\] This is a plane through the origin of #\mathbb{R}^3#.
Next, we determine the orthogonal complement of the plane #\ell^\perp#. First we define a parameterization of this plane. When we choose #y# and #z# as parameters, the fact that #\rv{x,y,z}# belongs to #\ell^\perp# means that #x=z-2y#, from which we derive that \[\rv{x,y,z} = z\cdot\rv{1,0,1}+y\cdot\rv{-2,1,0}\] This implies
\[
\ell^\perp= \linspan{\rv{1,0,1},\rv{-2,1,0}}
\] Because of property 3, #\left(\ell^\perp\right)^\perp# consists of all vectors #\rv{x,y,z}# which satisfy
\[
\begin{array}{rcrrr}
\dotprod{\rv{1,0,1}}{\rv{x,y,z}}& = & x & & +z=0\\
\dotprod{\rv{-2,1,0}}{\rv{x,y,z}} & = & -\,2 x & +y & =0
\end{array}
\] In terms of #x# as a parameter, the solutions to this system of linear equations are of the form #x\cdot \rv{1,2,-1}#, so
\[\left(\ell^\perp\right)^\perp=\linspan{ \rv{1,2,-1}} =\ell\] We will see below that, for every linear subspace #W# of a finite-dimensional inner product space #V#, we have \(\left(W^\perp\right)^\perp=W\). In general, the orthogonal complement of a plane through the origin of #\mathbb{R}^3# is a line through the origin, and the plane is the orthogonal complement of that line.
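A quick numerical check of this worked example, using the same vectors:

```python
import numpy as np

# Spanning vectors of the plane l-perp and the direction vector of l:
p1 = np.array([1.0, 0.0, 1.0])
p2 = np.array([-2.0, 1.0, 0.0])
d  = np.array([1.0, 2.0, -1.0])

# d solves both equations x + z = 0 and -2x + y = 0,
# so d lies in (l-perp)-perp:
assert np.isclose(p1 @ d, 0.0)
assert np.isclose(p2 @ d, 0.0)
```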
In the comment of the theorem Orthogonal projection we have seen that if #W# is an infinite-dimensional subspace of an inner product space, the orthogonal projection of a vector #\vec{x}# on #W# does not always exist. Using property 2 above, however, we can establish that there is at most one orthogonal projection: Apply the theorem Intersections of affine subspaces to \((\vec{x}+W)\cap W^\perp\). According to this theorem, the intersection is either empty (in which case there is no orthogonal projection) or of the form #\vec{c}+(W\cap W^\perp)#. According to the second property of the orthogonal complement, the intersection #W\cap W^\perp# is #\{\vec{0}\}#, so, in the second case, the intersection \((\vec{x}+W)\cap W^\perp\) coincides with \(\{\vec{c}\}\). In this second case, #\vec{y} = \vec{x}-\vec{c}# is the unique orthogonal projection of #\vec{x}# on #W#.
If #V# is a finite-dimensional vector space, then we can use the orthogonal projection and the Gram-Schmidt procedure to calculate the orthogonal complement of a subspace #W#.
Let #W# be an #m#-dimensional subspace of an #n#-dimensional vector space #V#. Suppose that #\basis{\vec{a}_1,\ldots,\vec{a}_m}# is a basis of #W# and that this basis extended by #\basis{\vec{a}_{m+1},\ldots,\vec{a}_n}# is a basis for the entire space #V#.
Then, the Gram-Schmidt procedure applied to the basis #\basis{\vec{a}_{1},\ldots,\vec{a}_n}# of #V# gives an orthonormal basis #\basis{\vec{e}_1,\ldots ,\vec{e}_n}# for #V# such that #W=\linspan{\vec{e}_1,\ldots ,\vec{e}_m}# and #W^\perp=\linspan{\vec{e}_{m+1},\ldots ,\vec{e}_n}#.
In particular, \[ \dim{V}=\dim{W}+\dim{W^{\perp}}\]
Starting from the basis #\basis{\vec{a}_{1},\ldots,\vec{a}_n}# we use the Gram-Schmidt procedure and construct an orthonormal basis #\basis{\vec{e}_1,\ldots,\vec{e}_n}# for #V#. This basis satisfies \[ \linspan{\vec{e}_1,\ldots,\vec{e}_m}=\linspan{\vec{a}_1,\ldots,\vec{a}_m}=W\]
We now show that the orthogonal complement #W^{\perp}# is given by #\linspan{\vec{e}_{m+1},\ldots,\vec{e}_n}#. Let #\vec{x}# be a vector of #V#. Since #\basis{\vec{e}_1,\ldots ,\vec{e}_n}# is an orthonormal basis, each #\vec{e}_i# for #i=m+1,\ldots,n# is perpendicular to each #\vec{e}_j# for #j=1,\ldots,m#. According to property 3 of the orthogonal complement, this implies that #\vec{e}_i# for #i=m+1,\ldots,n# belongs to #W^\perp#. Property 1 of the orthogonal complement says that #W^\perp# is a linear subspace, so #\linspan{\vec{e}_{m+1},\ldots,\vec{e}_n}# is contained in #W^\perp #.
To prove the other inclusion, we assume that #\vec{x}# is a vector of #W^\perp#. Thanks to property two of orthonormal systems we find \[\begin{array}{rcl}\vec{x}&=&\sum_{i=1}^n (\dotprod{\vec{x}}{\vec{e}_i})\vec{e}_i\\&&\phantom{xx}\color{blue}{\text{property of orthonormal systems}}\\ &=&\sum_{i=1}^m(\dotprod{\vec{x}}{\vec{e}_i})\vec{e}_i + \sum_{i=m+1}^n(\dotprod{\vec{x}}{\vec{e}_i})\vec{e}_i\\&&\phantom{xx}\color{blue}{\text{sum is split}}\\&=&\sum_{i=m+1}^n(\dotprod{\vec{x}}{\vec{e}_i})\vec{e}_i\\&&\phantom{xx}\color{blue}{\dotprod{\vec{x}}{\vec{e}_i}=0\text{ for }i=1,\ldots,m\text{ since }\vec{x}\text { in }W^\perp}\\\end{array}\]
Thus, we see that \(\vec{x}=\sum_{i=m+1}^n(\dotprod{\vec{x}}{\vec{e}_i})\vec{e}_i\) lies within the linear span #\linspan{\vec{e}_{m+1},\ldots,\vec{e}_n}#. This proves the other inclusion. We conclude that #W^\perp=\linspan{\vec{e}_{m+1},\ldots,\vec{e}_n}#.
Finally, the dimension formula for the orthoplement follows from
\[\dim{W} + \dim{W^\perp}= m+(n-m) = n = \dim{V}\]
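The construction in the proof can be sketched numerically. The Gram-Schmidt routine below is a bare-bones version (no reorthogonalization), and the extension vectors #\vec{a}_2,\vec{a}_3# are one arbitrary choice of ours.

```python
import numpy as np

def gram_schmidt(vectors):
    # Classical Gram-Schmidt on a list of independent vectors;
    # returns an orthonormal list with the same partial spans.
    basis = []
    for v in vectors:
        w = v - sum((v @ e) * e for e in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

# W = span{a1} in R^3, extended (arbitrarily) to a basis of R^3:
a1 = np.array([1.0, 2.0, -1.0])
a2 = np.array([0.0, 1.0, 0.0])
a3 = np.array([0.0, 0.0, 1.0])

e1, e2, e3 = gram_schmidt([a1, a2, a3])
# e1 spans W; e2 and e3 span W-perp, so they are perpendicular to a1:
assert np.isclose(e2 @ a1, 0.0)
assert np.isclose(e3 @ a1, 0.0)
assert np.isclose(e2 @ e3, 0.0) and np.isclose(e2 @ e2, 1.0)
```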
If #W# is a linear subspace of a finite-dimensional inner product space, then \[\left(W^\perp\right)^\perp = W\] This immediately follows from applying the theorem to #W^\perp# rather than #W#, because the basis #\basis{\vec{e}_{1},\ldots,\vec{e}_m}# of #W# augments the orthonormal basis #\basis{\vec{e}_{m+1},\ldots,\vec{e}_n}# of #W^\perp# to an orthonormal basis for #V#.
Let #W# be the #2#-dimensional linear subspace of #\mathbb{R}^3# given by \[W=\left\{\rv{x,y,z}\mid x+2y-z=0\right\}\] Then #W# has many complements, that is to say: #1#-dimensional subspaces of #\mathbb{R}^3# which, together with #W#, span the whole space. The orthogonal complement #W^\perp=\linspan{\rv{1,2,-1}}# is unique and determines #W# again uniquely by #W = \left(W^\perp\right)^\perp#. This explains the unique role of #\rv{1,2,-1}# as a normal vector (unique up to a nonzero scalar).
The dimension formula for the orthoplement can also be derived from the Dimension theorem for linear subspaces, using the fact that the intersection of #W# and # W^\perp # is trivial (that is to say: property 2 of the orthogonal complement):
\[\dim{V} = \dim{W} + \dim{W^\perp}-\dim{W\cap W^\perp}= \dim{W} + \dim{W^\perp}\]
The theorem shows that #V# is the direct sum of #W# and #W^\perp# for every linear subspace #W# of #V#.
If #W# is a plane through the origin in #\mathbb{R}^3#, then the dimension of its orthogonal complement is #1#. If #\basis{\vec{u},\vec{v}}# is a basis of #W#, then #W^\perp# is spanned by the cross product of #\vec{u}# and #\vec{v}#.
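A minimal check of this remark, reusing the plane from the earlier example (our own choice of basis):

```python
import numpy as np

# The plane W = span{u, v} from the earlier example; its complement
# is spanned by the cross product u x v:
u = np.array([1.0, 0.0, 1.0])
v = np.array([-2.0, 1.0, 0.0])
n = np.cross(u, v)

assert np.isclose(n @ u, 0.0)
assert np.isclose(n @ v, 0.0)
# n is parallel to the direction vector (1, 2, -1) of the line l:
assert np.allclose(np.cross(n, np.array([1.0, 2.0, -1.0])), 0.0)
```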
Determine an orthonormal basis for the orthogonal complement #W^{\perp}# in # \mathbb{R}^3# of the linear subspace #W# given by
\[8 x+4 y-5 z=0\]
Give your answer in the form of a list of basis vectors.
#\left\{{{1}\over{\sqrt{105}}}\cdot\rv{ 8 , 4 , -5 }\right\}#
The subspace #W# consists of all vectors # \rv{x,y,z}# of #\mathbb{R}^3# with the property \(8 x+4 y-5 z=0\); that is, #\dotprod{\rv{8,4,-5}}{\rv{x,y,z}} = 0#. This means #W = { \linspan{\rv{8,4,-5}}}^{\perp}#. As a consequence
\[ W ^\perp= \left(\linspan{\rv{8,4,-5}}^{\perp}\right)^\perp =\linspan{\rv{8,4,-5}}\] Thus, a basis is given by the vector #\rv{8,4,-5}#. It remains for us to normalize this basis vector to achieve an orthonormal basis.
\[\frac{1}{\norm{\rv{8,4,-5}}} \cdot \rv{8,4,-5} ={{1}\over{\sqrt{105}}}\cdot\rv{ 8 , 4 , -5 }\] This way we find the answer #\left\{{{1}\over{\sqrt{105}}}\cdot\rv{ 8 , 4 , -5 }\right\}#.
The solution is not unique: both #\left\{{{1}\over{\sqrt{105}}}\cdot\rv{ 8 , 4 , -5 }\right\}# and #\left\{-{{1}\over{\sqrt{105}}}\cdot\rv{ 8 , 4 , -5 }\right\}# are correct answers.
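The answer can be verified numerically; the two solution vectors of the plane equation used below are our own choice.

```python
import numpy as np

n = np.array([8.0, 4.0, -5.0])
e = n / np.linalg.norm(n)        # the normalized basis vector

assert np.isclose(np.linalg.norm(n), np.sqrt(105))
assert np.isclose(np.linalg.norm(e), 1.0)
# e is perpendicular to two independent solutions of 8x + 4y - 5z = 0:
assert np.isclose(e @ np.array([1.0, -2.0, 0.0]), 0.0)
assert np.isclose(e @ np.array([0.0, 5.0, 4.0]), 0.0)
```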