The *Dimension theorem for linear subspaces* gives a relation between the dimensions of the sum and the intersection of two linear subspaces of a given vector space. We now focus on a special situation in which two subspaces span the entire vector space and their intersection is trivial (that is, #\{\vec{0}\}#).

The sum #U+W# of the linear subspaces #U# and #W# of a vector space #V# is called **direct** if

- #U+W =V# and
- #U\cap W = \{\vec{0}\}#.

In this case, #W# is called a **complement** of #U# in #V#.

We write #V = U\oplus W# to indicate that #V# is the direct sum of #U# and #W#.

Of course, #U# is also a complement of #W# in #V#.

If #V = \mathbb{R}^2#, #U =\linspan{\rv{1,0}}# (the #x#-axis), and #W =\linspan{\rv{0,1}}# (the #y#-axis), then #V# is the direct sum of #U# and #W#.

In general, the complement of a subspace is far from unique. Consider #V = \mathbb{R}^2# and #U=\linspan{\rv{1,0}}# the #x#-axis. For any real number #a# the linear subspace #W_a=\linspan{\rv{a,1}}# is a complement of #U#. The #y#-axis, that is, the subspace #W_0#, is just one of many complements of #U# in #V#.
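This family of complements is easy to verify numerically. The following sketch (using numpy; the check itself is not part of the text) confirms that for several values of #a# the vectors #\rv{1,0}# and #\rv{a,1}# are linearly independent, so #W_a# is indeed a complement of #U#:

```python
import numpy as np

# For any real a, the vectors (1,0) and (a,1) are linearly independent,
# so W_a = span{(a,1)} is a complement of U = span{(1,0)} in R^2.
for a in [-3.0, 0.0, 0.5, 7.0]:
    M = np.array([[1.0, a],
                  [0.0, 1.0]])  # columns: basis vector of U, basis vector of W_a
    # The determinant is 1 for every a, so the two columns span R^2
    # and their spans intersect only in the zero vector.
    print(a, np.linalg.det(M))
```

Since the determinant is #1# regardless of #a#, every #W_a# works, which illustrates why the complement is far from unique.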

The vector space #V# is always the direct sum of #V# and #\{\vec{0}\}#. This trivial direct sum is not interesting. Often, a direct sum decomposition is used to reduce all kinds of questions about #V# to questions about the two subspaces in the direct sum. For example, we can construct a basis of #V# by finding a basis of #U# and a basis of #W#; see below.

The direct sum is also defined for more than two linear subspaces. For example, if #T#, #U#, and #W# are linear subspaces of #V#, then we call #V# the direct sum of #T#, #U#, and #W# if \[V = T\oplus (U+W) = U\oplus (T+W) = W\oplus(T+U)\] In particular, it is not sufficient to require that each pair among #T#, #U#, and #W# forms a direct sum.
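The warning in the last sentence can be illustrated with three distinct lines through the origin in the plane; the subspaces below are my own illustrative choice, checked with numpy column ranks:

```python
import numpy as np

# Three lines through the origin in R^2, given by a basis vector each:
T = np.array([[1.0], [0.0]])  # span{(1,0)}
U = np.array([[0.0], [1.0]])  # span{(0,1)}
W = np.array([[1.0], [1.0]])  # span{(1,1)}

# Every pair is linearly independent, so each pairwise sum is direct:
for A, B in [(T, U), (T, W), (U, W)]:
    assert np.linalg.matrix_rank(np.hstack([A, B])) == 2

# But T + (U + W) is not direct: U + W is already all of R^2,
# so T ∩ (U + W) = T ≠ {0}.  Dimensions confirm this:
# dim T + dim U + dim W = 3 > 2 = dim(T + U + W).
print(np.linalg.matrix_rank(np.hstack([T, U, W])))
```

Each pair of lines forms a direct sum, yet the three together do not, since their dimensions add up to #3# while the sum is only #2#-dimensional.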

If #V# has basis #\basis{\vec{a}_1,\ldots,\vec{a}_n}#, then \[V = \linspan{\vec{a}_1}\oplus \cdots\oplus\linspan{\vec{a}_n}\]

The following statements for linear subspaces #U# and #W# of a vector space #V# are equivalent.

- #V = U\oplus W#
- For each vector #\vec{v}# in #V# there are unique vectors #\vec{u}# in #U# and #\vec{w}# in #W# such that #\vec{v}=\vec{u}+\vec{w}#.

If #\dim{V} # is finite, then each of these statements is also equivalent to each of the following two statements:

- #\dim{U}+\dim{W} = \dim{U+W} = \dim{V}#.
- A basis of #U# together with a basis of #W# is a basis of #V#.
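The dimension criterion in the third statement lends itself to a direct numerical check. Here is a sketch using column ranks in numpy; the particular subspaces of #\mathbb{R}^3# are an illustrative choice, not taken from the text:

```python
import numpy as np

# U = span{(1,0,0),(0,1,0)} and W = span{(0,0,1)} in R^3,
# each subspace given by a matrix whose columns are a basis.
U = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
W = np.array([[0.0], [0.0], [1.0]])

dim_U = np.linalg.matrix_rank(U)
dim_W = np.linalg.matrix_rank(W)
dim_sum = np.linalg.matrix_rank(np.hstack([U, W]))  # dim(U + W)

# dim U + dim W = dim(U + W) = dim V, so R^3 = U ⊕ W.
assert dim_U + dim_W == dim_sum == 3
print(dim_U, dim_W, dim_sum)
```

The rank of the combined matrix gives #\dim{(U+W)}#, so the two equalities of the criterion can be read off directly.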

Suppose #V= U\oplus W# and let #\vec{v}# be a vector in #V#. By the definition of the sum of linear subspaces, there are vectors #\vec{u}# in #U# and #\vec{w}# in #W# such that #\vec{v}=\vec{u}+\vec{w}#. If #\vec{v} = \vec{u_1}+\vec{w_1}# for certain vectors #\vec{u_1}# in #U# and #\vec{w_1}# in #W#, then rewriting the equality gives

\[\vec{u}-\vec{u_1} = \vec{w_1}-\vec{w}\in U\cap W\]

Because #U\cap W = \{\vec{0}\}#, this implies # \vec{u}-\vec{u_1} = \vec{w_1}-\vec{w}= \vec{0}#, so # \vec{u_1}=\vec{u} # and # \vec{w_1}=\vec{w}#. Thus, the uniqueness of #\vec{u}# and #\vec{w}# has been established.

Conversely, suppose that, for each vector #\vec{v}# in #V#, there are unique vectors #\vec{u}# in #U# and #\vec{w}# in #W# such that #\vec{v}=\vec{u}+\vec{w}#. Then each vector #\vec{v}# in #V# belongs to #U+W#, so #U+W = V#. We will show that #U\cap W = \{\vec{0}\}#. Suppose, for this purpose, that #\vec{u}\in U\cap W#. Then #\vec{u}# can be written in two ways as the sum of a vector in #U# and a vector in #W#, namely

\[\begin{array}{rclcrclcr}\vec{u}&=&\vec{0}&+&\vec{u} &=& \vec{u}&+&\vec{0} \\ &&\text{in}&&\text{in}&&\text{in}&&\text{in}\\ &&U&&W&&U&&W\end{array}\]

The uniqueness condition in this case means that #\vec{u}=\vec{0}#. This proves that #U\cap W# equals #\{\vec{0}\}#.
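In a concrete case the unique decomposition can be computed by solving a linear system in the coordinates. A small sketch in numpy (the subspaces and the vector #\vec{v}# are my own example):

```python
import numpy as np

# R^2 = U ⊕ W with U = span{(1,0)} and W = span{(1,1)}
u_basis = np.array([1.0, 0.0])
w_basis = np.array([1.0, 1.0])

v = np.array([3.0, 2.0])
# Solve v = λ·u_basis + μ·w_basis for the unique coefficients (λ, μ):
coeffs = np.linalg.solve(np.column_stack([u_basis, w_basis]), v)
u = coeffs[0] * u_basis   # the component of v in U
w = coeffs[1] * w_basis   # the component of v in W

print(u, w)  # u = [1. 0.], w = [2. 2.], and indeed u + w = v
```

Because the two basis vectors are linearly independent, the coefficient matrix is invertible, which is exactly the uniqueness of the decomposition.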

For the proof of equivalence of these statements with the third statement, first assume that #\dim{U}+\dim{W} = \dim{U+W}#. Then, thanks to the *Dimension theorem for linear subspaces*, #\dim{U\cap W}=0#, which is equivalent to #U\cap W=\{\vec{0}\}#. If #\dim{U+W} = \dim{V}#, we must have #U+W= V# in view of *Property 1 of linear subspaces*. From this we conclude that the equalities #\dim{U}+\dim{W} = \dim{U+W} = \dim{V}# imply that #V# is the direct sum of #U# and #W#.

Conversely, if #V=U\oplus W#, then #\dim{U+W} = \dim{V}# because #V = U+W#, and, because of the *Dimension theorem for linear subspaces*, \[0=\dim{U\cap W} = \dim{U}+\dim{W}-\dim{U+W} = \dim{U}+\dim{W}-\dim{V}\] so #\dim{V} = \dim{U}+\dim{W}#. Thus we have derived both equalities #\dim{U+W} = \dim{V} = \dim{U}+\dim{W}#.

Finally, we deduce that the third statement is equivalent to the last statement. If a basis of #U# and a basis of #W# together form a basis of #V#, then #V# has a basis of #\dim{U}+\dim{W}# vectors, so #\dim{V} = \dim{U}+\dim{W}#. Since each vector of #V# can be written as a linear combination of such a basis, it is also a sum of a vector from #U# and a vector from #W#, so #V = U+W#, and therefore also #\dim{V} = \dim{U+W}#.

Conversely, suppose #\dim{U}+\dim{W} = \dim{U+W} = \dim{V}# and let #\basis{\vec{u_1},\ldots,\vec{u_m}}# be a basis of #U#, and #\basis{\vec{w_1},\ldots,\vec{w_n}}# a basis of #W#. Because #U+W = V#, their union #\basis{ \vec{u_1},\ldots,\vec{u_m},\vec{w_1},\ldots,\vec{w_n}}# spans the entire vector space #V#.

Suppose now that a linear combination of these vectors is equal to zero:

\[ \lambda_1\vec{u_1}+\cdots+\lambda_m\vec{u_m}+\mu_1\vec{w_1}+\cdots+\mu_n\vec{w_n} = \vec{0}\]

where #\lambda_1,\ldots,\lambda_m,\mu_1,\ldots,\mu_n# are scalars. Then \[ \lambda_1\vec{u_1}+\cdots+\lambda_m\vec{u_m}=-\mu_1\vec{w_1}-\cdots-\mu_n\vec{w_n}\] is a vector of #U\cap W#, which equals #\{\vec{0}\}#, so this vector is equal to #\vec{0}#. The left-hand side is a linear combination of vectors from a basis of #U#, so each #\lambda_i# is equal to #0#, and the right-hand side is a linear combination of vectors from a basis of #W#, so each #\mu_j# is equal to #0#. This shows that all coefficients of the linear combination are equal to #0#, which implies that the union is linearly independent. The conclusion is that #\basis{ \vec{u_1},\ldots,\vec{u_m},\vec{w_1},\ldots,\vec{w_n}}# is a basis of #V#.
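The linear independence of the union of the two bases can be checked in coordinates by computing a rank. A short numpy sketch, with subspaces of #\mathbb{R}^3# chosen for illustration:

```python
import numpy as np

# A basis of U = span{(1,0,0),(1,1,0)} and a basis of W = span{(0,0,1)}.
u_basis = [np.array([1.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0])]
w_basis = [np.array([0.0, 0.0, 1.0])]

# Stack the m + n basis vectors as columns of one matrix.
B = np.column_stack(u_basis + w_basis)

# Full column rank means the union is linearly independent,
# hence a basis of V = R^3.
print(np.linalg.matrix_rank(B))
```

The rank equals the number of columns, so the three vectors together form a basis of #\mathbb{R}^3#, exactly as the proof concludes in general.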

Let #P# be the vector space of all polynomials in #x#. Show that this vector space is the direct sum of linear subspaces of even and odd polynomials:

\[\begin{array}{rcl} U &=& \{f(x)\in P\mid f(-x) = f(x)\}\\ W &=& \{f(x)\in P \mid f(-x) = -f(x)\}\end{array}\]

**Solution.** We first show that every polynomial #f(x)# in #P# is the sum of an even and an odd polynomial. For this purpose we consider the polynomials \[f_+(x) = \frac{1}{2}(f(x)+f(-x))\phantom{xx}\text{ and }\phantom{xx} f_-(x) = \frac{1}{2}(f(x)-f(-x))\]

From #f_+(-x) = \frac{1}{2}(f(-x)+f(x)) = f_+(x)# it follows that #f_+(x)# is even, and from #f_-(-x) = \frac{1}{2}(f(-x)-f(x)) = -f_-(x)# that #f_-(x)# is odd. Since #f(x) = f_+(x)+f_-(x)#, we have shown that #P = U+W#.

To complete the proof that #P# is the direct sum of #U# and #W#, we need to show that #U\cap W# consists only of the zero vector. If #f(x)\in U\cap W#, then \[\begin{array}{rcl}f(x)& =& f(-x)\\ &&\phantom{xxx}\color{blue}{f(x)\in U}\\&=&-f(x)\\&&\phantom{xxx}\color{blue}{f(x)\in W}\end{array}\] so #2f(x) = 0#, which implies that #f(x) = 0#. The proof is complete.
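In terms of coefficient lists, the decomposition #f = f_+ + f_-# simply separates the even-degree and odd-degree coefficients, since replacing #x# by #-x# flips the sign of exactly the odd-degree terms. A small sketch (the helper `even_odd_parts` is my own name, not from the text):

```python
# Split a polynomial into its even and odd parts:
#   f_+(x) = (f(x) + f(-x))/2  and  f_-(x) = (f(x) - f(-x))/2.
# Coefficients are listed from the constant term upward.
def even_odd_parts(coeffs):
    f_plus  = [c if i % 2 == 0 else 0.0 for i, c in enumerate(coeffs)]
    f_minus = [c if i % 2 == 1 else 0.0 for i, c in enumerate(coeffs)]
    return f_plus, f_minus

# f(x) = 1 + 2x + 3x^2 + 4x^3
f = [1.0, 2.0, 3.0, 4.0]
f_plus, f_minus = even_odd_parts(f)
print(f_plus)   # [1.0, 0.0, 3.0, 0.0]  ->  1 + 3x^2   (even part)
print(f_minus)  # [0.0, 2.0, 0.0, 4.0]  ->  2x + 4x^3  (odd part)
```

Adding the two coefficient lists recovers #f#, and a polynomial that is both even and odd has all coefficients zero, mirroring the argument that #U\cap W = \{\vec{0}\}#.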