We will study the subspaces spanned by the rows and by the columns of a matrix, respectively. After that, we will take a look at their relation to systems of linear equations.
The column space was dealt with before, but we introduce it once more, in conjunction with the row space:
Consider the real #(m\times n)#-matrix #A#.
- Each row of #A# is of length #n#, so the rows belong to #\mathbb{R}^n#. The subspace of #\mathbb{R}^n# spanned by the rows is called the row space of #A#.
- Similarly, each column of #A# belongs to #\mathbb{R}^m#. The subspace of #\mathbb{R}^m# spanned by the columns is called the column space of #A#.
Sequences of #n# numbers, even when written as columns, can be regarded as elements of #\mathbb{R}^n#, and rows can be regarded as columns when convenient. Of course, we will only do this when it does not lead to confusion. For example, we write the system #A\vec{x}=\vec{b}# with #\vec{x}\in\mathbb{R}^n#, even though #\vec{x}# is a column vector.
The matrix
\[ A=\matrix{1&-1&3&7 \\ 2&1&1&5} \] has row space
\[
\linspan{\rv{1,-1,3,7},\rv{2,1,1,5}}\ \subseteq \mathbb{R}^4
\] and column space
\[
\linspan{\cv{1\\ 2},\cv{-1\\ 1},\cv{3\\ 1},\cv{7\\ 5}}\ \subseteq \mathbb{R}^2
\]
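These spaces can be explored numerically. The following sketch uses the sympy library (an assumption; any computer algebra system with row reduction would do) to compute bases of both spaces for this #A#:
```python
from sympy import Matrix

A = Matrix([[1, -1, 3, 7],
            [2,  1, 1, 5]])

row_basis = A.rowspace()       # basis vectors of the row space, in R^4
col_basis = A.columnspace()    # basis vectors of the column space, in R^2

# Both bases have two elements: the rows (and the columns) span
# a 2-dimensional space, matching rank(A) = 2.
print(len(row_basis), len(col_basis), A.rank())   # 2 2 2
```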
Previously we saw that the column space is equal to the image of the linear map #L_A# determined by #A#. Since the columns of #A^\top# span the row space of #A#, the row space of #A# is equal to the image of the linear map #L_{A^\top}# determined by #A^\top#.
In general, the row space and the column space of a matrix are different subspaces; if #m\neq n#, they even lie in different vector spaces. Still, their dimensions are equal, as we will see below.
In the theory Basis and echelon form we saw that the dimension of the row space is equal to the rank of the matrix. The rank was introduced before as the number of nonzero rows in a row echelon form of the matrix. We will write #\text{rank}(A)# for the rank of #A#.
We will now show that the rank is equal to the dimension of the column space.
For every matrix #A# the dimension of the row space is equal to the dimension of the column space. This number is equal to the rank of #A#.
The matrix product #P\,Q# of two matrices #P# and #Q#, whose sizes are such that the product is defined, is a matrix #R# whose columns are linear combinations of the columns of #P# and whose rows are linear combinations of the rows of #Q#. This is immediate from the definition of the matrix product, and is illustrated in the following example: \[
\underbrace{\matrix{ 1 & 3 & -1 \\ 2 & -2 & 5 }}_{P}\underbrace{\matrix{ 3 & 1\\ 2 & -4 \\ 6 & 2 }}_{Q}=\underbrace{\matrix{ 3 & -13\\ 32 & 20 }}_{R}\] The first column of #R# is a linear combination of the columns of #P#: \[ \matrix{1 & 3 & -1 \\ 2 & -2 & 5}\cv{3 \\ 2 \\ 6}=3\cv{1\\ 2}+2\cv{3\\ -2}+6\cv{-1\\ 5}=\cv{3\\32}\] and so is the second column of #R#: \[
\matrix{1 & 3 & -1 \\ 2 & -2 & 5}\cv{1 \\ -4 \\ 2}=\cv{1\\ 2}-4\cv{3\\ -2}+2\cv{-1\\ 5}=\cv{-13\\20}\] Furthermore, the first row of #R# is a linear combination of the rows of #Q#: \[
\matrix{1&3&-1}\matrix{ 3 & 1\\ 2 & -4 \\ 6 & 2 }=\matrix{3&1}+3\matrix{2&-4}-\matrix{6&2}=\matrix{3& -13}\] and so is the second row of #R#: \[
\matrix{2&-2&5}\matrix{3 & 1\\ 2 & -4 \\ 6 & 2}=2\matrix{3&1}-2\matrix{2&-4}+5\matrix{6&2}=\matrix{32&20}\]
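Both readings of the product can be verified numerically. A minimal sketch, assuming numpy, for the matrices #P# and #Q# above:
```python
import numpy as np

P = np.array([[1, 3, -1],
              [2, -2, 5]])
Q = np.array([[3, 1],
              [2, -4],
              [6, 2]])
R = P @ Q                      # [[3, -13], [32, 20]]

# Column 0 of R is the linear combination of the columns of P
# with weights taken from column 0 of Q:
assert np.array_equal(R[:, 0], 3*P[:, 0] + 2*P[:, 1] + 6*P[:, 2])

# Row 0 of R is the linear combination of the rows of Q
# with weights taken from row 0 of P:
assert np.array_equal(R[0, :], 1*Q[0, :] + 3*Q[1, :] - 1*Q[2, :])
```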
Now assume that the columns of a given #(m\times n)#-matrix #A# span a subspace of #\mathbb{R}^m# of dimension #k#. Let #\basis{\vec{c}_1,\ldots ,\vec{c}_k}# be a basis of this space. Collect these vectors as columns in an #(m\times k)#-matrix #C#. Each of the #n# columns of #A# is a linear combination of these #k# columns; these linear combinations can be summarized in one matrix product
\[
C\, X=A
\] in which #X# is a #(k\times n)#-matrix. Now we concentrate on the rows in this equality. The matrix equation #C\,X=A# can also be read as: every row of #A# is a linear combination of the rows of #X#. Since #X# has #k# rows, the dimension of the row space of #A# cannot be greater than #k#. Hence,
\[
\dim{{\rm row space}} \leq \dim{{\rm column space}}
\] By applying this inequality to #A^{\top}# and noting that the dimension of the row space (respectively, column space) of #A^{\top}# is equal to the dimension of the column space (respectively, row space) of #A# we also find
\[
\dim{{\rm column space}} \leq \dim {{\rm row space }}
\] The two inequalities imply that the two dimensions are equal: \(\dim{{\rm column space}} = \dim {{\rm row space}}\), which is what we wanted to prove.
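The factorization #C\,X=A# used in this proof can be computed explicitly. A sketch, assuming sympy and a small example matrix chosen here for illustration; since #C# has independent columns, #X=(C^\top C)^{-1}C^\top A# satisfies #C\,X=A#:
```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])              # example matrix of rank 2

C = Matrix.hstack(*A.columnspace())  # columns: a basis of the column space (k = 2)
X = (C.T * C).inv() * C.T * A        # k x n; solves C*X = A since C has full column rank

assert C * X == A                    # the factorization from the proof
assert X.rows == A.rank()            # X has k rows, so dim(row space) <= k
```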
Let #A# be an #(m\times n)#-matrix. In this second proof we use the equality
\[\dotprod{(A\vec{x})}{\vec{y}}= \dotprod{\vec{x}}{(A^\top\vec{y})}\]
where #\vec{x}# belongs to #\mathbb{R}^n# and #\vec{y}# belongs to #\mathbb{R}^m#; we regard both as column vectors. The standard dot product on the left-hand side is the one on #\mathbb{R}^m#, and the one on the right-hand side is the dot product on #\mathbb{R}^n#. The equality follows from the fact that a #(1\times1)#-matrix #(a)# satisfies #(a)^\top = (a)#, together with the product rule for transposed matrices, \((A\,B)^{\top}=B^{\top}A^{\top}\): both sides can be written as the matrix product \(\vec{x}^\top A^\top\vec{y}\), in which #\vec{x}^\top# is the row vector associated with #\vec{x}#, viewed as a #(1\times n)#-matrix.
In terms of the linear maps determined by #A# and #A^\top# the equality can be written as
\[\dotprod{L_A(\vec{x})}{\vec{y}}= \dotprod{\vec{x}}{L_{A^\top}(\vec{y})}\]
From this we deduce:
\[\left(\im{L_A}\right)^\perp = \ker{L_{A^\top}}\]
This can be seen as follows:
\[\begin{array}{rcl}\vec{y}\in \left(\im{L_A}\right)^\perp&\Leftrightarrow&\dotprod{L_A(\vec{x})}{\vec{y}}=0\text{ for all }\vec{x}\in\mathbb{R}^n\\ &&\phantom{xx}\color{blue}{\text{definition of im and }\perp}\\ &\Leftrightarrow&\dotprod{\vec{x}}{L_{A^\top}(\vec{y})}=0\text{ for all }\vec{x}\in\mathbb{R}^n\\&&\phantom{xx}\color{blue}{\text{see above}}\\&\Leftrightarrow&L_{A^\top}(\vec{y})=\vec{0}\\ &&\phantom{xx}\color{blue}{\text{only }\vec{0}\text{ is perpendicular to all }\vec{x}\in\mathbb{R}^n}\\ &\Leftrightarrow&\vec{y}\in\ker{L_{A^\top}}\\ &&\phantom{xx}\color{blue}{\text{definition of ker}}\\ \end{array}\]
From this equality of subspaces follows an equality of the corresponding dimensions. From this we deduce the following statement:
\[\begin{array}{rcl}\dim{\left(\im{L_A}\right)^\perp} &=&\dim{ \ker{L_{A^\top}}}\\ m-\dim{\im{L_A}} &=&m-\dim{ \im{L_{A^\top}}}\\ \dim{\im{L_A}} &=&\dim{ \im{L_{A^\top}}}\end{array}\]
The left-hand side of the second equality equals the left-hand side of the first because an orthogonal complement in #\mathbb{R}^m# has dimension #m# minus the dimension of the subspace itself; the right-hand side of the second equality equals the right-hand side of the first on account of the Rank-nullity theorem for linear maps.
Because \(\im{L_A}\) is the column space of #A# and \(\im{L_{A^\top}}\) the row space, the statement follows from the last equality.
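The equality \(\left(\im{L_A}\right)^\perp = \ker{L_{A^\top}}\) and the dimension count can be checked on a concrete matrix. A sketch, assuming sympy:
```python
from sympy import Matrix

A = Matrix([[1, 0, 3, -5],
            [1, 0, 3, -5],
            [2, 0, 6, -10]])       # m = 3, n = 4, rank 1

# Every vector in ker(L_{A^T}) is perpendicular to every column of A,
# i.e. to all of im(L_A):
for y in A.T.nullspace():
    for j in range(A.cols):
        assert y.dot(A.col(j)) == 0

# Dimension count from the proof:
# dim ker(L_{A^T}) = m - dim im(L_{A^T}) = m - rank(A)
assert len(A.T.nullspace()) == A.rows - A.rank()   # 2 == 3 - 1
```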
From the proofs we can deduce that the rank of an #(m\times n)#-matrix is at most the minimum of #m# and #n#. Moreover, the rank is equal to #0# if and only if the matrix is the zero matrix.
A nonzero #(m\times n)#-matrix #A# has rank #1# if and only if there are a column vector #P# of length #m# (that is, an #(m\times 1)#-matrix) and a row vector #Q# of length #n# (that is, a #(1\times n)#-matrix) such that #A=P\, Q#. This follows from the first proof of the theorem above.
For example, \[A=\matrix{1&2\\ 3&6} = \matrix{1\\ 3}\,\matrix{1&2}\]
More generally, the first proof of the theorem shows that #A# has rank at most #k# if and only if there are an #(m\times k)#-matrix #P# and a #(k\times n)#-matrix #Q# such that #A = P\, Q#.
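Such factorizations are easy to check numerically. A minimal sketch, assuming numpy, for the rank-#1# example above:
```python
import numpy as np

P = np.array([[1],
              [3]])               # (m x 1) column vector
Q = np.array([[1, 2]])            # (1 x n) row vector
A = P @ Q                         # [[1, 2], [3, 6]]

assert np.linalg.matrix_rank(A) == 1
assert np.array_equal(A, np.array([[1, 2], [3, 6]]))
```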
In Solvability of systems of linear equations we saw that elementary operations do not change the rank of a matrix, and therefore do not change the linear (in)dependence of the rows of that matrix. Now that we know that the dimension of the row space is equal to the dimension of the column space, we see that elementary operations also do not change the linear (in)dependence of the columns of a matrix.
Previously we concluded that the system of equations #A\vec{x}=\vec{b}# has a solution if and only if #\vec{b}# belongs to the column space of #A#, and that there is only one solution if the columns are independent. Thanks to the statement above we can conclude the following.
Let #A# be an #(m\times n)#-matrix. The system of equations \[A\vec{x}=\vec{b}\] has no more than one solution for each vector #\vec{b}# in #\mathbb{R}^m# if and only if the rank of #A# is equal to #n#. In that case, we have #m\ge n#.
If #\vec{b}# does not lie in #\im{A}#, there are no solutions. Suppose the rank of #A# equals #n#; then the #n# columns of #A# form a basis of #\im{A}#. If #\vec{b}# lies in #\im{A}#, each solution #\vec{x}# consists of the coordinates of #\vec{b}# with respect to this basis, and these coordinates are unique. Conversely, if the rank of #A# is less than #n#, the columns of #A# are linearly dependent, so the homogeneous system #A\vec{x}=\vec{0}# has a nonzero solution in addition to #\vec{0}#; hence, for #\vec{b}=\vec{0}# there is more than one solution.
In case the rank of #A# is equal to #n#, the inequality #m\ge n# follows as well: if we had #m\lt n#, the matrix #A# would have rank at most #m#, hence less than #n#, a contradiction.
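A numerical illustration of this criterion, assuming numpy; the matrix and vectors below are chosen for the example:
```python
import numpy as np

A = np.array([[1, 0],
              [0, 1],
              [1, 1]])            # m = 3, n = 2

# rank(A) = n = 2, so each b has at most one solution (and m >= n holds).
assert np.linalg.matrix_rank(A) == A.shape[1]

b = np.array([1, 2, 3])           # in the column space: b = 1*col1 + 2*col2
x, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(A @ x, b)      # solvable; the unique solution is x = (1, 2)

b2 = np.array([1, 2, 4])          # not in the column space
x2, *_ = np.linalg.lstsq(A, b2, rcond=None)
assert not np.allclose(A @ x2, b2)   # no exact solution exists
```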
Determining the rank of a matrix is straightforward: we row reduce the matrix to an echelon form and count the number of nonzero rows. The statement above shows that we can also determine the rank by column reduction.
Determine the rank of the matrix \[A=\matrix{1 & 0 & 3 & -5 \\ 1 & 0 & 3 & -5 \\ 2 & 0 & 6 & -10 \\ }\]
#\text{rank}(A)=# #1#
With the aid of elementary row operations we reduce the matrix to reduced echelon form: \[ \begin{array}{rcl}A = \matrix{1&0&3&-5\\1&0&3&-5\\2&0&6&-10\\}&\sim\matrix{1&0&3&-5\\0&0&0&0\\2&0&6&-10\\}&{\color{blue}{\begin{array}{c}\phantom{x} R_2-R_1\phantom{x}\end{array}}}\\&\sim\matrix{1&0&3&-5\\0&0&0&0\\0&0&0&0\\}&{\color{blue}{\begin{array}{c}\phantom{x} R_3-2R_1\end{array}}}\end{array}\] Because the rank is the number of nonzero rows of this echelon form, we conclude that the rank of the matrix #A# equals #1#.
In the given solution we reduced the matrix #A# to the reduced echelon form, although it is sufficient to reduce #A# to any row echelon form.
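The computation can be reproduced in sympy (an assumption; any routine for Gaussian elimination would do), which also illustrates that column reduction, i.e. row reduction of #A^\top#, yields the same rank:
```python
from sympy import Matrix

A = Matrix([[1, 0, 3, -5],
            [1, 0, 3, -5],
            [2, 0, 6, -10]])

R, pivots = A.rref()              # reduced row echelon form and pivot columns
assert len(pivots) == 1           # one nonzero row: rank(A) = 1

# Column reduction: row reduce the transpose; the rank is the same.
assert len(A.T.rref()[1]) == 1
assert A.rank() == 1
```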