Injectivity and surjectivity are related to the kernel and the image of a linear mapping.
Let # L :V \rightarrow W# be a linear map.
- The mapping # L # is injective if and only if #\ker{L}= \{ \vec{0} \} #.
- The mapping # L # is surjective if and only if #\im{L}= W #.
Suppose furthermore that #V# has finite dimension #n#, and that #\basis{\vec{a}_1,\ldots,\vec{a}_n}# is a basis of #V#.
- The mapping # L # is injective if and only if #\basis{L(\vec{a}_1),\ldots,L(\vec{a}_n)}# is linearly independent.
- The mapping # L # is surjective if and only if #\linspan{L(\vec{a}_1),\ldots,L(\vec{a}_n)}= W #.
We first prove the statement about injectivity:
#\Leftarrow#: Suppose #\ker{L}= \{ \vec{0}\}#. We have to prove that # L( \vec{x}) = L( \vec{y})# implies # \vec{x} = \vec{y}# for all #\vec{x},\vec{y}\in V#. If # L (\vec{x}) = L( \vec{y})#, then #\vec{0} = L (\vec{x}) - L( \vec{y}) = L ( \vec{x} - \vec{y})#, so # \vec{x} - \vec{y}\in \ker{L}#. Because #\ker {L}# consists only of the zero vector, we conclude that # \vec{x} - \vec{y} =\vec{0}#, and so # \vec{x} = \vec{y}#.
#\Rightarrow#: Suppose #L# is injective. If # \vec{x}\in \ker{L}#, then # L( \vec{x}) = \vec{0} = L( \vec{0})#. From the injectivity of #L# it follows that # \vec{x} = \vec{0}#, so the only element of #\ker{L}# is the zero vector.
The statement about surjectivity follows directly from the definition of surjectivity: #L# is surjective if and only if each vector in #W# is the image of some vector in #V#, that is, an element of #\im{L}#. This holds precisely when #\im{L}=W#.
Now suppose, as in the second part of the theorem, that #V# has finite dimension #n# and that #\basis{\vec{a}_1,\ldots,\vec{a}_n}# is a basis of #V#.
A vector #\vec{x}\in V# belongs to #\ker{L}# if and only if #L\vec{x} = \vec{0}#. If we write #\vec{x}=x_1\cdot\vec{a}_1+\cdots+x_n\cdot\vec{a}_n# as a linear combination of the given basis and use the linearity of #L#, then we see that \[L\vec{x} =x_1\cdot L(\vec{a}_1)+\cdots+x_n\cdot L(\vec{a}_n)\] The kernel #\ker{L}# thus contains a vector distinct from #\vec{0}# if and only if there is a non-trivial relation \[x_1\cdot L(\vec{a}_1)+\cdots+x_n\cdot L(\vec{a}_n )=\vec{0}\] that is, if and only if #\basis{L(\vec{a}_1),\ldots,L(\vec{a}_n)}# is linearly dependent. In view of the criterion #\ker{L}=\{\vec{0}\}# for injectivity, this proves the statement about injectivity.
We are ready for the proof of the last statement. If #\basis{L(\vec{a}_1),\ldots,L(\vec{a}_n)}# spans the vector space #W#, then #\im{L}=W#, because #\im{L}=\linspan{L(\vec{a}_1),\ldots,L(\vec{a}_n)}#. Therefore, the mapping #L# is surjective.
For the proof of the converse, we assume that #L# is surjective, so #\im{L}=W# (by the criterion for surjectivity above). The theorem Image as spanned subspace then gives
\[ W=\im{L}=\linspan{L(\vec{a}_1),\ldots,L(\vec{a}_n)}\]
This proves the theorem.
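The criteria just proved can be illustrated with a small computation. The sketch below is plain Python; the map #L(x, y) = (x + y,\, x - y)# on #\mathbb{R}^2# and the helper `apply` are chosen by us purely for illustration. It checks that the images of the standard basis vectors are linearly independent, so by the criterion #L# is injective.

```python
# The linear map L(x, y) = (x + y, x - y) on R^2, given by its matrix.
A = [[1, 1],
     [1, -1]]

def apply(matrix, v):
    """Apply a 2x2 matrix to a vector (x, y)."""
    return [matrix[0][0]*v[0] + matrix[0][1]*v[1],
            matrix[1][0]*v[0] + matrix[1][1]*v[1]]

# Images of the standard basis vectors a1 = (1, 0), a2 = (0, 1).
La1 = apply(A, [1, 0])
La2 = apply(A, [0, 1])

# Two vectors in R^2 are linearly independent iff the determinant of
# the matrix with those vectors as columns is nonzero.
det = La1[0]*La2[1] - La1[1]*La2[0]
print(det != 0)  # True: the images are independent, so L is injective
```

Since two independent vectors in #\mathbb{R}^2# also span #\mathbb{R}^2#, the same computation shows, via the criterion for surjectivity, that this #L# is surjective.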
Using this result, we indicate when a linear mapping #V\rightarrow W# with #\dim{V}\lt\infty# has an inverse.
Let # L :V\rightarrow W# be a linear mapping and suppose #\dim{V}\lt\infty#.
- #L# is invertible if and only if #\dim{V}=\dim{W}# and #\ker{L}=\{\vec{0}\}#.
- #L# is invertible if and only if #\dim{\im{L}}=\dim{V}=\dim{W}#.
- Let #\basis{\vec{a}_1,\ldots,\vec{a}_n}# be a basis of #V#. The mapping #L# is invertible if and only if #\basis{L(\vec{a}_1),\ldots,L(\vec{a}_n)}# is a basis of #W#.
For this proof we will make use of the Dimension Theorem and the above Criteria for injectivity and surjectivity.
Proof of 1. If # L # has an inverse, then the mapping is a bijection and thus the mapping is injective and surjective. It follows from the above result that then #\ker{L} =\{\vec{0}\}# and #W={\im{L}}#. The Dimension theorem gives \[\begin{array}{rcl}\dim{V}&=&\dim{\ker{L}}+\dim{\im{L}}\\ &=& \dim{\{\vec{0}\}}+\dim{W}\\ &=& \dim{W}\end{array}\] so #\dim{V}=\dim{W}#. Invertibility of #L# therefore implies #\ker{L} =\{ \vec{0}\}# and #\dim{V}=\dim{W}#.
Conversely, these two facts imply that # L# has an inverse: from #\ker{L} = \{ \vec{0}\}# and the Dimension theorem it follows that #\dim{V} =\dim{\im{L}}#; together with #\dim{V}=\dim{W}# this gives #\dim{\im{L}} =\dim{W}#. Since #\im{L}# is a subspace of #W# of the same (finite) dimension as #W#, we conclude #\im{L} =W#. Because #\ker{L} =\{\vec{0}\}# and #\im{L} =W#, it follows from the above Criteria for injectivity and surjectivity that #L# is injective and surjective, and so it has an inverse.
Proof of 2. The first statement says that #L# is invertible if and only if # \dim{V}=\dim{W}# and #\ker{L}=\{\vec{0}\}#. The Dimension theorem says \[ \dim{V}=\dim{\ker{L}}+\dim{\im{L}}\]
Because #\ker{L}=\{\vec{0}\}# is equivalent to #\dim{\ker{L}}=0#, we conclude that #L# is invertible if and only if # \dim{V}=\dim{W}# and #\dim{V} =\dim{\im{L}}#.
Proof of 3. This follows directly from the above Criteria for injectivity and surjectivity because #L# is invertible if and only if #L# is injective and surjective, so if and only if #\basis{L(\vec{a}_1),\ldots,L(\vec{a}_n)}# is linearly independent and #\linspan{L(\vec{a}_1),\ldots,L(\vec{a}_n)}= W #. According to the definition Basis and dimension, the vectors #\basis{L(\vec{a}_1),\ldots,L(\vec{a}_n)}# are a basis of #W# if and only if they satisfy the last two conditions.
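These criteria can be checked on concrete matrices. Below is a sketch in plain Python (the helpers `det2` and `apply2` and the two matrices are ad-hoc choices of ours, not from a library), contrasting an invertible map on #\mathbb{R}^2# with a non-invertible one.

```python
def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def apply2(M, v):
    """Apply a 2x2 matrix to a vector."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

# An invertible map: det != 0, so the kernel is trivial and the
# images of the standard basis form a basis of R^2.
A = [[2, 1],
     [1, 1]]
assert det2(A) != 0

# A non-invertible map: the second column is twice the first, so the
# images of the standard basis are dependent and the kernel is non-trivial.
B = [[1, 2],
     [2, 4]]
assert det2(B) == 0
assert apply2(B, [2, -1]) == [0, 0]   # (2, -1) lies in ker L_B
```

Here #\dim{V}=\dim{W}=2# in both cases, so by the first criterion invertibility hinges entirely on whether the kernel is trivial.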
Verify that, in the first criterion of this theorem, the condition #\ker{L}=\{\vec{0}\}# may be replaced by the condition #\im{L}=W#.
The condition #\dim{V}\lt\infty# is necessary, as is evident from the following example: Let #V = W# be the vector space of all polynomials in #x#. The mapping #L:V\to V# which assigns to each polynomial #p(x)# the polynomial # x\cdot p(x)#, is linear and injective; so here the conditions #\dim{V}=\dim{W}=\infty# and #\ker{L}=\{\vec{0}\}# are satisfied. But there is no polynomial #q(x)# in #V# with #L(q(x))=1#, so #L# is not surjective and hence not invertible.
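This counterexample can be made concrete by representing a polynomial as its list of coefficients; the encoding and the name `L` below are our own choices for illustration. The sketch shows that multiplication by #x# is injective but misses the constant polynomial #1#.

```python
# Represent a polynomial c0 + c1*x + ... + ck*x^k by its
# coefficient list [c0, c1, ..., ck].
def L(p):
    """Multiplication by x: every coefficient shifts up one degree."""
    return [0] + p

p = [3, 0, 5]                 # the polynomial 3 + 5x^2
assert L(p) == [0, 3, 0, 5]   # x * (3 + 5x^2) = 3x + 5x^3

# Injective: p can be recovered from L(p), so ker L = {0}.
assert L(p)[1:] == p

# Not surjective: every polynomial in the image has constant term 0,
# so no q satisfies L(q) = 1 (the list [1]).
assert L(p)[0] == 0
```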
Let #A# be an #(m\times n)#-matrix. We look at the significance of the theorem for the linear mapping #L_A:\mathbb{R}^n\to \mathbb{R}^m# determined by #A#.
Previously we saw that, if #A# is invertible, the linear mapping #L_A# is invertible, with inverse #L_{A^{-1}}#. This forces #m=n#, in accordance with the first two criteria for invertibility.
Suppose #L_A# is invertible. Then there is a linear mapping #L^{-1}:\mathbb{R}^m\to\mathbb{R}^n# with the property #L_A \,L^{-1} = I_m#, where #I_m# is the identity mapping on #\mathbb{R}^m#. According to the last statement of the theorem Linear map determined by the image of a basis, there is an #(n\times m)#-matrix #B#, such that #L^{-1} = L_B#. Using #L_A \,L^{-1} = I_m# we now find
\[L_{A\,B} = L_A \,L_B = L_A \,L^{-1} =I_m\]
This can only hold if #A\,B = I_m#. Since #m=n#, it follows that the matrix #A# is invertible with inverse #B#.
The conclusion is that #L_A# is invertible if and only if #A# is invertible. The criteria come down to the fact that this is the case if and only if #m=n# and the columns of #A# are linearly independent. As we will see later, this means that the rank of #A# is equal to both #m# and #n#.
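Concretely, the matrix #B# above can be computed. The sketch below is plain Python with exact `Fraction` arithmetic from the standard library; the helpers `matmul2` and `inv2` and the chosen matrix are ours. It inverts a #(2\times 2)#-matrix via the adjugate formula and verifies #A\,B=I#.

```python
from fractions import Fraction  # exact arithmetic for the inverse

def matmul2(A, B):
    """Product of two 2x2 matrices."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def inv2(A):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    d = Fraction(A[0][0]*A[1][1] - A[0][1]*A[1][0])   # determinant, nonzero
    return [[ A[1][1]/d, -A[0][1]/d],
            [-A[1][0]/d,  A[0][0]/d]]

A = [[2, 1],
     [1, 1]]
B = inv2(A)
# L_A composed with L_B is the identity, i.e. A*B = I:
assert matmul2(A, B) == [[1, 0], [0, 1]]
```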
An important special case occurs when #V# and #W# have equal finite dimension:
For a linear mapping #L: V\to W# between two vector spaces #V# and #W# with equal finite dimensions #\dim{V}=\dim{W}\lt\infty#, the following statements are equivalent:
- #L# is invertible
- #L# is injective
- #L# is surjective
It is known that #L# is injective and surjective if and only if it is bijective, and that #L# is bijective if and only if it is invertible. It is therefore sufficient to prove that #L# is injective if and only if it is surjective.
Suppose that #L# is injective. According to the criterion for injectivity, then #\ker{L}=\{\vec{0}\}#, which is equivalent to #\dim{\ker{L}}=0#. By the Dimension theorem,
\[
\dim{V}=\dim{\ker{L}}+\dim{\im{L}}
\] so #\dim{V}=\dim{\im{L}}#. In our case, #\dim{V}=\dim{W}#, so #\dim{\im{L}}=\dim{W}#. Because #\im{L}# is a subspace of #W#, it follows that #\im{L}=W#. The criterion for surjectivity implies that #L# is surjective.
Reading the above reasoning backwards, we also see that the assumption that #L# is surjective implies that #L# is injective.
This proves the theorem.
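For a map between spaces of equal finite dimension, surjectivity thus comes for free with injectivity. As a small illustration (plain Python; the map and the helper `preimage` are chosen by us), the injective map #L(x,y)=(x+y,\,x-y)# on #\mathbb{R}^2# is indeed surjective: a preimage of any #(b_1,b_2)# can be written down explicitly.

```python
# For L(x, y) = (x + y, x - y) on R^2 the kernel is trivial, so by
# the theorem L must also be surjective. Indeed, for any target
# b = (b1, b2) we can solve L(x, y) = b explicitly:
def preimage(b1, b2):
    # From x + y = b1 and x - y = b2: x = (b1+b2)/2, y = (b1-b2)/2.
    return ((b1 + b2) / 2, (b1 - b2) / 2)

x, y = preimage(5, 1)
assert (x + y, x - y) == (5, 1)
```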
The assumption that #V# and #W# have equal finite dimension is necessary:
- If #V = W=P#, the vector space of all polynomials in #x#, then differentiation, the mapping #L = \frac{\dd}{\dd x}:P\to P#, is a surjective linear map with #\ker{L}=\linspan{1}#, so the mapping #L# is not injective. Here, the condition that the dimension of #V# be finite is not satisfied.
- If #V = \mathbb{R}^n# and #W = \mathbb{R}^m# and #L:V\to W# is an injective linear mapping, then, by the Dimension theorem, #\dim{\im{L}}=n#. In this case #L# is surjective only if #m =n#.
- If #V = \mathbb{R}^n# and #W = \mathbb{R}^m# and #L:V\to W# is a surjective linear mapping, then, according to the Dimension theorem, #\dim{\ker{L}}=n-m#. In this case #L# is injective only if #m =n#.
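The differentiation example from the first bullet can also be made concrete with coefficient lists (plain Python; the encoding and the helper name `D` are our own): the map is surjective, but its kernel contains all constant polynomials.

```python
# Represent c0 + c1*x + ... + ck*x^k by the coefficient list [c0, c1, ..., ck].
def D(p):
    """Differentiation: the term c_i * x^i contributes i * c_i * x^(i-1)."""
    return [i * c for i, c in enumerate(p)][1:]

# Not injective: every constant polynomial maps to the zero
# polynomial (the empty list), so ker D = span(1).
assert D([7]) == [] and D([3]) == []

# Surjective: every q has an antiderivative with constant term 0.
q = [1, 4]                                          # 1 + 4x
P = [0] + [c / (i + 1) for i, c in enumerate(q)]    # x + 2x^2
assert D(P) == q
```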