Previously we have seen some invertibility criteria for linear maps. Thanks to the theorem Linear map determined by the image of a basis, these also provide invertibility criteria for matrices. We will add another criterion, in terms of the rank.
Let $n$ be a natural number. For each $(n\times n)$-matrix $A$ the following statements are equivalent:

1. The rank of $A$ is $n$.
2. The rows of $A$ are independent.
3. The columns of $A$ are independent.
4. The reduced echelon form of $A$ is the identity matrix.
5. The matrix $A$ is invertible.
$1\Rightarrow 2$ and $1\Rightarrow 3$: If the rank of $A$ is equal to $n$, then both the rows as well as the columns span an $n$-dimensional space, and this means they are independent.
$2\Rightarrow 1$ and $3\Rightarrow 1$: If the rows (or columns) are independent, then the rows (or columns) span an $n$-dimensional space, and the rank of the matrix is equal to $n$. Moreover, the column space (or row space) is then also $n$-dimensional and hence the columns (or rows) are independent.
Hence, statements 1, 2 and 3 are equivalent.
$1\Leftrightarrow 4$: The matrix $A$ and the reduced echelon form of $A$ have the same rank. From the structure of the reduced echelon form, we can see right away that an $(n\times n)$-matrix in reduced echelon form has rank $n$ if and only if it is the identity matrix.
$1\Leftrightarrow 5$: By the theorem Linear map determined by the image of a basis, $A$ is invertible if and only if the columns of $A$ form a basis of $\mathbb{R}^n$, meaning, if and only if the rank of $A$ is equal to $n$.
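As a numerical illustration (not part of the proof), the equivalent criteria can be checked with NumPy on a concrete matrix; the $3\times 3$ matrix below is an arbitrary choice of ours:

```python
import numpy as np

# An arbitrary 3x3 example matrix (illustrative; invertible, as verified below).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# Statement 1: the rank equals n.
assert np.linalg.matrix_rank(A) == n

# Statements 2 and 3: row space and column space are both n-dimensional,
# so the rows and the columns are independent.
assert np.linalg.matrix_rank(A.T) == n

# Statement 5: A is invertible; A times its inverse is the identity.
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(n))
```

Changing any row of the example into a linear combination of the other two drops the rank below $n$ and makes every assertion about invertibility fail, in line with the theorem.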
The given list can be supplemented as follows:

6. The linear map $L_A:\mathbb{R}^n\to\mathbb{R}^n$ determined by $A$ is injective.
7. The linear map $L_A$ is surjective.
8. $\ker\left(L_A\right)=\{\mathbf{0}\}$.
9. $\mathrm{im}\left(L_A\right)=\mathbb{R}^n$.
10. $\det(A)\neq 0$.
According to the theorem Invertibility with the same dimensions for domain and codomain, statements 6 and 7 are equivalent to statement 5.
According to the Criteria for injectivity and surjectivity, statements 6 and 7 are equivalent to statements 8 and 9, respectively. Besides, statements 8 and 9 are equivalent to each other in accordance with the rank-nullity theorem, which gives $\dim\left(\ker\left(L_A\right)\right)+\dim\left(\mathrm{im}\left(L_A\right)\right)=n$. According to Rank is dimension column space, the equation $\dim\left(\mathrm{im}\left(L_A\right)\right)=n$ is equivalent to $\text{rank}(A)=n$, which confirms the equivalence of statements 1 and 9 directly.
The equivalence of statements 10 and 5 will be proven later in Invertibility in terms of determinant. In that statement we will also prove $\det\left(A^{-1}\right)=\det(A)^{-1}$ on the condition that $A^{-1}$ exists. Together with statement 10 it then follows that every statement in the list for $A$ is equivalent to the corresponding statement for $A^{-1}$, if the inverse exists.
Later in Determinant of transpose and product we prove $\det\left(A^{\top}\right)=\det(A)$. Together with statement 10 it then follows that every statement in the list for $A$ is equivalent to the corresponding statement for $A^{\top}$.
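These two determinant facts can be spot-checked numerically; the following sketch uses an arbitrary invertible $2\times 2$ matrix of ours and NumPy:

```python
import numpy as np

# Arbitrary invertible 2x2 example (not from the text): det = 2*3 - 1*5 = 1.
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

# det(A^T) = det(A): transposing leaves the determinant unchanged.
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))

# Consequently A is invertible exactly when A^T is; both have full rank here.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 2

# det(A^{-1}) = det(A)^{-1}, on the condition that the inverse exists.
A_inv = np.linalg.inv(A)
assert np.isclose(np.linalg.det(A_inv), 1.0 / np.linalg.det(A))
```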
According to the dependence criterion, statements 2 and 3 are equivalent to the statements that there does not exist a non-trivial relation between the rows, respectively the columns, of $A$.
A $(2\times 2)$-matrix $A=\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ satisfies the conditions if and only if $ad-bc\neq 0$.
Let $A=\begin{pmatrix} a & b \\ c & d \end{pmatrix}$. The columns of $A$ are linearly dependent if and only if there is a non-trivial linear combination of them resulting in the zero vector, hence if there are scalars $x$ and $y$, not both equal to $0$, such that
\[ x\begin{pmatrix} a \\ c \end{pmatrix}+y\begin{pmatrix} b \\ d \end{pmatrix}=\begin{pmatrix} 0 \\ 0 \end{pmatrix}, \]
that is,
\[ ax+by=0 \quad\text{and}\quad cx+dy=0. \]
Multiplying the first equation by $d$, the second by $b$, and subtracting gives $(ad-bc)x=0$, so $x=0$ or $ad-bc=0$. Just as: $y=0$ or $ad-bc=0$. Since $x$ and $y$ are not both $0$, a non-trivial combination forces $ad-bc=0$. Conversely, if $ad-bc=0$, then $x=d$, $y=-c$ (or $x=-b$, $y=a$ if $c=d=0$) gives such a non-trivial combination, unless $A$ is the zero matrix, whose columns are dependent anyway.
Conclusion: The columns of $A$ are linearly independent if and only if $ad-bc\neq 0$. This condition is the same for the columns as for the rows, since interchanging the roles of rows and columns replaces $ad-bc$ by $ad-cb$, which is the same expression. This explains that the rank of $A$ is equal to $2$ if and only if the rank of $A^{\top}$ is equal to $2$. This shows once again that both statements 2 and 3 are equivalent with statement 1.
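The $2\times 2$ conclusion can be verified numerically; in the sketch below, the helper `columns_independent` and the sample entries are our own illustrative choices:

```python
import numpy as np

def columns_independent(a, b, c, d):
    """Test, via the rank, whether the columns of [[a, b], [c, d]] are independent."""
    return np.linalg.matrix_rank(np.array([[a, b], [c, d]], dtype=float)) == 2

# ad - bc != 0: independent columns (1*4 - 2*3 = -2).
assert columns_independent(1, 2, 3, 4)

# ad - bc == 0: the second column is a multiple of the first (1*4 - 2*2 = 0).
assert not columns_independent(1, 2, 2, 4)

# The rows behave the same way: the rank of the transpose is also below 2.
assert np.linalg.matrix_rank(np.array([[1.0, 2.0], [2.0, 4.0]]).T) < 2
```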
Statement 5 can even be illustrated by means of the following, easy to calculate, formula:
\[ \begin{pmatrix} a & b \\ c & d \end{pmatrix}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}=(ad-bc)\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}. \]
If $ad-bc\neq 0$, then $\frac{1}{ad-bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$ is the inverse of $A$. Now assume that $ad-bc=0$. If $A$ is the zero matrix, then $A$ is not invertible. If $A$ is unequal to the zero matrix, then some column of $\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$ is unequal to $\begin{pmatrix} 0 \\ 0 \end{pmatrix}$ and, by the formula, belongs to the kernel of $A$, so $A$ is not invertible. We conclude that $A$ is invertible if and only if $ad-bc\neq 0$.
The expression $ad-bc$ is known as the determinant of $A$, which we will see later.
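The product formula discussed here, a $2\times 2$ matrix times the matrix of swapped and negated entries yielding $ad-bc$ times the identity, is easy to check numerically; the entries below are arbitrary choices of ours:

```python
import numpy as np

# Arbitrary entries (not from the text) with ad - bc = 3*2 - 1*4 = 2.
a, b, c, d = 3.0, 1.0, 4.0, 2.0
A = np.array([[a, b], [c, d]])
B = np.array([[d, -b], [-c, a]])  # the second matrix in the formula

# The product equals (ad - bc) times the identity matrix.
assert np.allclose(A @ B, (a * d - b * c) * np.eye(2))

# Hence, for ad - bc != 0, dividing B by ad - bc yields the inverse of A.
assert np.allclose(B / (a * d - b * c), np.linalg.inv(A))
```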
Is the following matrix invertible?
Yes
We will approach this just like inverting a matrix: we augment the matrix with an identity matrix and apply Gaussian elimination:
The left-hand matrix of the result has rank 3. Hence, the answer is: Yes.
Row reduction of the matrix augmented with the $(3\times 3)$-identity matrix not only shows that it is invertible, but also that the inverse is equal to the right-hand $(3\times 3)$-matrix of the result.
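The augment-and-reduce procedure described above can be sketched as a small Gauss-Jordan routine; the function and the $3\times 3$ example matrix below are our own illustrations, not taken from the exercise:

```python
import numpy as np

def invert_by_row_reduction(A):
    """Gauss-Jordan elimination on [A | I]; returns the inverse, or None if A is singular."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # augment with the identity
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))  # partial pivoting
        if np.isclose(M[pivot, col], 0.0):
            return None                           # rank < n: not invertible
        M[[col, pivot]] = M[[pivot, col]]         # swap the pivot row into place
        M[col] /= M[col, col]                     # scale the pivot to 1
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]        # clear the rest of the column
    return M[:, n:]   # left half is now I; right half is the inverse

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
A_inv = invert_by_row_reduction(A)
assert np.allclose(A @ A_inv, np.eye(3))
```

If elimination stalls because no nonzero pivot is available, the left half cannot reach the identity, the rank is below $3$, and the routine reports that the matrix is not invertible, matching statement 4 of the theorem.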