The Invertible Matrix Theorem
The Invertible Matrix Theorem collects a large number of equivalent conditions for a square matrix to be invertible. It is a grand unifying result that connects many concepts from linear algebra.
Statement
Let $A$ be an $n \times n$ matrix over a field $F$. The following are equivalent:
(a) $A$ is invertible.
(b) $A$ is row equivalent to $I_n$ (the RREF of $A$ is $I_n$).
(c) $A$ has $n$ pivot positions.
(d) The equation $Ax = 0$ has only the trivial solution ($x = 0$).
(e) The columns of $A$ are linearly independent.
(f) The linear transformation $x \mapsto Ax$ is injective.
(g) The equation $Ax = b$ has a solution for every $b \in F^n$.
(h) The columns of $A$ span $F^n$.
(i) The linear transformation $x \mapsto Ax$ is surjective.
(j) $x \mapsto Ax$ is an isomorphism (bijective).
(k) There exists an $n \times n$ matrix $C$ with $CA = I_n$ (left inverse).
(l) There exists an $n \times n$ matrix $D$ with $AD = I_n$ (right inverse).
(m) $A^T$ is invertible.
(n) The columns of $A$ form a basis for $F^n$.
(o) $\operatorname{rank} A = n$.
(p) $\operatorname{nullity} A = 0$.
(q) $\det A \neq 0$ (see Chapter 4).
(r) $0$ is not an eigenvalue of $A$ (see Chapter 5).
(s) The row vectors of $A$ are linearly independent.
(t) The row vectors of $A$ form a basis for $F^n$.
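Several of these conditions can be checked numerically. A sketch using NumPy, with an arbitrary invertible $2 \times 2$ matrix chosen for illustration:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # arbitrary invertible example

invertible = not np.isclose(np.linalg.det(A), 0.0)                 # (q) det != 0
rank_full = np.linalg.matrix_rank(A) == A.shape[0]                 # (o) rank = n
zero_not_eig = not np.any(np.isclose(np.linalg.eigvals(A), 0.0))   # (r) 0 not an eigenvalue

# The theorem says these tests must agree.
assert invertible == rank_full == zero_not_eig

# (a): the inverse exists and A @ A_inv = I
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))
```

Any one test settles all the others; that is the practical content of the theorem.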
Examples illustrating the equivalences
Let $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$.
- (a): $A^{-1} = -\frac{1}{2}\begin{pmatrix} 4 & -2 \\ -3 & 1 \end{pmatrix} = \begin{pmatrix} -2 & 1 \\ 3/2 & -1/2 \end{pmatrix}$.
- (b): RREF is $I_2$.
- (c): Two pivots.
- (d): $Ax = 0 \Rightarrow x = 0$.
- (e): $\begin{pmatrix} 1 \\ 3 \end{pmatrix}$ and $\begin{pmatrix} 2 \\ 4 \end{pmatrix}$ are independent.
- (o): $\operatorname{rank} A = 2$.
- (q): $\det A = 1 \cdot 4 - 2 \cdot 3 = -2 \neq 0$.
Let $B = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$.
- (a): Not invertible.
- (b): RREF is $\begin{pmatrix} 1 & 2 \\ 0 & 0 \end{pmatrix} \neq I_2$.
- (c): Only one pivot.
- (d): $B\begin{pmatrix} -2 \\ 1 \end{pmatrix} = 0$ (nontrivial solution).
- (e): $\begin{pmatrix} 2 \\ 4 \end{pmatrix} = 2\begin{pmatrix} 1 \\ 2 \end{pmatrix}$ (dependent).
- (g): $Bx = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ has no solution ($\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ is not in the column space).
- (o): $\operatorname{rank} B = 1 < 2$.
- (q): $\det B = 1 \cdot 4 - 2 \cdot 2 = 0$.
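The singular case can be checked the same way; here is a sketch with a sample rank-1 matrix (chosen for illustration):

```python
import numpy as np

B = np.array([[1.0, 2.0], [2.0, 4.0]])  # second row = 2 * first row, so rank 1

assert np.isclose(np.linalg.det(B), 0.0)   # (q) det = 0
assert np.linalg.matrix_rank(B) == 1       # (o) rank < 2

# (d): a nontrivial null vector
x = np.array([-2.0, 1.0])
assert np.allclose(B @ x, 0.0)

# (a): np.linalg.inv raises LinAlgError on a singular matrix
try:
    np.linalg.inv(B)
    inverted = True
except np.linalg.LinAlgError:
    inverted = False
assert not inverted
```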
Let $C = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix}$.
Row reduce: $R_2 \to R_2 - 4R_1$, $R_3 \to R_3 - 7R_1$: $\begin{pmatrix} 1 & 2 & 3 \\ 0 & -3 & -6 \\ 0 & -6 & -12 \end{pmatrix}$. $R_3 \to R_3 - 2R_2$: $\begin{pmatrix} 1 & 2 & 3 \\ 0 & -3 & -6 \\ 0 & 0 & 0 \end{pmatrix}$.
Only 2 pivots, $\operatorname{rank} C = 2 < 3$. Not invertible by (c) and (o).
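Counting pivots by row reduction can be sketched in code as well; the helper `count_pivots` below is a minimal illustration (Gaussian elimination with partial pivoting, not a production-grade routine):

```python
import numpy as np

def count_pivots(M, tol=1e-10):
    """Count pivot positions via Gaussian elimination with partial pivoting."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    pivots = 0
    r = 0
    for c in range(cols):
        if r >= rows:
            break
        # pick the largest entry in column c at or below row r
        p = r + np.argmax(np.abs(M[r:, c]))
        if abs(M[p, c]) < tol:
            continue  # no pivot in this column
        M[[r, p]] = M[[p, r]]                                # swap rows
        M[r+1:] -= np.outer(M[r+1:, c] / M[r, c], M[r])      # eliminate below the pivot
        pivots += 1
        r += 1
    return pivots

C = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])  # sample 3x3 rank-2 matrix
assert count_pivots(C) == 2                       # rank 2: singular
assert count_pivots(np.eye(3)) == 3               # full rank: invertible
```

The number of pivots equals the rank, so condition (c) and condition (o) are the same test.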
For small matrices, computing the determinant is the fastest test:
- $\det \begin{pmatrix} 2 & 4 \\ 1 & 2 \end{pmatrix} = 2 \cdot 2 - 4 \cdot 1 = 0$. Singular.
- $\det \begin{pmatrix} 2 & 4 \\ 1 & 3 \end{pmatrix} = 2 \cdot 3 - 4 \cdot 1 = 2 \neq 0$. Invertible.
This follows from $(A^T)^{-1} = (A^{-1})^T$. Alternatively: $\operatorname{rank} A^T = \operatorname{rank} A$ (row rank = column rank), so $A^T$ has rank $n$ iff $A$ does.
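Both facts are easy to sanity-check numerically (the matrix here is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # any invertible matrix works

# (inv(A))^T is the inverse of A^T
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)

# row rank = column rank, so A and A^T always have the same rank
assert np.linalg.matrix_rank(A.T) == np.linalg.matrix_rank(A)
```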
$AB$ is invertible iff both $A$ and $B$ are invertible. If $A$ or $B$ is singular, then $\det(AB) = (\det A)(\det B) = 0$, so $AB$ is singular.
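A quick numerical check of the product rule (matrices chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# A singular factor forces a singular product.
S = np.ones((3, 3))                      # rank 1, hence singular
assert np.isclose(np.linalg.det(A @ S), 0.0)
assert np.linalg.matrix_rank(A @ S) < 3
```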
A diagonal matrix $D = \operatorname{diag}(d_1, \ldots, d_n)$ is invertible iff all $d_i \neq 0$. The rank is the number of nonzero diagonal entries.
An upper (or lower) triangular matrix is invertible iff all diagonal entries are nonzero. The determinant equals the product of the diagonal entries.
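A quick check of the triangular case (the entries below are an illustrative choice):

```python
import numpy as np

T = np.array([[2.0, 5.0, 1.0],
              [0.0, 3.0, 7.0],
              [0.0, 0.0, 4.0]])

# det of a triangular matrix = product of the diagonal entries
assert np.isclose(np.linalg.det(T), 2.0 * 3.0 * 4.0)

# Zeroing one diagonal entry makes the matrix singular.
T[1, 1] = 0.0
assert np.isclose(np.linalg.det(T), 0.0)
```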
$A$ is invertible iff its columns form a basis for $F^n$. This means: every vector in $F^n$ can be expressed uniquely as a linear combination of the columns. The system $Ax = b$ always has a unique solution, $x = A^{-1}b$.
$A$ is singular iff $0$ is an eigenvalue: $Ax = 0 \cdot x = 0$ for some $x \neq 0$, meaning $\operatorname{Nul} A \neq \{0\}$. The characteristic polynomial $p(\lambda) = \det(A - \lambda I)$ vanishes at $\lambda = 0$ iff $\det A = 0$.
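The eigenvalue criterion can be verified directly (the sample matrices below are illustrative):

```python
import numpy as np

# A rank-deficient matrix must have 0 among its eigenvalues.
B = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1
eigs = np.linalg.eigvals(B)
assert np.any(np.isclose(eigs, 0.0))

# Characteristic polynomial at 0: p(0) = det(A - 0*I) = det(A)
A = np.array([[1.0, 2.0], [3.0, 4.0]])   # invertible
assert np.isclose(np.linalg.det(A - 0.0 * np.eye(2)), np.linalg.det(A))
assert not np.any(np.isclose(np.linalg.eigvals(A), 0.0))
```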
For square matrices, the existence of a left inverse ($CA = I$) automatically implies the existence of a right inverse (and they are equal). This is a consequence of the finite-dimensionality of $F^n$: injectivity implies surjectivity for square matrices.
For non-square matrices, left and right inverses can exist independently and are generally different.
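For a square invertible matrix this can be observed numerically: a sketch that constructs a left inverse by solving a linear system and confirms it is also a right inverse (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # an arbitrary invertible matrix

# Find C with C A = I by solving the transposed system A^T C^T = I.
C = np.linalg.solve(A.T, np.eye(2)).T
assert np.allclose(C @ A, np.eye(2))    # left inverse...
assert np.allclose(A @ C, np.eye(2))    # ...is automatically a right inverse too
```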
A matrix such as $A = \begin{pmatrix} 1 & 1 \\ 1 & 1.0001 \end{pmatrix}$ is technically invertible ($\det A = 10^{-4} \neq 0$), but it is ill-conditioned: small changes in the input produce large changes in the solution. The condition number $\kappa(A) = \|A\| \, \|A^{-1}\|$ measures this sensitivity.
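The sensitivity can be demonstrated with `np.linalg.cond` and a perturbed right-hand side (the nearly singular matrix below is an illustrative choice):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])           # det = 1e-4: invertible, but barely

print(np.linalg.cond(A))                # very large condition number

b1 = np.array([2.0, 2.0])
b2 = np.array([2.0, 2.0001])            # tiny perturbation of b1

x1 = np.linalg.solve(A, b1)
x2 = np.linalg.solve(A, b2)
# A tiny change in b produces a large change in the solution.
print(np.linalg.norm(x2 - x1) / np.linalg.norm(b2 - b1))
```

Here the solution moves by an amount thousands of times larger than the perturbation in $b$, even though the system is solvable in exact arithmetic.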
Proof sketch
The equivalences are proved by establishing a cycle of implications:
(a) $\Rightarrow$ (d): $Ax = 0 \Rightarrow x = A^{-1}(Ax) = A^{-1}0 = 0$.
(d) $\iff$ (e) $\iff$ (f) $\iff$ (p): The kernel is trivial iff the columns are independent iff $x \mapsto Ax$ is injective iff $\operatorname{nullity} A = 0$.
(f) $\iff$ (i) $\iff$ (j): For linear maps $F^n \to F^n$, injective $\iff$ surjective $\iff$ bijective (by Rank-Nullity).
(j) $\Rightarrow$ (a): A bijective linear map has a linear inverse.
(d) $\iff$ (b) $\iff$ (c): The null space is trivial iff the RREF has $n$ pivots iff the RREF is $I_n$.
The determinant characterization (q) follows from the properties of determinants (Chapter 4).
The Invertible Matrix Theorem grows as we learn new concepts. The eigenvalue condition (r) is added in Chapter 5. After introducing determinants, condition (q) provides the most concise scalar criterion for invertibility.