Cofactor Expansion
Cofactor expansion (also called Laplace expansion) is the standard recursive method for computing determinants. It expands the determinant along any row or column, reducing an $n \times n$ determinant to a sum of $n$ determinants of size $(n-1) \times (n-1)$.
Definitions
Let $A = (a_{ij})$ be an $n \times n$ matrix.
The $(i,j)$-minor $M_{ij}$ is the determinant of the $(n-1) \times (n-1)$ submatrix obtained by deleting row $i$ and column $j$ from $A$.
The $(i,j)$-cofactor is $C_{ij} = (-1)^{i+j} M_{ij}$.
The sign pattern $(-1)^{i+j}$ follows a checkerboard:
$$\begin{pmatrix} + & - & + & \cdots \\ - & + & - & \cdots \\ + & - & + & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$
Expansion along row $i$: $\det A = \sum_{j=1}^{n} a_{ij} C_{ij}$.
Expansion along column $j$: $\det A = \sum_{i=1}^{n} a_{ij} C_{ij}$.
Both give the same result regardless of which row or column is chosen.
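The formulas above translate directly into a short recursive routine. A minimal Python sketch (function names are illustrative, not from the text) that expands along the first row:

```python
def minor(A, i, j):
    """Submatrix of A with row i and column j deleted (0-indexed)."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    # det A = sum_j (-1)^(1+j) a_1j M_1j  (written 0-indexed below)
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(n))

print(det([[1, 2], [3, 4]]))                     # -2
print(det([[2, 0, 1], [1, 3, 2], [0, 1, 4]]))    # 21
```

By the theorem, expanding along any other row or column would return the same values.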
Examples
Take, for instance,
$$A = \begin{pmatrix} 2 & 1 & 3 \\ 0 & 4 & 1 \\ 1 & 2 & 5 \end{pmatrix}.$$
Expanding along row 1:
$$\det A = 2\begin{vmatrix} 4 & 1 \\ 2 & 5 \end{vmatrix} - 1\begin{vmatrix} 0 & 1 \\ 1 & 5 \end{vmatrix} + 3\begin{vmatrix} 0 & 4 \\ 1 & 2 \end{vmatrix} = 2(18) - 1(-1) + 3(-4) = 25.$$
Column 1 has a zero entry at position $(2,1)$, so one term vanishes:
$$\det A = 2\begin{vmatrix} 4 & 1 \\ 2 & 5 \end{vmatrix} - 0 + 1\begin{vmatrix} 1 & 3 \\ 4 & 1 \end{vmatrix} = 2(18) + 1(-11) = 25.$$
Same answer. Choosing a row or column with zeros saves work.
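The zero-hunting heuristic can be automated: a sketch that expands along whichever row has the most zeros and skips the vanishing terms (the helper names are mine, not from the text):

```python
def minor(A, i, j):
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det_best_row(A):
    """Cofactor expansion along the row with the most zero entries."""
    n = len(A)
    if n == 1:
        return A[0][0]
    i = max(range(n), key=lambda r: A[r].count(0))  # fewest surviving terms
    return sum((-1) ** (i + j) * A[i][j] * det_best_row(minor(A, i, j))
               for j in range(n) if A[i][j] != 0)

print(det_best_row([[2, 1, 3], [0, 4, 1], [1, 2, 5]]))  # 25
```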
Zeros pay off even more in larger matrices. Take
$$B = \begin{pmatrix} 3 & 0 & 0 & 2 \\ 5 & 0 & 0 & 1 \\ 2 & 7 & 3 & 5 \\ 1 & 0 & 4 & 6 \end{pmatrix}.$$
Expand along column 2 (three zeros): only row 3 contributes.
$$\det B = (-1)^{3+2} \cdot 7 \cdot \begin{vmatrix} 3 & 0 & 2 \\ 5 & 0 & 1 \\ 1 & 4 & 6 \end{vmatrix}$$
Expand this along column 2: only row 3 contributes.
$$\det B = -7 \cdot (-1)^{3+2} \cdot 4 \cdot \begin{vmatrix} 3 & 2 \\ 5 & 1 \end{vmatrix} = -7 \cdot (-4)(3 - 10) = -7 \cdot 28 = -196.$$
For an upper triangular matrix, expanding along column 1 repeatedly:
$$\det \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ 0 & a_{22} & \cdots & a_{2n} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & a_{nn} \end{pmatrix} = a_{11} \begin{vmatrix} a_{22} & \cdots & a_{2n} \\ & \ddots & \vdots \\ 0 & \cdots & a_{nn} \end{vmatrix} = \cdots = a_{11} a_{22} \cdots a_{nn}.$$
Generalizing: the determinant of any triangular matrix (upper or lower) is the product of its diagonal entries.
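A quick sanity check of the triangular rule, using a small cofactor-expansion routine (illustrative code, not from the text):

```python
from math import prod

def det(A):
    """Cofactor expansion along column 1 -- the strategy used above."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** i * A[i][0] * det([r[1:] for k, r in enumerate(A) if k != i])
               for i in range(len(A)))

U = [[2, 7, 1],
     [0, 3, 5],
     [0, 0, 4]]
print(det(U))                                  # 24
print(prod(U[i][i] for i in range(3)))         # 24, the diagonal product
```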
For a block upper triangular matrix $M = \begin{pmatrix} A & B \\ 0 & C \end{pmatrix}$, where $A$ is $k \times k$ and $C$ is $m \times m$:
$$\det M = (\det A)(\det C).$$
This generalizes the triangular-matrix result.
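The block rule is easy to check numerically. In this sketch the diagonal blocks are $2 \times 2$ and the upper-right block $B$ is arbitrary (made-up values for illustration):

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([r[:j] + r[j + 1:] for r in A[1:]])
               for j in range(len(A)))

A = [[1, 2], [3, 4]]   # det A = -2
C = [[5, 1], [2, 3]]   # det C = 13
B = [[9, 9], [9, 9]]   # arbitrary: does not affect the determinant
M = [A[0] + B[0], A[1] + B[1], [0, 0] + C[0], [0, 0] + C[1]]
print(det(M), det(A) * det(C))  # both -26
```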
The $n \times n$ Vandermonde matrix with nodes $x_1, \dots, x_n$:
$$V = \begin{pmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{n-1} \\ 1 & x_2 & x_2^2 & \cdots & x_2^{n-1} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^{n-1} \end{pmatrix}$$
has $\det V = \prod_{1 \le i < j \le n} (x_j - x_i)$. So $V$ is invertible iff all the $x_i$ are distinct.
For $n = 3$: $\det V = (x_2 - x_1)(x_3 - x_1)(x_3 - x_2)$.
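The product formula can be tested against direct expansion; a sketch assuming the row convention $(1, x_i, \dots, x_i^{n-1})$ used above:

```python
from itertools import combinations
from math import prod

def det(A):
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([r[:j] + r[j + 1:] for r in A[1:]])
               for j in range(len(A)))

def vandermonde(xs):
    """Row i is (1, x_i, x_i^2, ..., x_i^(n-1))."""
    n = len(xs)
    return [[x ** k for k in range(n)] for x in xs]

xs = [1, 2, 4]
lhs = det(vandermonde(xs))
rhs = prod(xj - xi for xi, xj in combinations(xs, 2))
print(lhs, rhs)  # 6 6, i.e. (2-1)(4-1)(4-2)
```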
The adjugate matrix
The adjugate (or classical adjoint) of $A$ is the transpose of the cofactor matrix:
$$\operatorname{adj}(A) = C^{\mathsf{T}}, \qquad (\operatorname{adj} A)_{ij} = C_{ji}.$$
For any $n \times n$ matrix $A$:
$$A \operatorname{adj}(A) = \operatorname{adj}(A)\, A = (\det A) I.$$
If $A$ is invertible:
$$A^{-1} = \frac{1}{\det A} \operatorname{adj}(A).$$
For $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$: $C_{11} = d$, $C_{12} = -c$, $C_{21} = -b$, $C_{22} = a$, so $\operatorname{adj}(A) = \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$ and $A^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$.
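A direct check of the $2 \times 2$ formula with exact arithmetic (the entry values are an arbitrary illustration):

```python
from fractions import Fraction

a, b, c, d = 2, 5, 1, 3
detA = a * d - b * c                 # ad - bc = 1
adjA = [[d, -b], [-c, a]]
inv = [[Fraction(x, detA) for x in row] for row in adjA]

# A times its inverse should be the identity matrix.
A = [[a, b], [c, d]]
prod_ = [[sum(A[i][k] * inv[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
print(prod_)
```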
For $A = \begin{pmatrix} 1 & 2 & 0 \\ 0 & 1 & 1 \\ 2 & 0 & 1 \end{pmatrix}$:
$$\det A = 1(1 - 0) - 2(0 - 2) + 0 = 5.$$
Computing all nine cofactors and transposing gives
$$\operatorname{adj}(A) = \begin{pmatrix} 1 & -2 & 2 \\ 2 & 1 & -1 \\ -2 & 4 & 1 \end{pmatrix}, \qquad A^{-1} = \frac{1}{5}\operatorname{adj}(A) \quad (\text{since } \det A = 5).
$$
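The whole adjugate computation can be written generically. This sketch uses the illustrative matrix $A = \begin{pmatrix} 1 & 2 & 0 \\ 0 & 1 & 1 \\ 2 & 0 & 1 \end{pmatrix}$ (an assumed example, not from the text) and verifies $A \operatorname{adj}(A) = (\det A) I$:

```python
def minor(A, i, j):
    return [r[:j] + r[j + 1:] for k, r in enumerate(A) if k != i]

def det(A):
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(len(A)))

def adjugate(A):
    """Transpose of the cofactor matrix: adj(A)[i][j] = C_ji."""
    n = len(A)
    return [[(-1) ** (i + j) * det(minor(A, j, i)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 0], [0, 1, 1], [2, 0, 1]]
adjA = adjugate(A)
d = det(A)  # 5
P = [[sum(A[i][k] * adjA[k][j] for k in range(3)) for j in range(3)]
     for i in range(3)]
print(adjA)                                        # [[1, -2, 2], [2, 1, -1], [-2, 4, 1]]
print(P == [[d, 0, 0], [0, d, 0], [0, 0, d]])      # True
```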
What happens if we expand using the entries of row $i$ with the cofactors of row $k$ ($k \neq i$)? The result is always zero:
$$\sum_{j=1}^{n} a_{ij} C_{kj} = 0.$$
This is because the sum equals the determinant of a matrix with two identical rows (row $k$ replaced by a copy of row $i$), which is zero. This fact is used to prove $A \operatorname{adj}(A) = (\det A) I$.
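The "alien cofactor" identity is easy to verify numerically; a sketch with an arbitrary $3 \times 3$ matrix:

```python
def minor(A, i, j):
    return [r[:j] + r[j + 1:] for k, r in enumerate(A) if k != i]

def det(A):
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(len(A)))

A = [[3, 1, 4], [1, 5, 9], [2, 6, 5]]
n = 3
C = [[(-1) ** (i + j) * det(minor(A, i, j)) for j in range(n)] for i in range(n)]

# Entries of row 1 with cofactors of row 2 (0-indexed i=0, k=1): sum is 0.
mixed = sum(A[0][j] * C[1][j] for j in range(n))
# Matching rows recover det A.
matched = sum(A[0][j] * C[0][j] for j in range(n))
print(mixed, matched == det(A))  # 0 True
```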
If $\operatorname{rank}(A) = n - 1$, then $\operatorname{adj}(A)$ has rank 1. If $\operatorname{rank}(A) \le n - 2$, then $\operatorname{adj}(A) = 0$. The adjugate captures the "next level" of singularity.
For a $3 \times 3$ matrix, each cofactor is a $2 \times 2$ determinant. Computing all nine cofactors involves nine $2 \times 2$ determinants. This is more work than row reduction for large matrices, but the adjugate formula is theoretically important.
For a $3 \times 3$ skew-symmetric matrix $A = \begin{pmatrix} 0 & a & b \\ -a & 0 & c \\ -b & -c & 0 \end{pmatrix}$, expanding along row 1:
$$\det A = 0 \cdot C_{11} + a \left(-\begin{vmatrix} -a & c \\ -b & 0 \end{vmatrix}\right) + b \begin{vmatrix} -a & 0 \\ -b & -c \end{vmatrix} = -abc + abc = 0.$$
More generally, $\det A = \det A^{\mathsf{T}} = \det(-A) = (-1)^n \det A$, so every odd-dimensional skew-symmetric matrix has determinant $0$.
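Both skew-symmetric claims check out numerically; a sketch with arbitrary entries above the diagonal:

```python
def det(A):
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([r[:j] + r[j + 1:] for r in A[1:]])
               for j in range(len(A)))

a, b, c = 2, 3, 5
A3 = [[0, a, b], [-a, 0, c], [-b, -c, 0]]

S5 = [[0, 1, 2, 3, 4],
      [-1, 0, 5, 6, 7],
      [-2, -5, 0, 8, 9],
      [-3, -6, -8, 0, 1],
      [-4, -7, -9, -1, 0]]

print(det(A3), det(S5))  # 0 0
```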
Cofactor expansion has complexity $O(n!)$ -- impractical beyond small matrices. For numerical computation, Gaussian elimination ($O(n^3)$) is vastly superior. Cofactor expansion remains valuable for theoretical proofs, symbolic determinants, and small matrices.
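For contrast, one $O(n^3)$ alternative that stays in exact integer arithmetic is Bareiss fraction-free elimination; a sketch (not from the text, named after the standard algorithm):

```python
def det_bareiss(M):
    """Determinant via Bareiss fraction-free elimination.

    O(n^3) operations; every intermediate division is exact,
    so integer matrices stay in integer arithmetic throughout.
    """
    A = [row[:] for row in M]
    n = len(A)
    sign, prev = 1, 1
    for k in range(n - 1):
        if A[k][k] == 0:                # find a nonzero pivot below, if any
            for r in range(k + 1, n):
                if A[r][k] != 0:
                    A[k], A[r] = A[r], A[k]
                    sign = -sign        # a row swap flips the sign
                    break
            else:
                return 0                # the whole column is zero: singular
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                # Division by the previous pivot is always exact.
                A[i][j] = (A[i][j] * A[k][k] - A[i][k] * A[k][j]) // prev
            A[i][k] = 0
        prev = A[k][k]
    return sign * A[-1][-1]

print(det_bareiss([[1, 2], [3, 4]]))                   # -2
print(det_bareiss([[2, 0, 1], [1, 3, 2], [0, 1, 4]]))  # 21
```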
The determinant connects to invertibility in the Determinant and Invertibility Theorem, and the adjugate formula leads to Cramer's Rule for explicitly solving linear systems.