Proof: Multiplicativity of Determinants
We prove the fundamental multiplicativity property of determinants: for any square matrices $A$ and $B$ of the same size, $\det(AB) = \det(A)\det(B)$.
Theorem: For $n \times n$ matrices $A$ and $B$, $\det(AB) = \det(A)\det(B)$.
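Before the proof, the identity can be sanity-checked numerically. The sketch below uses ad-hoc helpers `det2` and `mul2` (our names, not part of the proof) on $2 \times 2$ integer matrices:

```python
# Numerical sanity check of det(AB) = det(A) det(B); not a proof, just a spot check.

def det2(M):
    """2x2 determinant: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def mul2(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [5, 3]]   # det(A) = 6 - 5 = 1
B = [[1, 4], [2, 0]]   # det(B) = 0 - 8 = -8
assert det2(mul2(A, B)) == det2(A) * det2(B)  # both sides equal -8
```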
Proof: We prove this using properties of elementary matrices and the relationship between row operations and determinants.
Step 1: Prove the result when the first factor is an elementary matrix; that is, $\det(EB) = \det(E)\det(B)$ for every elementary matrix $E$.
There are three types of elementary matrices:
Type I (row swap): If $E$ swaps two rows, then $\det(E) = -1$, and multiplying by $E$ on the left swaps two rows of $B$, so: $\det(EB) = -\det(B) = \det(E)\det(B)$.
Type II (row scaling): If $E$ multiplies a row by a nonzero scalar $c$, then $\det(E) = c$, and multiplying by $E$ scales a row of $B$ by $c$, so: $\det(EB) = c\,\det(B) = \det(E)\det(B)$.
Type III (row addition): If $E$ adds a multiple of one row to another, then $\det(E) = 1$, and this operation doesn't change the determinant of $B$: $\det(EB) = \det(B) = \det(E)\det(B)$.
Thus $\det(EB) = \det(E)\det(B)$ whenever $E$ is an elementary matrix.
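Each of the three types can be checked concretely. A minimal $2 \times 2$ sketch (the matrix names and helpers are ours, chosen for illustration):

```python
def det2(M):
    """2x2 determinant: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def mul2(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B = [[2, 7], [1, 4]]            # det(B) = 8 - 7 = 1

E_swap  = [[0, 1], [1, 0]]      # Type I:   swap the two rows,        det = -1
E_scale = [[1, 0], [0, 5]]      # Type II:  scale row 2 by 5,         det = 5
E_add   = [[1, 0], [3, 1]]      # Type III: add 3 * row 1 to row 2,   det = 1

for E in (E_swap, E_scale, E_add):
    assert det2(mul2(E, B)) == det2(E) * det2(B)
```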
Step 2: Extend to invertible matrices using factorization.
If $A$ is invertible, it can be written as a product of elementary matrices: $A = E_1 E_2 \cdots E_k$.
Then by repeated application of Step 1: $\det(AB) = \det(E_1 E_2 \cdots E_k B) = \det(E_1)\det(E_2)\cdots\det(E_k)\det(B) = \det(A)\det(B)$, where the final equality uses $\det(A) = \det(E_1)\cdots\det(E_k)$, itself obtained by applying Step 1 repeatedly to the product $E_1(E_2 \cdots E_k)$.
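Step 2 in miniature: assemble an invertible matrix from elementary factors and check both equalities. (The particular factors, and the `det2`/`mul2` helpers, are illustrative choices, not from the proof.)

```python
def det2(M):
    """2x2 determinant: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def mul2(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

E1 = [[1, 0], [3, 1]]   # Type III, det = 1
E2 = [[1, 0], [0, 5]]   # Type II,  det = 5
E3 = [[0, 1], [1, 0]]   # Type I,   det = -1

A = mul2(mul2(E1, E2), E3)      # an invertible A = E1 E2 E3
B = [[2, 7], [1, 4]]            # det(B) = 1

assert det2(A) == det2(E1) * det2(E2) * det2(E3)   # det(A) = 1 * 5 * (-1) = -5
assert det2(mul2(A, B)) == det2(A) * det2(B)       # -5 == (-5) * 1
```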
Step 3: Extend to all matrices.
If $A$ is not invertible, then $\det(A) = 0$. We claim that $AB$ is also not invertible (so $\det(AB) = 0 = \det(A)\det(B)$).
Suppose, for contradiction, that $AB$ were invertible. Then there exists a matrix $C$ such that: $(AB)C = I$.
Rewriting this as $A(BC) = I$ shows that $A$ has a right inverse, but justifying that a one-sided inverse of a square matrix implies invertibility requires its own argument. Instead, we use rank arguments:
If $A$ is not invertible, then $\operatorname{rank}(A) < n$. Since $\operatorname{rank}(A^{\mathsf T}) = \operatorname{rank}(A) < n$, there exists a nonzero vector $y$ with $y^{\mathsf T} A = 0$.
Then $y^{\mathsf T}(AB) = (y^{\mathsf T} A)B = 0$ as well, so $AB$ has a nontrivial left kernel and cannot be invertible.
Therefore $\det(AB) = 0 = \det(A)\det(B)$.
More rigorously: $\operatorname{rank}(AB) \le \operatorname{rank}(A) < n$, so the columns of $AB$ are linearly dependent, giving $\det(AB) = 0$.
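The singular case is easy to witness numerically: if $A$ is not invertible, $\det(AB)$ vanishes for every $B$. A small check (matrices chosen for illustration):

```python
def det2(M):
    """2x2 determinant: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def mul2(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [2, 4]]    # rank 1: second row is twice the first, so det(A) = 0
B = [[3, 1], [0, 2]]    # det(B) = 6

assert det2(A) == 0
assert det2(mul2(A, B)) == 0 == det2(A) * det2(B)
```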
Alternative approach for Step 3: View the determinant as a function. For fixed $B$, define $f(A) = \det(AB)$. This function:
- Is multilinear in the rows of $A$ (row $i$ of $AB$ is row $i$ of $A$ multiplied by $B$)
- Is alternating (swapping two rows of $A$ swaps the corresponding rows of $AB$, which changes the sign)
- Satisfies $f(I) = \det(IB) = \det(B)$
By the uniqueness of alternating multilinear functions of the rows (any such function equals $\det(A)$ times its value at $I$), we get $f(A) = \det(A)\,f(I)$, i.e. $\det(AB) = \det(A)\det(B)$.
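The three listed properties of $f$ can be spot-checked numerically; a sketch with illustrative matrices (the helpers `det2` and `mul2` are our names):

```python
def det2(M):
    """2x2 determinant: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def mul2(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B = [[1, 4], [2, 0]]
f = lambda A: det2(mul2(A, B))   # f(A) = det(AB) for fixed B

A = [[2, 1], [5, 3]]
assert f([[7 * 2, 7 * 1], [5, 3]]) == 7 * f(A)   # linear in the first row
assert f([[5, 3], [2, 1]]) == -f(A)              # alternating: row swap flips sign
assert f([[1, 0], [0, 1]]) == det2(B)            # f(I) = det(B)
```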
Conclusion: In all cases, $\det(AB) = \det(A)\det(B)$. ∎
This proof demonstrates a powerful technique: prove a result for elementary matrices (where the calculation is direct), extend it to invertible matrices via elementary-matrix factorization, then handle the degenerate case separately. This strategy appears throughout linear algebra, leveraging the fact that elementary matrices form "building blocks" for all invertible matrices.
The multiplicativity property is the foundation for many applications: computing determinants of matrix products, understanding similarity transformations, and connecting determinants to eigenvalues through characteristic polynomials.