
Determinant and Invertibility

The determinant provides a clean, scalar-valued criterion for invertibility: a matrix is invertible if and only if its determinant is nonzero.


Statement

Theorem 4.3 (Determinant characterizes invertibility)

Let $A \in M_{n \times n}(F)$. Then

$$A \text{ is invertible} \iff \det(A) \neq 0.$$

Equivalently, $A$ is singular if and only if $\det(A) = 0$.

Proof

($\Rightarrow$) If $A$ is invertible, then $AA^{-1} = I$, so $\det(A)\det(A^{-1}) = \det(I) = 1$. Hence $\det(A) \neq 0$.

($\Leftarrow$) If $\det(A) \neq 0$, then $B = \frac{1}{\det(A)}\operatorname{adj}(A)$ satisfies $AB = BA = I$, by the adjugate identity $A\operatorname{adj}(A) = \operatorname{adj}(A)A = \det(A)I$. So $A$ is invertible with $A^{-1} = B$.

Alternatively: row-reduce $A$ to an upper triangular matrix $U$. Each elementary row operation multiplies the determinant by a nonzero scalar (a row swap contributes $-1$, a scaling contributes the nonzero scalar, a row replacement contributes $1$). So $\det(A) \neq 0$ iff $\det(U) \neq 0$ iff all diagonal entries of $U$ are nonzero iff $U$ has $n$ pivots iff $A$ is row-equivalent to $I_n$ iff $A$ is invertible.

$\blacksquare$
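The row-reduction argument above doubles as a practical test. Here is a minimal sketch (illustrative, not a library routine) that computes the determinant as a signed product of pivots, using exact `fractions` arithmetic to avoid floating-point roundoff:

```python
from fractions import Fraction

def det_by_elimination(A):
    """Determinant via Gaussian elimination: a signed product of pivots."""
    M = [[Fraction(x) for x in row] for row in A]
    n = len(M)
    d = Fraction(1)
    for k in range(n):
        # find a row with a nonzero entry in column k
        pivot = next((i for i in range(k, n) if M[i][k] != 0), None)
        if pivot is None:
            return Fraction(0)               # no pivot in this column: singular
        if pivot != k:
            M[k], M[pivot] = M[pivot], M[k]
            d = -d                           # a row swap negates the determinant
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n):
                M[i][j] -= factor * M[k][j]  # row replacement: det unchanged
        d *= M[k][k]                         # accumulate the pivot
    return d

print(det_by_elimination([[2, 1], [1, 2]]))  # 3: invertible
print(det_by_elimination([[1, 2], [2, 4]]))  # 0: singular
```

Returning the determinant itself, rather than a boolean, keeps the theorem's criterion "invertible iff nonzero" visible at the call site.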

Examples

Example: 2 × 2 invertibility test

$A = \begin{pmatrix} 3 & 5 \\ 1 & 2 \end{pmatrix}$: $\det A = 6 - 5 = 1 \neq 0$. Invertible, with $A^{-1} = \begin{pmatrix} 2 & -5 \\ -1 & 3 \end{pmatrix}$.

$B = \begin{pmatrix} 2 & 6 \\ 1 & 3 \end{pmatrix}$: $\det B = 6 - 6 = 0$. Singular.
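Both computations can be verified mechanically. A quick Python check (matrices as plain nested lists, using the 2 × 2 formula $ad - bc$ and the 2 × 2 adjugate inverse):

```python
A = [[3, 5], [1, 2]]
detA = A[0][0]*A[1][1] - A[0][1]*A[1][0]     # ad - bc = 6 - 5
print(detA)                                  # 1: nonzero, so invertible

# 2x2 adjugate formula: inverse = (1/det) * [[d, -b], [-c, a]]
inv = [[ A[1][1]/detA, -A[0][1]/detA],
       [-A[1][0]/detA,  A[0][0]/detA]]
print(inv)                                   # [[2.0, -5.0], [-1.0, 3.0]]

B = [[2, 6], [1, 3]]
print(B[0][0]*B[1][1] - B[0][1]*B[1][0])     # 0: singular
```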

Example: 3 × 3 invertibility test

$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix}$: $\det A = 0$ (the third row is twice the second row minus the first, so the rows are dependent; or compute directly: $1(45-48) - 2(36-42) + 3(32-35) = -3 + 12 - 9 = 0$). Singular.

$B = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{pmatrix}$: $\det B = 1(50-48) - 2(40-42) + 3(32-35) = 2 + 4 - 9 = -3 \neq 0$. Invertible.
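The cofactor expansions above can be scripted. An illustrative helper for the 3 × 3 case, expanding along the first row:

```python
def det3(M):
    # cofactor expansion along the first row:
    # a(ei - fh) - b(di - fg) + c(dh - eg)
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

print(det3([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))    # 0: singular
print(det3([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))   # -3: invertible
```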

Example: Determinant of a product

$\det(AB) = \det(A)\det(B)$. So:

  • $AB$ is invertible iff both $A$ and $B$ are invertible.
  • $\det(A^n) = (\det A)^n$.
  • $\det(A^{-1}) = 1/\det(A)$.
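These product rules are easy to spot-check numerically. A small sketch with hand-picked 2 × 2 matrices (the helper names `det2` and `matmul2` are ad hoc):

```python
def det2(M):
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def matmul2(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[3, 5], [1, 2]]   # det = 1
B = [[2, 1], [1, 1]]   # det = 1

# multiplicativity: det(AB) = det(A) det(B)
print(det2(matmul2(A, B)), det2(A) * det2(B))      # 1 1

# det(A^2) = (det A)^2
print(det2(matmul2(A, A)), det2(A) ** 2)           # 1 1
```
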

Example: A and A^T have the same determinant

$\det(A^T) = \det(A)$, so $A$ is invertible iff $A^T$ is invertible. This means the rows of $A$ are linearly independent iff the columns are linearly independent (which also follows from row rank $=$ column rank).

Example: Block-diagonal determinant

$$\det\begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix} = \det(A) \cdot \det(B).$$

So this block-diagonal matrix is invertible iff both $A$ and $B$ are invertible.
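A numerical spot-check, using an illustrative recursive determinant over plain nested lists (Laplace expansion; fine for small matrices, exponential in general):

```python
def det(M):
    # Laplace expansion along the first row
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j+1:] for r in M[1:]])
               for j in range(len(M)))

A = [[3, 5], [1, 2]]          # det = 1
B = [[2, 1], [1, 1]]          # det = 1
blockdiag = [[3, 5, 0, 0],    # A in the top-left block,
             [1, 2, 0, 0],    # B in the bottom-right block
             [0, 0, 2, 1],
             [0, 0, 1, 1]]
print(det(blockdiag), det(A) * det(B))   # 1 1
```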

Example: Parametric invertibility

For which values of $t$ is $A = \begin{pmatrix} 1 & t \\ t & 1 \end{pmatrix}$ invertible?

$\det(A) = 1 - t^2 = (1-t)(1+t)$. So $A$ is invertible for all $t \neq \pm 1$.
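Sampling a few values of $t$ confirms the factorization (a trivial sketch; `det_t` is an ad hoc name):

```python
def det_t(t):
    # determinant of [[1, t], [t, 1]]: 1 - t^2
    return 1 - t * t

print(det_t(1), det_t(-1))   # 0 0: singular exactly at t = +1 and t = -1
print(det_t(2))              # -3: invertible
print(det_t(0.5))            # 0.75: invertible
```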

Example: Columns dependent iff det = 0

$\det(A) = 0$ means the columns of $A$ are linearly dependent. Geometrically, the column vectors lie in a lower-dimensional subspace, so the parallelepiped they span has zero volume.

Example: Connection to eigenvalues

$\det(A) = \prod_{i=1}^n \lambda_i$, where the $\lambda_i$ are the eigenvalues of $A$ (counted with algebraic multiplicity). So $\det(A) = 0$ iff some eigenvalue is $0$ iff $A$ has a nontrivial kernel.
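One instance that is easy to verify by hand: for a triangular matrix the eigenvalues are the diagonal entries, so the product formula can be checked directly (an illustrative sketch; the matrix `U` is hand-picked):

```python
# Upper triangular: eigenvalues are the diagonal entries 2, 3, 4
U = [[2, 7, 1],
     [0, 3, 5],
     [0, 0, 4]]

def det3(M):
    # cofactor expansion along the first row
    a, b, c = M[0]; d, e, f = M[1]; g, h, i = M[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

print(det3(U))   # 24 = 2 * 3 * 4, the product of the eigenvalues
```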

Example: Determinant over finite fields

Over $\mathbb{F}_2 = \{0, 1\}$: $\det\begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix} = 0 - 1 = -1 = 1 \neq 0$ in $\mathbb{F}_2$. The matrix is invertible over $\mathbb{F}_2$. Its inverse is $\begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}$.
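The same computation in Python, doing every arithmetic step mod 2 (an illustrative sketch):

```python
A = [[1, 1], [1, 0]]
detA = (A[0][0]*A[1][1] - A[0][1]*A[1][0]) % 2   # -1 is congruent to 1 mod 2
print(detA)                                      # 1: invertible over F_2

# verify the claimed inverse: A * Ainv should be the identity mod 2
Ainv = [[0, 1], [1, 1]]
prod = [[sum(A[i][k] * Ainv[k][j] for k in range(2)) % 2 for j in range(2)]
        for i in range(2)]
print(prod)                                      # [[1, 0], [0, 1]]
```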

Example: Orthogonal matrices

If $A$ is orthogonal ($A^T A = I$), then $\det(A)^2 = \det(A^T)\det(A) = \det(I) = 1$, so $\det(A) = \pm 1$. The special orthogonal group $\mathrm{SO}(n)$ consists of orthogonal matrices with $\det = +1$ (rotations).
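A quick numerical illustration with a 2 × 2 rotation (a sketch; `rotation` is an ad hoc helper, and the result is exact only up to floating-point roundoff):

```python
import math

def rotation(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

R = rotation(math.pi / 3)
detR = R[0][0]*R[1][1] - R[0][1]*R[1][0]   # cos^2 + sin^2
print(detR)   # approximately 1.0: R lies in SO(2)
```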

Example: Singular matrices form a hypersurface

The set $\{A \in M_{n \times n}(\mathbb{R}) \mid \det(A) = 0\}$ is cut out by a single polynomial equation in the $n^2$ entries. It is a hypersurface of dimension $n^2 - 1$ in $M_{n \times n}(\mathbb{R}) \cong \mathbb{R}^{n^2}$. The invertible matrices form the complement, a dense open set.

Example: Similar matrices have equal determinants

If $B = P^{-1}AP$, then $\det(B) = \det(P)^{-1}\det(A)\det(P) = \det(A)$. The determinant is a similarity invariant, hence an invariant of the underlying linear operator $T$ (independent of the choice of basis).
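A concrete check with exact integer arithmetic (the matrices here are hand-picked so that $\det P = 1$ and $P^{-1}$ has integer entries):

```python
def det2(M):
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def mul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A    = [[2, 1], [0, 3]]
P    = [[1, 1], [1, 2]]      # det P = 1
Pinv = [[2, -1], [-1, 1]]    # exact inverse, since det P = 1

B = mul(mul(Pinv, A), P)     # B = P^{-1} A P
print(det2(B), det2(A))      # 6 6: similar matrices, equal determinants
```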


The determinant as a group homomorphism

Remark: Algebraic structure

The determinant map $\det : \mathrm{GL}_n(F) \to F^*$ is a group homomorphism from the general linear group to the multiplicative group of $F$:

$$\det(AB) = \det(A) \cdot \det(B).$$

Its kernel is the special linear group: $\mathrm{SL}_n(F) = \{A \in \mathrm{GL}_n(F) \mid \det(A) = 1\}$.

Remark: Looking ahead

The determinant appears throughout linear algebra:

  • Cramer's Rule for solving systems
  • The characteristic polynomial $\det(A - \lambda I)$ for eigenvalues
  • The Jacobian determinant in multivariable calculus (change of variables)
  • Exterior algebra and differential forms