
Properties of Row Operations

Elementary row operations are the fundamental tools for manipulating linear systems. Understanding their properties ensures that our solution methods preserve correctness while transforming matrices into simpler forms.

Theorem (Row Equivalence Preserves Solutions)

If matrix $B$ is obtained from matrix $A$ by a sequence of elementary row operations, then the linear systems $A\mathbf{x} = \mathbf{0}$ and $B\mathbf{x} = \mathbf{0}$ have exactly the same solution set. We say $A$ and $B$ are row equivalent, denoted $A \sim B$.

More generally, for augmented matrices, $[A \mid \mathbf{b}] \sim [B \mid \mathbf{c}]$ implies that the systems $A\mathbf{x} = \mathbf{b}$ and $B\mathbf{x} = \mathbf{c}$ have the same solution set.

This theorem is the theoretical foundation for Gaussian elimination. Since each elementary row operation is reversible, row equivalence is symmetric; it is also clearly reflexive and transitive, so it defines an equivalence relation on the set of $m \times n$ matrices. Matrices in the same equivalence class share the same null space.
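The theorem can be checked concretely. Below is a minimal sketch in plain Python (the $2 \times 2$ system and the `solve_2x2` helper are illustrative choices, not part of the theorem): a row operation is applied to an augmented matrix, and the solution is the same before and after.

```python
# System: 2x + y = 5, 4x + 3y = 11, with unique solution x = 2, y = 1.
aug = [[2.0, 1.0, 5.0],
       [4.0, 3.0, 11.0]]

def solve_2x2(m):
    """Solve a 2x2 augmented system [a b | p; c d | q] by Cramer's rule."""
    a, b, p = m[0]
    c, d, q = m[1]
    det = a * d - b * c
    return ((p * d - b * q) / det, (a * q - p * c) / det)

before = solve_2x2(aug)

# Elementary row operation R2 -> R2 - 2*R1 on the augmented matrix
aug[1] = [x - 2.0 * y for x, y in zip(aug[1], aug[0])]
after = solve_2x2(aug)

print(before == after)  # True: both systems have solution (2.0, 1.0)
```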

Theorem (Invertibility of Row Operations)

Each elementary row operation has an inverse operation of the same type:

  1. Row switch $R_i \leftrightarrow R_j$ is self-inverse
  2. Row multiplication $R_i \to cR_i$ (with $c \neq 0$) has inverse $R_i \to \frac{1}{c}R_i$
  3. Row addition $R_i \to R_i + cR_j$ has inverse $R_i \to R_i - cR_j$

Therefore, if BB is obtained from AA by row operations, then AA can be recovered from BB by the inverse operations applied in reverse order.
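A small sketch of this recovery process, assuming list-of-lists matrices and the three helper functions defined here (the particular matrix and operation sequence are arbitrary examples):

```python
def row_switch(m, i, j):
    m[i], m[j] = m[j], m[i]          # R_i <-> R_j

def row_scale(m, i, c):
    m[i] = [c * x for x in m[i]]     # R_i -> c * R_i  (c != 0)

def row_add(m, i, j, c):
    m[i] = [x + c * y for x, y in zip(m[i], m[j])]  # R_i -> R_i + c * R_j

A = [[1.0, 2.0], [3.0, 4.0]]
B = [row[:] for row in A]  # working copy

# Forward sequence of elementary row operations
row_add(B, 1, 0, -3.0)   # R2 -> R2 - 3*R1
row_scale(B, 0, 2.0)     # R1 -> 2*R1
row_switch(B, 0, 1)      # R1 <-> R2

# Inverse operations, applied in reverse order, recover A
row_switch(B, 0, 1)      # undo the switch (self-inverse)
row_scale(B, 0, 0.5)     # undo the scaling by 2
row_add(B, 1, 0, 3.0)    # undo the row addition

print(B == A)  # True
```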

Example (Row Operation Matrices)

Each elementary row operation can be represented by multiplying on the left by an elementary matrix. For example, the operation $R_2 \to R_2 - 3R_1$ on a $3 \times 3$ matrix corresponds to left multiplication by:

$$E = \begin{bmatrix} 1 & 0 & 0 \\ -3 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

Then $EA$ equals the result of performing the row operation on $A$. The inverse operation $R_2 \to R_2 + 3R_1$ corresponds to $E^{-1} = \begin{bmatrix} 1 & 0 & 0 \\ 3 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$.
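The claim that $EA$ performs the row operation, and that $E^{-1}$ undoes it, can be verified directly. A self-contained sketch in plain Python (the test matrix `A` is an arbitrary example):

```python
def matmul(X, Y):
    """Naive matrix product of two lists-of-lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

E     = [[1, 0, 0], [-3, 1, 0], [0, 0, 1]]   # R2 -> R2 - 3*R1
E_inv = [[1, 0, 0], [ 3, 1, 0], [0, 0, 1]]   # R2 -> R2 + 3*R1

A = [[1, 2, 0], [3, 1, 4], [0, 5, 6]]

EA = matmul(E, A)
print(EA[1])                   # [0, -5, 4]: row 2 of A minus 3 times row 1
print(matmul(E_inv, EA) == A)  # True: the inverse operation recovers A
```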

Remark

The representation of row operations as elementary matrices establishes a profound connection: solving $A\mathbf{x} = \mathbf{b}$ by row operations is equivalent to finding elementary matrices $E_1, E_2, \ldots, E_k$ such that $E_k \cdots E_2 E_1 A = \text{RREF}(A)$. This perspective leads directly to the concept of matrix factorization and the LU decomposition.
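The factorization viewpoint can be illustrated with a tiny example. The sketch below hand-picks three elementary matrices that reduce a particular invertible $2 \times 2$ matrix (an arbitrary choice for illustration) to its RREF, the identity, and verifies the product $E_3 E_2 E_1 A$:

```python
def matmul(X, Y):
    """Naive matrix product of two lists-of-lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2.0, 4.0], [1.0, 3.0]]

# Elementary matrices for the reduction steps:
E1 = [[0.5, 0.0], [0.0, 1.0]]    # R1 -> (1/2) R1     gives [[1,2],[1,3]]
E2 = [[1.0, 0.0], [-1.0, 1.0]]   # R2 -> R2 - R1      gives [[1,2],[0,1]]
E3 = [[1.0, -2.0], [0.0, 1.0]]   # R1 -> R1 - 2*R2    gives [[1,0],[0,1]]

rref = matmul(E3, matmul(E2, matmul(E1, A)))
print(rref)  # [[1.0, 0.0], [0.0, 1.0]]: the RREF of this A is the identity
```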