Direct Methods for Linear Systems
Direct methods solve a linear system $Ax = b$ in a finite number of operations. The cornerstone is Gaussian elimination, which factors $A$ into triangular matrices. Understanding pivoting strategies and operation counts is essential for practical implementation.
Gaussian Elimination and LU Factorization
An LU decomposition of $A \in \mathbb{R}^{n \times n}$ is a factorization $A = LU$, where $L$ is unit lower triangular ($\ell_{ii} = 1$) and $U$ is upper triangular. The system $Ax = b$ reduces to solving $Ly = b$ (forward substitution, $O(n^2)$ flops) and then $Ux = y$ (back substitution, $O(n^2)$ flops). The factorization itself costs $\tfrac{2}{3}n^3 + O(n^2)$ flops.
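The factorization and the two triangular solves can be sketched in a few lines of NumPy. This is an illustrative implementation without pivoting, not a production routine: it assumes every pivot encountered is nonzero.

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU without pivoting: A = L U with L unit lower triangular.
    Sketch only -- assumes all pivots U[k, k] are nonzero."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]        # multiplier l_ik
            U[i, k:] -= L[i, k] * U[k, k:]     # eliminate row i below pivot
    return L, U

def solve_lu(L, U, b):
    """Solve A x = b given A = L U via the two triangular solves."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):                          # forward substitution: L y = b
        y[i] = b[i] - L[i, :i] @ y[:i]          # l_ii = 1, so no division
    x = np.zeros(n)
    for i in reversed(range(n)):                # back substitution: U x = y
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
b = np.array([4.0, 10.0, 24.0])
L, U = lu_no_pivot(A)
x = solve_lu(L, U, b)   # solves A x = b
```

The triangular solves are $O(n^2)$ each, so once $L$ and $U$ are in hand, additional right-hand sides are cheap.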
Partial pivoting selects the entry of largest magnitude in the current column below the diagonal as the pivot: $PA = LU$, where $P$ is a permutation matrix. Complete pivoting selects the largest entry in the entire remaining submatrix: $PAQ = LU$. Partial pivoting suffices in practice and guarantees $|\ell_{ij}| \le 1$, giving a growth factor bounded by $2^{n-1}$ (though the growth factor is almost always small in practice).
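The bound $|\ell_{ij}| \le 1$ can be observed with SciPy's `scipy.linalg.lu`, which returns the factorization in the form $A = PLU$ (the permutation sits on the factored side, equivalent to $P^T A = LU$). The tiny-pivot test matrix here is an illustrative choice, not from the text:

```python
import numpy as np
from scipy.linalg import lu

# Without pivoting the 1e-8 pivot would produce a multiplier of 1e8;
# partial pivoting swaps the rows so the larger entry 1.0 is the pivot.
A = np.array([[1e-8, 1.0],
              [1.0,  1.0]])
P, L, U = lu(A)   # SciPy convention: A = P @ L @ U
```

After the swap every multiplier in `L` satisfies $|\ell_{ij}| \le 1$, which is exactly the guarantee partial pivoting provides.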
Cholesky Factorization
For a symmetric positive definite (SPD) matrix $A$, the Cholesky factorization is $A = LL^T$, where $L$ is lower triangular with positive diagonal entries. The algorithm computes $\ell_{jj} = \sqrt{a_{jj} - \sum_{k=1}^{j-1} \ell_{jk}^2}$ and $\ell_{ij} = \bigl(a_{ij} - \sum_{k=1}^{j-1} \ell_{ik}\ell_{jk}\bigr)/\ell_{jj}$ for $i > j$. The cost is $\tfrac{1}{3}n^3$ flops, half that of LU. No pivoting is needed: the factorization is always numerically stable for SPD matrices.
For $A = \begin{pmatrix} 4 & 12 & -16 \\ 12 & 37 & -43 \\ -16 & -43 & 98 \end{pmatrix}$: $\ell_{11} = \sqrt{4} = 2$, $\ell_{21} = 12/2 = 6$, $\ell_{31} = -16/2 = -8$, $\ell_{22} = \sqrt{37 - 6^2} = 1$, $\ell_{32} = \bigl(-43 - (-8)(6)\bigr)/1 = 5$, $\ell_{33} = \sqrt{98 - (-8)^2 - 5^2} = 3$. Thus $L = \begin{pmatrix} 2 & 0 & 0 \\ 6 & 1 & 0 \\ -8 & 5 & 3 \end{pmatrix}$.
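The Cholesky recurrence can be sketched in NumPy. This is an illustrative implementation without error checks, applied to a standard 3×3 SPD test matrix, and its output can be compared against `np.linalg.cholesky`:

```python
import numpy as np

def cholesky(A):
    """Cholesky factorization A = L L^T for SPD A (sketch, no checks)."""
    n = A.shape[0]
    L = np.zeros_like(A, dtype=float)
    for j in range(n):
        # diagonal entry: l_jj = sqrt(a_jj - sum_k l_jk^2)
        L[j, j] = np.sqrt(A[j, j] - L[j, :j] @ L[j, :j])
        for i in range(j + 1, n):
            # below-diagonal entry: l_ij = (a_ij - sum_k l_ik l_jk) / l_jj
            L[i, j] = (A[i, j] - L[i, :j] @ L[j, :j]) / L[j, j]
    return L

A = np.array([[4.0, 12.0, -16.0],
              [12.0, 37.0, -43.0],
              [-16.0, -43.0, 98.0]])
L = cholesky(A)
```

Note that only the lower triangle of `A` is ever read, which is why practical implementations store SPD matrices in packed or triangular form.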
Banded and Sparse Systems
For banded matrices with bandwidth $p$, LU factorization costs $O(np^2)$ flops instead of $O(n^3)$. Sparse direct solvers use fill-reducing orderings (minimum degree, nested dissection) to minimize the fill-in (new nonzeros created during factorization). Nested dissection achieves $O(n \log n)$ fill-in for 2D problems and $O(n^{4/3})$ for 3D, compared to $O(n^2)$ storage for the dense case.
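The effect of a fill-reducing ordering can be seen with SciPy's sparse LU (`scipy.sparse.linalg.splu`, a SuperLU wrapper), comparing factor sizes under the natural ordering and under COLAMD on a 2D Laplacian; the grid size below is an arbitrary illustrative choice:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

# 2D Laplacian on an m x m grid (5-point stencil), a classic sparse SPD test.
m = 20
I = sp.identity(m)
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(m, m))
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()   # n = m^2 = 400 unknowns

lu_nat = splu(A, permc_spec="NATURAL")   # no fill-reducing column ordering
lu_amd = splu(A, permc_spec="COLAMD")    # fill-reducing ordering
fill_nat = lu_nat.L.nnz + lu_nat.U.nnz
fill_amd = lu_amd.L.nnz + lu_amd.U.nnz
# The reordered factorization typically has substantially fewer nonzeros.
```

The same `splu` object can then solve systems via `lu_amd.solve(b)`, amortizing the factorization over many right-hand sides.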
A tridiagonal system with sub-diagonal $a_i$, diagonal $b_i$, super-diagonal $c_i$, and right-hand side $d_i$ is solved in $O(n)$ operations by the Thomas algorithm (a specialized LU without pivoting). Forward sweep: $c'_1 = c_1/b_1$, $d'_1 = d_1/b_1$, then $c'_i = c_i/(b_i - a_i c'_{i-1})$ and $d'_i = (d_i - a_i d'_{i-1})/(b_i - a_i c'_{i-1})$ for $i = 2, \ldots, n$. Back substitution: $x_n = d'_n$, then $x_i = d'_i - c'_i x_{i+1}$. This arises in cubic spline interpolation and finite difference methods.
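A minimal NumPy sketch of the Thomas algorithm follows, assuming the system needs no pivoting (e.g. it is diagonally dominant, as the finite-difference test system below is):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: sub-diagonal a (a[0] unused), diagonal b,
    super-diagonal c (c[-1] unused), right-hand side d. O(n); no pivoting."""
    n = len(b)
    cp = np.zeros(n)   # modified super-diagonal c'_i
    dp = np.zeros(n)   # modified right-hand side d'_i
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward sweep
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Usage: the 1D second-difference system (-1, 2, -1), as in finite differences.
n = 5
a = np.full(n, -1.0)
b = np.full(n, 2.0)
c = np.full(n, -1.0)
d = np.ones(n)
x = thomas(a, b, c, d)
```

Storing only the three diagonals keeps both memory and work at $O(n)$, which is what makes this the method of choice for spline and finite-difference systems.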