Wilkinson's Backward Error Analysis Theorem
James Wilkinson's backward error analysis provides a powerful framework for understanding numerical stability by showing that computed solutions are exact solutions to slightly perturbed problems.
Let $A \in \mathbb{R}^{n \times n}$ be nonsingular and let $\hat{x}$ be the computed solution to $Ax = b$ using Gaussian elimination with partial pivoting in floating-point arithmetic with unit roundoff $u$. Then there exists a matrix $\Delta A$ such that

$$(A + \Delta A)\,\hat{x} = b, \qquad \|\Delta A\|_\infty \le p(n)\,\rho_n\,u\,\|A\|_\infty,$$

where $p(n)$ is a modest polynomial in $n$, and $\rho_n$ is the growth factor from pivoting (typically $O(1)$ in practice, though worst-case $2^{n-1}$).
This theorem establishes that Gaussian elimination with partial pivoting is backward stable: the computed solution is the exact solution to a problem whose coefficient matrix differs from the original by a relative amount proportional to machine precision.
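The theorem can be checked empirically. A minimal NumPy sketch: solve a random system with `np.linalg.solve` (which calls LAPACK's partial-pivoted LU), then measure the smallest normwise relative backward error via the residual formula $\eta = \|b - A\hat{x}\| / (\|A\|\,\|\hat{x}\|)$ (the Rigal–Gaches characterization); backward stability predicts $\eta$ on the order of machine precision.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# LAPACK Gaussian elimination with partial pivoting under the hood
x_hat = np.linalg.solve(A, b)

# Smallest normwise relative backward error (Rigal-Gaches):
# eta = ||b - A x_hat|| / (||A||_2 * ||x_hat||_2)
r = b - A @ x_hat
eta = np.linalg.norm(r) / (np.linalg.norm(A, 2) * np.linalg.norm(x_hat))
print(f"backward error eta = {eta:.2e}")  # a small multiple of u ~ 1.1e-16
```

For a well-scaled random matrix of this size, `eta` lands within a few orders of magnitude of $u \approx 1.1 \times 10^{-16}$, as the theorem predicts.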
Interpreting Backward Stability: Backward stability does not guarantee a small forward error $\|\hat{x} - x\| / \|x\|$. The forward error is bounded (to first order) by

$$\frac{\|\hat{x} - x\|}{\|x\|} \lesssim \kappa(A)\,\frac{\|\Delta A\|}{\|A\|},$$

where $\kappa(A) = \|A\|\,\|A^{-1}\|$ is the condition number. An ill-conditioned matrix ($\kappa(A) \gg 1$) amplifies a small backward error into a large forward error.
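The gap between backward and forward error can be made concrete with the Hilbert matrix, a classic ill-conditioned example. The sketch below (assuming a $12 \times 12$ Hilbert matrix and a known true solution of all ones) shows a backward error near machine precision alongside a forward error many orders of magnitude larger, with the ratio governed by $\kappa$.

```python
import numpy as np

n = 12
# Hilbert matrix H[i, j] = 1 / (i + j + 1): notoriously ill-conditioned
H = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1)
x_true = np.ones(n)
b = H @ x_true

x_hat = np.linalg.solve(H, b)

forward = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
r = b - H @ x_hat
backward = np.linalg.norm(r) / (np.linalg.norm(H, 2) * np.linalg.norm(x_hat))
kappa = np.linalg.cond(H)

print(f"kappa(H)       = {kappa:.2e}")    # ~1e16
print(f"backward error = {backward:.2e}") # near machine precision
print(f"forward error  = {forward:.2e}")  # large: amplified by kappa(H)
```

The algorithm has done its job (tiny backward error); the poor forward error is entirely attributable to the problem's conditioning.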
The proof technique extends beyond linear systems to general numerical algorithms. The key insight is to work backward from the computed result: given a computed output $\hat{y}$, find an input perturbation $\Delta x$ such that $f(x + \Delta x) = \hat{y}$ holds exactly, then bound $\|\Delta x\|$.
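The simplest instance of this technique is recursive summation: the computed sum $\hat{s}$ of $x_1, \dots, x_n$ is the exact sum of perturbed inputs $x_i(1 + \theta_i)$ with $|\theta_i| \le \gamma_{n-1} = (n-1)u / (1 - (n-1)u)$, which implies the forward bound $|\hat{s} - s| \le \gamma_{n-1} \sum_i |x_i|$. A sketch verifying that bound against a correctly rounded reference sum:

```python
import math
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(10_000)

s = 0.0
for xi in x:        # recursive summation in float64
    s += xi

# Backward view: s is the EXACT sum of x_i(1 + theta_i), where
# |theta_i| <= gamma_{n-1} = (n-1)u / (1 - (n-1)u)
n = len(x)
u = np.finfo(np.float64).eps / 2          # unit roundoff
gamma = (n - 1) * u / (1 - (n - 1) * u)

exact = math.fsum(x)                      # correctly rounded reference sum
bound = gamma * np.sum(np.abs(x))
print(f"|s - exact| = {abs(s - exact):.2e}  <=  bound = {bound:.2e}")
```

The observed error sits comfortably inside the backward-error-derived bound; the bound itself grows only linearly in $n$.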
For QR factorization of $A \in \mathbb{R}^{m \times n}$, Householder reflections produce computed matrices $\hat{Q}$, $\hat{R}$ satisfying:

$$A + \Delta A = \hat{Q}\hat{R}, \qquad \|\Delta A\|_F \le c(m, n)\,u\,\|A\|_F,$$

where $c(m, n)$ is a modest constant and $\hat{Q}$ is orthogonal to machine precision: $\|\hat{Q}^T\hat{Q} - I\|_2 = O(u)$.
This backward stability makes QR factorization preferred over forming the normal equations for least squares, since the normal equations square the condition number.
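Both claims are easy to observe. The sketch below builds a test matrix with a prescribed condition number around $10^6$ (via an assumed SVD-style construction), runs Householder QR through `np.linalg.qr`, and checks orthogonality and the factorization residual; it also prints $\kappa(A^T A)$, which is roughly $\kappa(A)^2$, illustrating why the normal equations are avoided.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 100, 8
# Build A = U diag(s) V^T with singular values 1 ... 1e-6, so kappa(A) ~ 1e6
U, _ = np.linalg.qr(rng.standard_normal((m, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -6, n)
A = U @ np.diag(s) @ V.T

Q, R = np.linalg.qr(A)  # Householder QR (LAPACK)

orth_err = np.linalg.norm(Q.T @ Q - np.eye(n), 2)           # ||Q^T Q - I||
resid = np.linalg.norm(A - Q @ R, 2) / np.linalg.norm(A, 2) # ||A - QR|| / ||A||

print(f"||Q^T Q - I||      = {orth_err:.2e}")  # O(u), despite kappa ~ 1e6
print(f"||A - QR|| / ||A|| = {resid:.2e}")     # O(u)
print(f"kappa(A)      = {np.linalg.cond(A):.2e}")
print(f"kappa(A^T A)  = {np.linalg.cond(A.T @ A):.2e}")  # ~ kappa(A)^2
```

Both QR error measures stay at the level of machine precision regardless of $\kappa(A)$, while the normal-equations matrix $A^T A$ is dramatically worse conditioned than $A$ itself.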
Wilkinson's approach revolutionized numerical analysis by shifting focus from forward error to backward error. A backward stable algorithm guarantees that:
- The computed result is an exact solution to a nearby problem
- The nearby problem differs from the original by amounts within data uncertainty
- Forward error is bounded by problem conditioning times backward error
This framework separates algorithm quality (backward error) from problem difficulty (conditioning). Modern numerical linear algebra prioritizes backward stable algorithms: QR factorization for least squares, singular value decomposition (SVD) for rank-deficient problems, and iterative refinement for ill-conditioned systems.
The theorem also guides algorithm design: achieving backward stability typically requires careful ordering of operations, pivoting strategies, and sometimes compensated arithmetic. For example, Givens rotations and Householder reflections both achieve backward stability for orthogonalization, while classical Gram-Schmidt does not.
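The contrast between classical Gram-Schmidt and Householder QR is visible in a few lines. The sketch below (the `cgs` helper is an illustrative textbook implementation, not a library routine) orthogonalizes an ill-conditioned matrix both ways and compares the loss of orthogonality: classical Gram-Schmidt degrades roughly like $\kappa(A)^2 u$, while Householder stays at $O(u)$.

```python
import numpy as np

def cgs(A):
    """Classical Gram-Schmidt QR (illustrative; known to be unstable)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        # Projections all use the ORIGINAL column A[:, j] -- the source
        # of classical Gram-Schmidt's instability.
        R[:j, j] = Q[:, :j].T @ A[:, j]
        v = A[:, j] - Q[:, :j] @ R[:j, j]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

rng = np.random.default_rng(2)
m, n = 50, 10
# Test matrix with kappa(A) ~ 1e8
U, _ = np.linalg.qr(rng.standard_normal((m, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(np.logspace(0, -8, n)) @ V.T

Q_cgs, _ = cgs(A)
Q_hh, _ = np.linalg.qr(A)  # Householder

I = np.eye(n)
err_cgs = np.linalg.norm(Q_cgs.T @ Q_cgs - I, 2)
err_hh = np.linalg.norm(Q_hh.T @ Q_hh - I, 2)
print(f"CGS        ||Q^T Q - I|| = {err_cgs:.2e}")  # orthogonality lost
print(f"Householder ||Q^T Q - I|| = {err_hh:.2e}")  # O(u)
```

Modified Gram-Schmidt fares better than the classical variant (loss of orthogonality $O(\kappa(A)\,u)$ rather than $O(\kappa(A)^2 u)$), but only Householder and Givens achieve orthogonality at the level of machine precision independent of conditioning.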