Singular Value Decomposition
The singular value decomposition (SVD) extends the spectral theorem to non-square and non-symmetric matrices. It is arguably the most important matrix factorization in applied mathematics.
Every matrix $A \in \mathbb{R}^{m \times n}$ (for complex matrices, replace transposes by conjugate transposes and "orthogonal" by "unitary") can be factored as:
$$A = U \Sigma V^T$$
where:
- $U \in \mathbb{R}^{m \times m}$ is orthogonal (its columns are the left singular vectors)
- $\Sigma \in \mathbb{R}^{m \times n}$ is diagonal with non-negative entries $\sigma_1 \ge \sigma_2 \ge \cdots \ge 0$ (the singular values)
- $V \in \mathbb{R}^{n \times n}$ is orthogonal (its columns are the right singular vectors)
The number of nonzero singular values equals $\operatorname{rank}(A)$.
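As a quick numerical check of the factorization, here is a minimal NumPy sketch; the matrix is random and purely illustrative:

```python
import numpy as np

# A random non-square matrix; the values are purely illustrative.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A)        # full SVD: U is 4x4, Vt is 3x3

# Build the 4x3 Sigma with the singular values on its diagonal.
Sigma = np.zeros((4, 3))
np.fill_diagonal(Sigma, s)

print(np.allclose(U @ Sigma @ Vt, A))       # True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(4)))      # True: U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3)))    # True: V is orthogonal
print(np.linalg.matrix_rank(A) == np.sum(s > 1e-10))  # True: rank = # nonzero sigma
```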
The SVD generalizes the eigenvalue decomposition: while $A = Q \Lambda Q^T$ requires $A$ to be symmetric, $A = U \Sigma V^T$ works for any matrix by using different orthonormal bases ($V$ and $U$) for the domain and codomain.
The singular values and vectors of $A$ are related to the eigenvalues of two symmetric matrices:
- Singular values of $A$ are $\sigma_i = \sqrt{\lambda_i(A^T A)}$
- Right singular vectors (columns of $V$) are eigenvectors of $A^T A$
- Left singular vectors (columns of $U$) are eigenvectors of $A A^T$
- $A^T A = V (\Sigma^T \Sigma) V^T$ and $A A^T = U (\Sigma \Sigma^T) U^T$ are spectral decompositions
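These identities are easy to verify numerically; the sketch below (assuming NumPy, with a random illustrative matrix) checks that the eigenvalues of $A^T A$ are the squared singular values:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))     # random illustrative matrix

U, s, Vt = np.linalg.svd(A)

# Eigenvalues of A^T A, sorted descending, should equal sigma_i^2.
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
print(np.allclose(eigvals, s**2))   # True
```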
To compute the SVD of a matrix $A$ by hand:
Compute $A^T A$, whose eigenvalues $\lambda_1 \ge \lambda_2 \ge \cdots \ge 0$ give the squared singular values and whose orthonormal eigenvectors form the columns of $V$.
Compute $A A^T$, whose orthonormal eigenvectors form the columns of $U$ (equivalently, $u_i = A v_i / \sigma_i$ for each $\sigma_i > 0$).
Singular values: $\sigma_i = \sqrt{\lambda_i}$.
Then $A = U \Sigma V^T$.
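The steps above can be carried out in code; the $2 \times 2$ matrix below is a hypothetical example chosen for clean numbers, not one taken from the text:

```python
import numpy as np

# Hypothetical example matrix (chosen so A^T A has integer eigenvalues 45 and 5).
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# Step 1: eigendecomposition of A^T A gives V and the squared singular values.
lam, V = np.linalg.eigh(A.T @ A)    # eigh returns eigenvalues in ascending order
order = np.argsort(lam)[::-1]       # reorder so sigma_1 >= sigma_2
lam, V = lam[order], V[:, order]

# Steps 2-3: sigma_i = sqrt(lambda_i) and u_i = A v_i / sigma_i.
sigma = np.sqrt(lam)
U = A @ V / sigma

# Step 4: reassemble A = U Sigma V^T.
print(np.allclose(U @ np.diag(sigma) @ V.T, A))   # True
```

Note that `eigh` fixes eigenvector signs arbitrarily; the reconstruction is unaffected because a sign flip in $v_i$ flips $u_i$ as well.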
Low-rank approximation: The best rank-$k$ approximation to $A$ in the Frobenius norm (the Eckart-Young theorem) is:
$$A_k = \sum_{i=1}^{k} \sigma_i u_i v_i^T$$
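A sketch of this optimality in NumPy (random illustrative matrix): truncating the SVD after $k$ terms leaves a Frobenius error of exactly $\sqrt{\sum_{i>k} \sigma_i^2}$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 5))     # random illustrative matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation

# The Frobenius error equals the root-sum-square of the discarded sigma_i.
err = np.linalg.norm(A - A_k, 'fro')
print(np.isclose(err, np.sqrt(np.sum(s[k:] ** 2))))   # True
print(np.linalg.matrix_rank(A_k))                     # 2
```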
Moore-Penrose pseudoinverse: $A^+ = V \Sigma^+ U^T$, where $\Sigma^+ \in \mathbb{R}^{n \times m}$ has diagonal entries $1/\sigma_i$ for each nonzero $\sigma_i$ and zeros elsewhere.
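A minimal sketch of this construction (assuming NumPy and a full-column-rank matrix, so every $\sigma_i > 0$); the random matrix is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))     # tall matrix, full column rank

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T      # valid here: all sigma_i > 0

# Matches NumPy's built-in Moore-Penrose pseudoinverse...
print(np.allclose(A_pinv, np.linalg.pinv(A)))   # True

# ...and A^+ b is the least-squares solution of A x ~= b.
b = rng.standard_normal(5)
print(np.allclose(A_pinv @ b, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```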
Fundamental subspaces (with $r = \operatorname{rank}(A)$):
- $\operatorname{col}(A)$ spanned by the first $r$ columns of $U$
- $\operatorname{null}(A)$ spanned by the last $n - r$ columns of $V$
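These subspace facts can be checked on a small rank-deficient matrix (a hypothetical example, assuming NumPy):

```python
import numpy as np

# A 3x4 matrix of rank 2, built as a sum of two outer products (illustrative).
A = (np.outer([1.0, 2.0, 3.0], [1, 0, 1, 0])
     + np.outer([0.0, 1.0, 1.0], [0, 1, 0, 1]))

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))          # numerical rank
print(r)                            # 2

# null(A) is spanned by the last n - r columns of V (i.e., rows of Vt).
N = Vt[r:].T                        # 4x2 basis for the null space
print(np.allclose(A @ N, 0))        # True: A maps the basis to zero
```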
An $m \times n$ grayscale image can be stored as a matrix. Storing the matrix directly requires $mn$ values, but the rank-$k$ approximation needs only $k(m + n + 1)$ values ($k$ left singular vectors, $k$ right singular vectors, and $k$ singular values), a huge saving when $k \ll \min(m, n)$.
If $\sigma_1 \ge \sigma_2 \ge \cdots$ and we keep only the $k$ largest singular values:
$$A \approx A_k = \sum_{i=1}^{k} \sigma_i u_i v_i^T$$
This is the optimal $k$-term approximation. The same idea of discarding small transform coefficients underlies lossy codecs such as JPEG (which uses the discrete cosine transform rather than the SVD).
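A compression sketch on a synthetic "image" (assuming NumPy; the image below is exactly rank 2 by construction, so two terms recover it, whereas a real photo would need more):

```python
import numpy as np

# Synthetic 64x96 "image": a sum of two outer products, so its rank is 2.
m, n = 64, 96
x = np.linspace(0.0, 1.0, n)
y = np.linspace(0.0, 1.0, m)
img = np.outer(y, x) + 0.5 * np.outer(np.sin(3 * y), np.cos(2 * x))

U, s, Vt = np.linalg.svd(img, full_matrices=False)

k = 2                               # keep the two nonzero singular values
img_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

full_storage = m * n                # values to store the raw matrix
compressed = k * (m + n + 1)        # values for the rank-k factors
print(np.allclose(img, img_k))      # True: the rank-2 image is recovered
print(full_storage, compressed)     # 6144 322
```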
The SVD is ubiquitous: in data science (dimensionality reduction via truncated SVD), signal processing (noise filtering), control theory (system reduction), and numerical analysis (solving ill-conditioned systems). Its optimality for low-rank approximation makes it the gold standard for matrix compression and denoising. The SVD reveals the "essential structure" of any linear transformation.