Orthogonal Diagonalization
The spectral theorem states that symmetric matrices can be orthogonally diagonalized. This canonical form provides optimal coordinates for understanding the matrix's action.
Let $A$ be an $n \times n$ real symmetric matrix ($A = A^T$). Then:
- All eigenvalues of $A$ are real
- $A$ has $n$ orthonormal eigenvectors
- $A$ is orthogonally diagonalizable: there exists an orthogonal matrix $Q$ and a diagonal matrix $D$ such that
$$A = QDQ^T,$$
where the columns of $Q$ are the orthonormal eigenvectors and the diagonal entries of $D$ are the corresponding eigenvalues.
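As a quick numerical check, the theorem can be verified on a random symmetric matrix (a sketch using NumPy's `numpy.linalg.eigh`, which is designed for symmetric input and returns real eigenvalues with orthonormal eigenvectors):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2  # symmetrize to obtain a real symmetric matrix

# eigh returns real eigenvalues (ascending) and orthonormal eigenvectors
eigenvalues, Q = np.linalg.eigh(A)
D = np.diag(eigenvalues)

# Q is orthogonal: Q^T Q = I
assert np.allclose(Q.T @ Q, np.eye(4))
# A = Q D Q^T
assert np.allclose(Q @ D @ Q.T, A)
```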
The complex version: a Hermitian matrix ($A = A^*$) is unitarily diagonalizable: $A = UDU^*$ with $U$ unitary and $D$ real diagonal.
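The Hermitian case can be sketched the same way; `numpy.linalg.eigh` also accepts complex Hermitian input and still returns real eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = (M + M.conj().T) / 2  # Hermitian: H = H^*

eigenvalues, U = np.linalg.eigh(H)

# Eigenvalues are real even though H is complex
assert np.allclose(eigenvalues.imag, 0)
# U is unitary (U^* U = I) and H = U D U^*
assert np.allclose(U.conj().T @ U, np.eye(3))
assert np.allclose(U @ np.diag(eigenvalues) @ U.conj().T, H)
```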
This result is remarkable: every symmetric matrix, even one with repeated eigenvalues, admits an orthonormal eigenvector basis. The orthogonality of $Q$ (so $Q^{-1} = Q^T$) makes computations particularly elegant.
Diagonalize $A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$.
Eigenvalues: $\det(A - \lambda I) = (3 - \lambda)^2 - 1 = 0$ gives $\lambda_1 = 4$, $\lambda_2 = 2$.
Eigenvectors: For $\lambda_1 = 4$: $v_1 = (1, 1)^T$, normalized: $q_1 = \tfrac{1}{\sqrt{2}}(1, 1)^T$.
For $\lambda_2 = 2$: $v_2 = (1, -1)^T$, normalized: $q_2 = \tfrac{1}{\sqrt{2}}(1, -1)^T$.
Then:
$$A = QDQ^T = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} 4 & 0 \\ 0 & 2 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}.$$
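A worked 2×2 case like this is easy to check numerically. Here is a sketch using the symmetric matrix $\begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$ as an illustrative instance:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

eigenvalues, Q = np.linalg.eigh(A)  # ascending order: 2, then 4
D = np.diag(eigenvalues)

assert np.allclose(eigenvalues, [2.0, 4.0])
assert np.allclose(Q @ D @ Q.T, A)
# Columns of Q are (up to sign) (1,-1)/sqrt(2) and (1,1)/sqrt(2)
assert np.allclose(np.abs(Q), 1 / np.sqrt(2))
```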
The orthogonal diagonalization can be written as a sum:
$$A = \sum_{i=1}^{n} \lambda_i \, q_i q_i^T,$$
where $q_1, \dots, q_n$ are the orthonormal eigenvectors and $\lambda_1, \dots, \lambda_n$ the corresponding eigenvalues. Each term $q_i q_i^T$ is a rank-one projection matrix.
This spectral decomposition expresses $A$ as a weighted sum of orthogonal projections onto eigenspaces.
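The sum-of-projections form can be verified directly: rebuild the matrix from its rank-one pieces and check that each piece is idempotent (a sketch on a random symmetric matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
A = (M + M.T) / 2

eigenvalues, Q = np.linalg.eigh(A)

# Sum of rank-one terms: A = sum_i lambda_i q_i q_i^T
A_rebuilt = sum(lam * np.outer(q, q) for lam, q in zip(eigenvalues, Q.T))
assert np.allclose(A_rebuilt, A)

# Each P_i = q_i q_i^T is an orthogonal projection: idempotent, rank one
for q in Q.T:
    P = np.outer(q, q)
    assert np.allclose(P @ P, P)
    assert np.linalg.matrix_rank(P) == 1
```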
If $A = QDQ^T$ with $Q$ orthogonal:
- $A^k = QD^kQ^T$ (easy to compute powers)
- $\det A = \prod_i \lambda_i$ (product of eigenvalues)
- $\operatorname{tr} A = \sum_i \lambda_i$ (sum of eigenvalues)
- $A$ is invertible iff all $\lambda_i \neq 0$; then $A^{-1} = QD^{-1}Q^T$
- When $A$ is a covariance matrix, the eigenvectors give the principal axes and the eigenvalues measure the variance along those axes
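These identities are cheap to confirm numerically; a sketch checking powers, determinant, trace, and the inverse against NumPy's direct routines:

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

eigenvalues, Q = np.linalg.eigh(A)

# A^3 = Q D^3 Q^T -- the power acts only on the diagonal
A_cubed = Q @ np.diag(eigenvalues**3) @ Q.T
assert np.allclose(A_cubed, np.linalg.matrix_power(A, 3))

# det and trace from the spectrum
assert np.isclose(np.prod(eigenvalues), np.linalg.det(A))
assert np.isclose(np.sum(eigenvalues), np.trace(A))

# Inverse via reciprocal eigenvalues (valid when all lambda_i != 0)
if np.all(np.abs(eigenvalues) > 1e-12):
    A_inv = Q @ np.diag(1 / eigenvalues) @ Q.T
    assert np.allclose(A_inv, np.linalg.inv(A))
```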
Orthogonal diagonalization is optimal for computation: since $Q^{-1} = Q^T$, no explicit matrix inversion is needed. Moreover, orthogonal transformations preserve lengths and angles, making them numerically stable. The spectral-decomposition perspective, viewing the matrix as a sum of scaled projections, is particularly powerful in applications such as principal component analysis.
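The PCA connection can be sketched in a few lines: diagonalizing the sample covariance matrix of centered data yields the principal axes, and the eigenvalues equal the variances of the data projected onto those axes. (An illustration with synthetic data; production pipelines typically use the SVD of the centered data matrix instead.)

```python
import numpy as np

rng = np.random.default_rng(4)
# Correlated 2-D data, stretched along a known direction
X = rng.standard_normal((500, 2)) @ np.array([[3.0, 1.0],
                                              [0.0, 0.5]])
Xc = X - X.mean(axis=0)              # center the data
C = Xc.T @ Xc / (len(Xc) - 1)        # sample covariance (symmetric)

variances, axes = np.linalg.eigh(C)  # ascending eigenvalues

# Columns of `axes` are the principal axes; the eigenvalues are the
# variances of the data projected onto those axes
projected = Xc @ axes
assert np.allclose(projected.var(axis=0, ddof=1), variances)
```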