Spectral Theory and Applications
Spectral theory studies the eigenvalue decomposition of linear operators, connecting abstract algebra to geometry and analysis. Applications range from differential equations to quantum mechanics.
A real symmetric matrix $A$ (or complex Hermitian matrix) can be orthogonally diagonalized:

$$A = Q \Lambda Q^T,$$

where $Q$ is orthogonal ($Q^T Q = I$) with columns being orthonormal eigenvectors, and $\Lambda$ is diagonal with the eigenvalues of $A$.
Moreover, all eigenvalues of a real symmetric matrix are real.
This theorem, explored fully in later chapters, is fundamental to many applications including principal component analysis, quantum mechanics, and optimization theory.
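As a concrete sketch, `numpy.linalg.eigh` (which is specialized for symmetric/Hermitian matrices) computes exactly this decomposition; the small matrix below is an illustrative choice, not from the text:

```python
import numpy as np

# Illustrative symmetric matrix; its eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns real eigenvalues (ascending) and orthonormal
# eigenvectors as the columns of Q.
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# Verify A = Q Λ Q^T and Q^T Q = I numerically.
print(np.allclose(A, Q @ Lam @ Q.T))    # True
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```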
Consider the discrete dynamical system $x_{k+1} = A x_k$ with initial condition $x_0$.
If $A$ is diagonalizable with $A = Q \Lambda Q^{-1}$, then:

$$x_k = A^k x_0 = Q \Lambda^k Q^{-1} x_0.$$
The long-term behavior is determined by the eigenvalues:
- If all $|\lambda_i| < 1$, then $x_k \to 0$ (stable equilibrium)
- If any $|\lambda_i| > 1$, the system grows unbounded in that direction
- The dominant eigenvalue (largest $|\lambda_i|$) controls the asymptotic behavior
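A minimal simulation illustrates the stable case; the matrix below is an assumption chosen so that both eigenvalues (0.9 and 0.5) lie inside the unit circle:

```python
import numpy as np

# Illustrative symmetric matrix with eigenvalues 0.9 and 0.5,
# both inside the unit circle, so x_k -> 0 as k grows.
A = np.array([[0.7, 0.2],
              [0.2, 0.7]])

x = np.array([1.0, 0.0])   # initial condition x_0
for _ in range(100):
    x = A @ x              # iterate x_{k+1} = A x_k

print(np.linalg.norm(x))   # very small: decay governed by 0.9^100
```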
Let $A$ be an $n \times n$ real symmetric matrix. Then:
- All eigenvalues of $A$ are real
- Eigenvectors corresponding to distinct eigenvalues are orthogonal
- $A$ has $n$ orthonormal eigenvectors forming a basis for $\mathbb{R}^n$
- $A$ is orthogonally diagonalizable: $A = Q \Lambda Q^T$, where $Q$ is orthogonal
These properties make symmetric matrices the "nicest" class for computations and applications.
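These claims can be checked numerically on a random symmetric matrix; the construction below (symmetrizing a random matrix) is just one convenient way to generate a test case:

```python
import numpy as np

# Build a random symmetric matrix by symmetrizing a random one.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

eigvals, Q = np.linalg.eigh(A)

print(np.all(np.isreal(eigvals)))                   # eigenvalues are real
print(np.allclose(Q.T @ Q, np.eye(4)))              # columns of Q orthonormal
print(np.allclose(Q @ np.diag(eigvals) @ Q.T, A))   # A = Q Λ Q^T
```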
The system of ODEs $x'(t) = A x(t)$, $x(0) = x_0$, has solution:

$$x(t) = e^{At} x_0,$$

where the matrix exponential is

$$e^{At} = \sum_{k=0}^{\infty} \frac{(At)^k}{k!}.$$

If $A = Q \Lambda Q^{-1}$, then:

$$e^{At} = Q \, e^{\Lambda t} \, Q^{-1},$$

where $e^{\Lambda t}$ is diagonal with entries $e^{\lambda_i t}$. The eigenvalues determine stability: $\operatorname{Re}(\lambda_i) < 0$ for all $i$ implies asymptotic stability.
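A quick numerical sketch computes $e^{At}$ both ways for an illustrative symmetric matrix: once via the eigendecomposition, and once via a truncated power series, confirming they agree:

```python
import numpy as np

# Illustrative stable symmetric matrix: eigenvalues -0.5 and -1.5.
A = np.array([[-1.0, 0.5],
              [ 0.5, -1.0]])
t = 1.0

# Route 1: e^{At} = Q e^{Λt} Q^T via the eigendecomposition.
eigvals, Q = np.linalg.eigh(A)
expm_eig = Q @ np.diag(np.exp(eigvals * t)) @ Q.T

# Route 2: truncated Taylor series sum_{k} (At)^k / k!
expm_series = np.zeros_like(A)
term = np.eye(2)
for k in range(30):
    expm_series += term
    term = term @ (A * t) / (k + 1)

print(np.allclose(expm_eig, expm_series))  # True
```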
To find the dominant eigenvalue (largest $|\lambda|$) and its eigenvector:
- Start with a random vector $x_0$
- Iterate: $x_{k+1} = \dfrac{A x_k}{\|A x_k\|}$ (normalizing at each step)
- As $k \to \infty$, $x_k$ converges to the dominant eigenvector, provided $|\lambda_1| > |\lambda_2|$ and $x_0$ has a nonzero component along it
- The Rayleigh quotient $\dfrac{x_k^T A x_k}{x_k^T x_k}$ converges to the dominant eigenvalue
This algorithm is the basis for PageRank and many numerical eigenvalue methods.
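The iteration above can be sketched in a few lines; the function name and test matrix are illustrative, and the sketch assumes a unique dominant eigenvalue:

```python
import numpy as np

def power_iteration(A, num_iters=500, seed=0):
    """Estimate the dominant eigenvalue/eigenvector of A by power iteration."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])  # random starting vector
    for _ in range(num_iters):
        x = A @ x
        x /= np.linalg.norm(x)           # normalize at each step
    lam = (x @ A @ x) / (x @ x)          # Rayleigh quotient
    return lam, x

# Illustrative symmetric matrix; its dominant eigenvalue is 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, v = power_iteration(A)
print(lam)  # ≈ 3.0
```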
The spectral perspective—viewing operators through their eigenvalues and eigenvectors—is transformative across mathematics and physics. In quantum mechanics, observables are represented by operators, and measurements yield eigenvalues. In data science, principal component analysis finds eigenvectors of covariance matrices to identify patterns. The eigenvalue decomposition is truly one of the most powerful tools in applied mathematics.
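To make the PCA connection concrete, here is a minimal sketch on synthetic data (the data-generating matrix and variable names are illustrative): center the data, form the sample covariance, and take its eigendecomposition.

```python
import numpy as np

# Synthetic correlated 2-D data (illustrative generating matrix).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0],
                                              [1.0, 0.5]])

Xc = X - X.mean(axis=0)          # center the data
C = Xc.T @ Xc / (len(Xc) - 1)    # sample covariance matrix (symmetric)

eigvals, Q = np.linalg.eigh(C)   # eigh returns eigenvalues ascending
components = Q[:, ::-1]          # principal components, by decreasing variance
explained_variance = eigvals[::-1]

print(explained_variance[0] > explained_variance[1])  # True
```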