Spectral Theory and Applications

Spectral theory studies the eigenvalue decomposition of linear operators, connecting abstract algebra to geometry and analysis. Applications range from differential equations to quantum mechanics.

Theorem: Spectral Theorem (Preview)

A real symmetric matrix $A$ (or complex Hermitian matrix) can be orthogonally diagonalized: $A = Q\Lambda Q^T$

where $Q$ is orthogonal ($Q^TQ = I$) with columns given by orthonormal eigenvectors of $A$, and $\Lambda$ is diagonal with the corresponding eigenvalues.

Moreover, all eigenvalues of a real symmetric matrix are real.

This theorem, explored fully in later chapters, is fundamental to many applications including principal component analysis, quantum mechanics, and optimization theory.
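
As a quick numerical check, NumPy's `eigh` routine (specialized for symmetric and Hermitian matrices) returns real eigenvalues and an orthonormal eigenvector matrix. The matrix below is a hypothetical example, not one from the text:

```python
import numpy as np

# A small real symmetric matrix (assumed example for illustration)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns real eigenvalues in ascending order and orthonormal eigenvectors
eigenvalues, Q = np.linalg.eigh(A)

print(eigenvalues)   # real eigenvalues: 1 and 3
print(Q.T @ Q)       # approximately the identity: columns of Q are orthonormal
```

Using `eigh` rather than the general-purpose `eig` guarantees real eigenvalues and an orthogonal $Q$, exactly as the theorem promises.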

Example: Dynamical Systems

Consider the discrete dynamical system $\mathbf{x}_{n+1} = A\mathbf{x}_n$ with initial condition $\mathbf{x}_0$.

If $A = PDP^{-1}$ is diagonalizable, then: $\mathbf{x}_n = A^n\mathbf{x}_0 = PD^nP^{-1}\mathbf{x}_0$

The long-term behavior is determined by the eigenvalues:

  • If all $|\lambda_i| < 1$, then $\mathbf{x}_n \to \mathbf{0}$ (stable equilibrium)
  • If any $|\lambda_i| > 1$, the system grows without bound along the corresponding eigendirection
  • The dominant eigenvalue (largest $|\lambda|$) controls the asymptotic behavior
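
The stability criterion above can be illustrated with a short sketch; the contraction matrix is an assumed example whose eigenvalues both have magnitude below 1:

```python
import numpy as np

# Assumed example: symmetric matrix with both eigenvalues in (0, 1)
A = np.array([[0.5, 0.1],
              [0.1, 0.4]])
x = np.array([1.0, 1.0])   # initial condition x_0

# Iterate x_{n+1} = A x_n; since all |lambda_i| < 1, the state decays to 0
for n in range(50):
    x = A @ x

print(np.linalg.norm(x))   # very close to 0 after 50 steps
```
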

Theorem: Diagonalization of Symmetric Matrices

Let $A$ be a real $n \times n$ symmetric matrix. Then:

  1. All eigenvalues of $A$ are real
  2. Eigenvectors corresponding to distinct eigenvalues are orthogonal
  3. $A$ has $n$ orthonormal eigenvectors forming a basis for $\mathbb{R}^n$
  4. $A$ is orthogonally diagonalizable: $A = Q\Lambda Q^T$ where $Q$ is orthogonal

These properties make symmetric matrices the "nicest" class for computations and applications.
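
A minimal numerical sketch of properties 1 through 4, using an assumed $3 \times 3$ symmetric matrix:

```python
import numpy as np

# Assumed example: a 3x3 real symmetric matrix
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

lam, Q = np.linalg.eigh(A)   # real eigenvalues, orthonormal eigenvector columns
Lambda = np.diag(lam)

print(np.allclose(Q @ Q.T, np.eye(3)))      # Q is orthogonal
print(np.allclose(Q @ Lambda @ Q.T, A))     # A = Q Lambda Q^T reconstructs A
```
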

Example: Differential Equations

The system of ODEs $\mathbf{x}'(t) = A\mathbf{x}(t)$ with initial condition $\mathbf{x}(0) = \mathbf{x}_0$ has solution: $\mathbf{x}(t) = e^{At}\mathbf{x}_0$

where the matrix exponential is $e^{At} = I + At + \frac{(At)^2}{2!} + \frac{(At)^3}{3!} + \cdots$

If $A = PDP^{-1}$, then: $e^{At} = Pe^{Dt}P^{-1}$

where $e^{Dt}$ is diagonal with entries $e^{\lambda_i t}$. The eigenvalues $\lambda_i$ determine stability: $\text{Re}(\lambda_i) < 0$ for all $i$ implies $\mathbf{x}(t) \to \mathbf{0}$ (asymptotic stability).

Algorithm: Power Method

To find the dominant eigenvalue (largest $|\lambda|$) and its eigenvector:

  1. Start with a random vector $\mathbf{v}_0$
  2. Iterate: $\mathbf{v}_{k+1} = \frac{A\mathbf{v}_k}{\|A\mathbf{v}_k\|}$ (normalize at each step)
  3. As $k \to \infty$, $\mathbf{v}_k$ converges to the dominant eigenvector (assuming a unique dominant eigenvalue and that $\mathbf{v}_0$ has a nonzero component along it)
  4. The Rayleigh quotient $\frac{\mathbf{v}_k^TA\mathbf{v}_k}{\mathbf{v}_k^T\mathbf{v}_k}$ converges to the dominant eigenvalue

This algorithm is the basis for PageRank and many numerical eigenvalue methods.
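
The iteration above can be sketched in a few lines; the matrix, iteration count, and seed are assumptions for illustration:

```python
import numpy as np

def power_method(A, num_iters=200, seed=0):
    """Power method sketch: returns (dominant eigenvalue estimate, eigenvector)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])    # step 1: random starting vector
    for _ in range(num_iters):
        v = A @ v                          # step 2: apply A ...
        v /= np.linalg.norm(v)             # ... and normalize
    # step 4: Rayleigh quotient estimates the dominant eigenvalue
    return (v @ A @ v) / (v @ v), v

# Assumed example with eigenvalues 1 and 3; the method should find 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, v = power_method(A)
print(lam)   # converges to 3, the dominant eigenvalue
```
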

Remark

The spectral perspective—viewing operators through their eigenvalues and eigenvectors—is transformative across mathematics and physics. In quantum mechanics, observables are represented by operators, and measurements yield eigenvalues. In data science, principal component analysis finds eigenvectors of covariance matrices to identify patterns. The eigenvalue decomposition is truly one of the most powerful tools in applied mathematics.