Bounded Linear Operators - Key Proof
We present a detailed proof of the spectral theorem for compact self-adjoint operators, one of the cornerstones of spectral theory.
Let $H$ be an infinite-dimensional separable Hilbert space and $T : H \to H$ a compact self-adjoint operator. Then there exists an orthonormal basis $\{e_n\}_{n \ge 1}$ of $H$ consisting of eigenvectors of $T$, with corresponding real eigenvalues $\{\lambda_n\}$ such that $\lambda_n \to 0$ as $n \to \infty$.
Moreover, for any $x \in H$,
$$Tx = \sum_{n=1}^{\infty} \lambda_n \langle x, e_n \rangle e_n.$$
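Before turning to the proof, a finite-dimensional sanity check may help fix ideas. The sketch below (Python with numpy; the random 5x5 symmetric matrix and every name in it are illustrative assumptions, not part of the theorem) verifies the expansion $Tx = \sum_n \lambda_n \langle x, e_n \rangle e_n$, with numpy.linalg.eigh supplying the eigenbasis that the theorem provides in infinite dimensions.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    T = (A + A.T) / 2                  # symmetric matrix: a finite-dimensional "self-adjoint operator"

    lam, E = np.linalg.eigh(T)         # eigenvalues lam, orthonormal eigenvector columns E
    x = rng.standard_normal(5)

    # Tx = sum_n lam_n <x, e_n> e_n
    expansion = sum(lam[n] * (x @ E[:, n]) * E[:, n] for n in range(5))
    assert np.allclose(T @ x, expansion)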
Step 1: Existence of Eigenvalues
Define $M = \sup_{\|x\| = 1} |\langle Tx, x \rangle|$. Since $T$ is self-adjoint, $\langle Tx, x \rangle$ is real for all $x \in H$, and $M = \|T\|$.
Choose a sequence $(x_n)$ with $\|x_n\| = 1$ and $|\langle Tx_n, x_n \rangle| \to M$. The sequence $(x_n)$ is bounded, so by compactness of $T$, there exists a subsequence (still denoted $(x_n)$) such that $Tx_n \to y$ for some $y \in H$.
By the parallelogram law and properties of the supremum, we can show that $(x_n)$ is Cauchy, hence converges to some $x \in H$ with $\|x\| = 1$. By continuity, $Tx = \lim_n Tx_n = y$.
From $|\langle Tx, x \rangle| = M$ and $|\langle Tx, x \rangle| \le \|Tx\| \, \|x\| \le \|T\| = M$, we get equality in the Cauchy-Schwarz inequality, so $Tx$ is a scalar multiple of $x$. Setting $\lambda_1$ to be this value (with sign), we have $Tx = \lambda_1 x$ with $|\lambda_1| = \|T\|$; write $e_1 = x$.
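In finite dimensions, Step 1 is mirrored by power iteration: repeatedly applying $T$ and normalizing yields (generically, when a single eigenvalue attains $\|T\|$ in modulus) a maximizing sequence whose Rayleigh quotients $\langle Tx_k, x_k \rangle$ converge to $\lambda_1 = \pm\|T\|$. A minimal sketch, with a random symmetric matrix standing in for $T$:

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((6, 6))
    T = (A + A.T) / 2

    x = rng.standard_normal(6)
    x /= np.linalg.norm(x)
    for _ in range(500):               # power iteration: x converges to a dominant eigenvector
        x = T @ x
        x /= np.linalg.norm(x)

    mu = x @ (T @ x)                   # Rayleigh quotient <Tx, x>
    assert np.isclose(abs(mu), np.linalg.norm(T, 2))   # |mu| equals the spectral norm ||T||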
Step 2: Orthogonal Decomposition
Let $H_1 = \{e_1\}^{\perp}$ be the orthogonal complement. For any $y \in H_1$, we have
$$\langle Ty, e_1 \rangle = \langle y, Te_1 \rangle = \lambda_1 \langle y, e_1 \rangle = 0.$$
Thus $T$ maps $H_1$ to itself. The restriction $T|_{H_1}$ is also compact and self-adjoint.
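Numerically, the invariance of $H_1$ says that applying $T$ to a vector orthogonal to $e_1$ never reintroduces a component along $e_1$. A brief check under the same finite-dimensional stand-in:

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((6, 6))
    T = (A + A.T) / 2

    lam, E = np.linalg.eigh(T)
    e1 = E[:, np.argmax(np.abs(lam))]  # eigenvector for the eigenvalue of largest modulus

    y = rng.standard_normal(6)
    y -= (y @ e1) * e1                 # project y onto H_1 = {e_1}^perp
    assert np.isclose((T @ y) @ e1, 0.0)   # Ty remains orthogonal to e_1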
Step 3: Iteration
Repeat Step 1 on $T|_{H_1}$ to find $\lambda_2$ and $e_2$, then on $H_2 = \{e_1, e_2\}^{\perp}$, and so on. This produces an orthonormal sequence $(e_n)$ of eigenvectors with eigenvalues satisfying $|\lambda_1| \ge |\lambda_2| \ge |\lambda_3| \ge \cdots$.
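In matrix terms this iteration is deflation: extract the dominant eigenpair, subtract the rank-one piece $\lambda \, e e^{\top}$ (the matrix analogue of passing to $\{e\}^{\perp}$), and repeat. A sketch reusing power iteration from Step 1; the helper name dominant_eigpair is an illustrative assumption:

    import numpy as np

    def dominant_eigpair(T, iters=1000, seed=0):
        # power iteration for the eigenpair of largest |eigenvalue| (generic case)
        rng = np.random.default_rng(seed)
        x = rng.standard_normal(T.shape[0])
        x /= np.linalg.norm(x)
        for _ in range(iters):
            x = T @ x
            x /= np.linalg.norm(x)
        return x @ (T @ x), x

    rng = np.random.default_rng(3)
    A = rng.standard_normal((5, 5))
    T = (A + A.T) / 2

    R = T.copy()
    eigs = []
    for _ in range(5):
        lam, e = dominant_eigpair(R)
        eigs.append(lam)
        R = R - lam * np.outer(e, e)   # deflate: the remaining spectrum lives on {e}^perp
    print(np.abs(eigs))                # approximately |lam_1| >= |lam_2| >= ...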
Step 4: Convergence of Eigenvalues
If $\lambda_n \not\to 0$, then for some $\varepsilon > 0$, infinitely many indices satisfy $|\lambda_n| \ge \varepsilon$. Along these indices, the sequence $(e_n)$ satisfies $\|e_n\| = 1$ but $Te_n = \lambda_n e_n$. Since $\langle e_n, e_m \rangle = 0$ for $n \neq m$, we have $\|Te_n - Te_m\|^2 = \lambda_n^2 + \lambda_m^2 \ge 2\varepsilon^2$. Thus $(Te_n)$ has no convergent subsequence, contradicting the compactness of $T$.
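The accumulation at zero is visible for a genuinely compact operator. As an illustration (the choice of kernel is an assumption of this sketch, not part of the proof), discretize the integral operator on $L^2[0,1]$ with kernel $k(s,t) = \min(s,t)$; its eigenvalues are known to be $4/((2n-1)^2 \pi^2)$, decaying to zero like $1/n^2$:

    import numpy as np

    N = 400
    t = (np.arange(N) + 0.5) / N       # midpoint grid on [0, 1]
    K = np.minimum.outer(t, t) / N     # discretized kernel min(s, t) with quadrature weight 1/N

    lam = np.sort(np.abs(np.linalg.eigvalsh(K)))[::-1]
    print(lam[:5])                     # approx 0.405, 0.0450, 0.0162, ... tending to 0
    print(lam[:5] * (2 * np.arange(1, 6) - 1) ** 2 * np.pi ** 2 / 4)   # approx 1, 1, 1, ...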
Step 5: Completeness
Let $E = \overline{\operatorname{span}}\{e_n : n \ge 1\}$. For $y \in E^{\perp}$, we have $\langle Ty, e_n \rangle = \langle y, Te_n \rangle = \lambda_n \langle y, e_n \rangle = 0$ for all $n$, so $Ty \in E^{\perp}$. Thus $T$ maps $E^{\perp}$ to itself.
If $T|_{E^{\perp}} \neq 0$, then $T|_{E^{\perp}}$ is a nonzero compact self-adjoint operator, which by Step 1 must have a nonzero eigenvalue $\mu$ with a unit eigenvector $v \in E^{\perp}$. But $E^{\perp} \subseteq H_n$ for every $n$, so $|\mu| \le \|T|_{H_n}\| = |\lambda_{n+1}| \to 0$, forcing $\mu = 0$, which contradicts the construction. Therefore $T|_{E^{\perp}} = 0$, so every vector of $E^{\perp}$ is an eigenvector with eigenvalue $0$. Since $H$ is separable, adjoining an orthonormal basis of $E^{\perp}$ to $(e_n)$ yields an orthonormal basis of $H$ consisting of eigenvectors of $T$.
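Completeness is what licenses expanding arbitrary vectors in the eigenbasis. A final finite-dimensional check of $x = \sum_n \langle x, e_n \rangle e_n$ and of Parseval's identity $\|x\|^2 = \sum_n |\langle x, e_n \rangle|^2$:

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.standard_normal((6, 6))
    T = (A + A.T) / 2

    lam, E = np.linalg.eigh(T)         # columns of E form an orthonormal eigenbasis
    x = rng.standard_normal(6)
    c = E.T @ x                        # coefficients <x, e_n>

    assert np.allclose(x, E @ c)       # completeness: x = sum_n <x, e_n> e_n
    assert np.isclose(np.linalg.norm(x) ** 2, np.sum(c ** 2))   # Parseval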
This theorem extends the diagonalization of symmetric matrices to infinite dimensions. The key difference is that the eigenvalues need not be finite in number; instead, they form a sequence accumulating only at zero.
This spectral theorem is fundamental to the solution of integral equations, the study of Sturm-Liouville problems, and the mathematical formulation of quantum mechanics.