Proof of Spectral Theorem
We prove that every real symmetric matrix is orthogonally diagonalizable. The proof proceeds by induction on the size of the matrix, using three key facts: eigenvalues are real, eigenvectors for distinct eigenvalues are orthogonal, and the orthogonal complement of an eigenvector is invariant under a symmetric matrix.
Statement
Let $A \in \mathbb{R}^{n \times n}$ with $A^T = A$. Then there exists an orthogonal matrix $Q$ ($Q^T Q = I$) such that $Q^T A Q = \Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, where all $\lambda_i \in \mathbb{R}$.
Preliminary lemmas
Lemma 1 (Real eigenvalues). Let $A^T = A$ (with real entries) and $Av = \lambda v$ for some $\lambda \in \mathbb{C}$, $v \in \mathbb{C}^n \setminus \{0\}$. We show $\lambda \in \mathbb{R}$.
Consider $\bar{v}^T A v$ (where $\bar{v}$ is the entrywise complex conjugate of $v$):
$$\bar{v}^T A v = \bar{v}^T (\lambda v) = \lambda\, \bar{v}^T v = \lambda \|v\|^2.$$
Also, since $\bar{A} = A$ (real matrix) and $A^T = A$:
$$\bar{v}^T A v = (A^T \bar{v})^T v = (\overline{A v})^T v = (\overline{\lambda v})^T v = \bar{\lambda}\, \bar{v}^T v = \bar{\lambda} \|v\|^2.$$
Comparing: $\lambda \|v\|^2 = \bar{\lambda} \|v\|^2$. Since $v \neq 0$: $\|v\|^2 > 0$, so $\lambda = \bar{\lambda}$, i.e. $\lambda \in \mathbb{R}$.
Example, $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$: $\det(A - \lambda I) = \lambda^2 - 4\lambda + 3$. Eigenvalues: $\lambda = 1, 3$. Both real ✓.
Example, $A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$: eigenvalues $\lambda = \pm 1$. Real ✓.
Compare with the non-symmetric rotation $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ ($A^T \neq A$): eigenvalues $\pm i$. Not real. Symmetry is essential.
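Lemma 1 can be sanity-checked numerically. A minimal numpy sketch (the random symmetric matrix is an illustrative choice, not from the text):

```python
import numpy as np

# Lemma 1 check: eigenvalues of a real symmetric matrix are real.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M + M.T                          # symmetrize so A.T == A
eigvals = np.linalg.eigvals(A)       # general (complex-capable) solver
max_imag = np.max(np.abs(eigvals.imag))
print(max_imag)                      # ~0: every eigenvalue is real
```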
Lemma 2 (Orthogonality). Let $A^T = A$, $Av_1 = \lambda_1 v_1$, $Av_2 = \lambda_2 v_2$, with $\lambda_1 \neq \lambda_2$. Then $v_1 \perp v_2$. Indeed,
$$\lambda_1 v_1^T v_2 = (A v_1)^T v_2 = v_1^T A^T v_2 = v_1^T A v_2 = \lambda_2\, v_1^T v_2.$$
So $(\lambda_1 - \lambda_2)\, v_1^T v_2 = 0$. Since $\lambda_1 \neq \lambda_2$: $v_1^T v_2 = 0$.
Example: $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, eigenvalues $1, 3$, eigenvectors $\begin{pmatrix} 1 \\ -1 \end{pmatrix}$ and $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$.
$\begin{pmatrix} 1 \\ -1 \end{pmatrix} \cdot \begin{pmatrix} 1 \\ 1 \end{pmatrix} = 1 - 1 = 0$ ✓ (orthogonal).
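Lemma 2 is also easy to confirm with numpy; a sketch using an illustrative symmetric matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ (my choice of example):

```python
import numpy as np

# Lemma 2 check: eigenvectors for distinct eigenvalues are orthogonal.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
eigvals, V = np.linalg.eigh(A)   # eigh is the symmetric-matrix solver
v1, v2 = V[:, 0], V[:, 1]        # columns are orthonormal eigenvectors
print(eigvals)                   # ascending eigenvalues: 1, 3
print(abs(v1 @ v2))              # ~0: orthogonal
```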
Lemma 3 (Invariant complement). Let $A^T = A$ and let $v$ be an eigenvector of $A$ with eigenvalue $\lambda$. Then $v^\perp = \{w : w^T v = 0\}$ is invariant under $A$.
If $w \in v^\perp$ (meaning $w^T v = 0$), then:
$$(Aw)^T v = w^T A^T v = w^T A v = \lambda\, w^T v = 0.$$
So $Aw \in v^\perp$. This means $A$ maps $v^\perp$ to itself, so $A$ restricts to a linear map on $v^\perp$.
Example: $A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 1 \\ 0 & 1 & 2 \end{pmatrix}$, eigenvalue $\lambda = 1$ with $v = e_1$.
$v^\perp = \operatorname{span}\{e_2, e_3\}$ (the $yz$-plane).
$Ae_2 = (0, 2, 1)^T \in v^\perp$ ✓ and $Ae_3 = (0, 1, 2)^T \in v^\perp$ ✓. The restriction is $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ (still symmetric).
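Lemma 3 can be checked numerically as well; a sketch where the matrix and eigenvector are illustrative choices:

```python
import numpy as np

# Lemma 3 check: A maps v-perp into v-perp when A is symmetric and v is
# an eigenvector of A.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
v = np.array([1.0, 0.0, 0.0])            # eigenvector: A @ v = 1 * v
basis_of_v_perp = [np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
residuals = [abs((A @ w) @ v) for w in basis_of_v_perp]
print(residuals)                         # zeros: images stay in v-perp
```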
Main proof
Induction on $n$. The base case $n = 1$ is trivial (a $1 \times 1$ matrix is already diagonal).
Inductive step. Assume the theorem holds for all symmetric matrices of size less than $n$. Let $A \in \mathbb{R}^{n \times n}$ be symmetric.
Step 1: Find an eigenvalue. The characteristic polynomial $\det(A - \lambda I)$ is a real polynomial of degree $n$. Over $\mathbb{C}$, it has at least one root $\lambda_1$. By Lemma 1, $\lambda_1 \in \mathbb{R}$.
Step 2: Find an eigenvector. Since $\lambda_1$ is real, $(A - \lambda_1 I)x = 0$ has a nonzero real solution $v$. Normalize: $v_1 = v / \|v\|$, so $\|v_1\| = 1$.
Step 3: Reduce to a smaller matrix. Let $W = v_1^\perp = \{w \in \mathbb{R}^n : w^T v_1 = 0\}$, which has dimension $n - 1$.
By Lemma 3, $A$ maps $W$ to itself: if $w \in W$, then $Aw \in W$. The restriction $A|_W$ is a linear operator on an $(n-1)$-dimensional space, and it is symmetric with respect to the standard inner product restricted to $W$.
Step 4: Choose a basis for $W$. Extend $v_1$ to an orthonormal basis $\{v_1, w_2, \dots, w_n\}$ of $\mathbb{R}^n$ (using Gram--Schmidt on any extension). Let $Q_1 = [\, v_1 \mid w_2 \mid \cdots \mid w_n \,]$. Then:
$$Q_1^T A Q_1 = \begin{pmatrix} \lambda_1 & 0 \\ 0 & B \end{pmatrix},$$
where $B \in \mathbb{R}^{(n-1) \times (n-1)}$ is the matrix of $A|_W$ with respect to $\{w_2, \dots, w_n\}$. The zeros in the first row and column arise because $Av_1 = \lambda_1 v_1$ and for $j \geq 2$:
$$w_j^T A v_1 = \lambda_1\, w_j^T v_1 = 0 \quad \text{for } j = 2, \dots, n.$$
By symmetry: $v_1^T A w_j = (w_j^T A v_1)^T = 0$ for $j = 2, \dots, n$.
Step 5: Apply induction. $B$ is symmetric (since $Q_1^T A Q_1$ is, and the restriction preserves symmetry). By the inductive hypothesis, there exists an orthogonal matrix $Q_2 \in \mathbb{R}^{(n-1) \times (n-1)}$ such that $Q_2^T B Q_2 = \Lambda_2$ is diagonal.
Step 6: Combine. Let $\tilde{Q}_2 = \begin{pmatrix} 1 & 0 \\ 0 & Q_2 \end{pmatrix}$ (orthogonal). Then $Q = Q_1 \tilde{Q}_2$ is orthogonal and:
$$Q^T A Q = \tilde{Q}_2^T (Q_1^T A Q_1) \tilde{Q}_2 = \begin{pmatrix} \lambda_1 & 0 \\ 0 & Q_2^T B Q_2 \end{pmatrix} = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \Lambda_2 \end{pmatrix}. \qquad \square$$
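The inductive proof is constructive, and the deflation it describes can be coded directly. A minimal numpy sketch (the function name `spectral_decompose` is mine; a Householder reflection stands in for Gram--Schmidt to extend $v_1$ to an ONB):

```python
import numpy as np

def spectral_decompose(A):
    """Orthogonally diagonalize symmetric A by the inductive argument:
    peel off one eigenpair, then recurse on the restriction to v1-perp.
    Returns (Q, lams) with Q orthogonal and Q.T @ A @ Q = diag(lams)."""
    n = A.shape[0]
    if n == 1:                                  # base case: 1x1 is diagonal
        return np.eye(1), A[0, :1].copy()
    # Steps 1-2: one real eigenpair (np.linalg.eig as the root-finder).
    w, V = np.linalg.eig(A)
    lam1, v1 = w.real[0], V[:, 0].real
    v1 /= np.linalg.norm(v1)
    # Step 4: ONB whose first vector is v1, via a Householder reflection
    # Q1 (orthogonal and symmetric, with Q1 @ e1 = v1).
    e1 = np.zeros(n); e1[0] = 1.0
    u = v1 - e1
    if np.linalg.norm(u) < 1e-12:
        Q1 = np.eye(n)
    else:
        u /= np.linalg.norm(u)
        Q1 = np.eye(n) - 2.0 * np.outer(u, u)
    # Step 3: by Lemma 3 the (0, 1:) and (1:, 0) blocks of B vanish.
    B = Q1.T @ A @ Q1
    # Steps 5-6: recurse on the symmetric (n-1)x(n-1) restriction, combine.
    Q2, lams = spectral_decompose((B[1:, 1:] + B[1:, 1:].T) / 2.0)
    Q = Q1 @ np.block([[np.eye(1), np.zeros((1, n - 1))],
                       [np.zeros((n - 1, 1)), Q2]])
    return Q, np.concatenate(([lam1], lams))

A = np.array([[2.0, 1.0, 1.0], [1.0, 2.0, 1.0], [1.0, 1.0, 2.0]])
Q, lams = spectral_decompose(A)
print(np.allclose(Q.T @ Q, np.eye(3)))            # True: Q is orthogonal
print(np.allclose(Q @ np.diag(lams) @ Q.T, A))    # True: A = Q diag Q^T
```

The recursion mirrors the proof exactly: one eigenpair per level, an orthogonal change of basis that isolates it, then induction on the trailing block.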
Worked example of the inductive proof
$A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}$.
Step 1-2: Characteristic polynomial: $\det(A - \lambda I) = -(\lambda - 4)(\lambda - 1)^2$.
Eigenvalues: $\lambda = 4, 1, 1$. Eigenvector for $\lambda_1 = 4$: solve $(A - 4I)v = 0$. $A - 4I = \begin{pmatrix} -2 & 1 & 1 \\ 1 & -2 & 1 \\ 1 & 1 & -2 \end{pmatrix}$. Row reduce to get $v = (1, 1, 1)^T$, $v_1 = \frac{1}{\sqrt{3}}(1, 1, 1)^T$.
Step 3-4: $W = v_1^\perp$ has dimension $2$. Find ONB for $W$: $w_2 = \frac{1}{\sqrt{2}}(1, -1, 0)^T$, $w_3 = \frac{1}{\sqrt{6}}(1, 1, -2)^T$ (orthogonal to $v_1$ and to each other).
$Q_1 = [\, v_1 \mid w_2 \mid w_3 \,]$.
$Q_1^T A Q_1 = \begin{pmatrix} 4 & 0 \\ 0 & B \end{pmatrix}$, where $B$ is the restriction $A|_W$.
Computing $B$: $Aw_2 = w_2$ and $Aw_3 = w_3$, so $B = I_2$. Consistently, the eigenvalues of $B$ must be $1$ and $1$ (the remaining eigenvalues of $A$).
Step 5-6: Diagonalize $B$ (a symmetric matrix); here $B = I_2$ is already diagonal, so $Q_2 = I_2$, $Q = Q_1$, and $Q^T A Q = \operatorname{diag}(4, 1, 1)$.
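A single deflation step of this kind is easy to verify numerically; a sketch, using an illustrative symmetric matrix and hand-picked ONB:

```python
import numpy as np

# One deflation step, checked numerically on a sample symmetric matrix.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])
v1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)    # unit eigenvector: A v1 = 4 v1
w2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)   # ONB for v1-perp
w3 = np.array([1.0, 1.0, -2.0]) / np.sqrt(6)
Q1 = np.column_stack([v1, w2, w3])
T = Q1.T @ A @ Q1
print(np.round(T, 12))       # diag(4, 1, 1): first row and column vanish
```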
Alternative proof via Schur decomposition
Every complex square matrix $A$ has a Schur decomposition: $A = U T U^*$, where $U$ is unitary and $T$ is upper triangular with the eigenvalues of $A$ on the diagonal.
If $A^* = A$ (Hermitian), then $T = U^* A U$ satisfies $T^* = T$. An upper triangular matrix that is also Hermitian must be diagonal (the off-diagonal entries $t_{ij}$ with $i < j$ must equal $\bar{t}_{ji} = 0$). So $T = \Lambda$ is diagonal.
For the real case: the real Schur decomposition gives $A = Q T Q^T$ with $Q$ orthogonal and $T$ quasi-upper-triangular (block upper triangular with $1 \times 1$ and $2 \times 2$ blocks on the diagonal); the $2 \times 2$ blocks correspond to complex conjugate eigenvalue pairs. If $A^T = A$, all eigenvalues are real (Lemma 1), so no $2 \times 2$ blocks occur and $T$ is genuinely upper triangular; symmetry then forces $T = Q^T A Q$ to be symmetric, hence diagonal. So $A = Q \Lambda Q^T$.
Example: $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$.
Schur decomposition over $\mathbb{R}$: since $A$ is symmetric, the Schur form is diagonal. $A = Q \Lambda Q^T$ with $\Lambda = \operatorname{diag}(3, 1)$ and $Q = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$.
Compare with a non-symmetric matrix $\begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}$: the Schur form is the matrix itself (still upper triangular, not diagonal).
Verification examples
$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, eigenvalues $\lambda = 3, 1$.
$v_1 = \frac{1}{\sqrt{2}}(1, 1)^T$ for $\lambda = 3$, $v_2 = \frac{1}{\sqrt{2}}(1, -1)^T$ for $\lambda = 1$.
$Q = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, $Q^T Q = I$ ✓.
$Q^T A Q = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}$ ✓.
$A = I_2$: eigenvalue $\lambda = 1$ has multiplicity $2$, eigenspace $\mathbb{R}^2$. Any ONB of $\mathbb{R}^2$ works (e.g., $\{e_1, e_2\}$). $Q^T I_2 Q = I_2$, diagonal ✓.
A non-diagonal example with repeated eigenvalue: $A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}$. Eigenvalues: $4$ (once) and $1$ (twice). Eigenvector for $\lambda = 4$: $\frac{1}{\sqrt{3}}(1, 1, 1)^T$. Eigenspace for $\lambda = 1$: $\{x : x_1 + x_2 + x_3 = 0\}$, a $2$-dimensional space. Choose ONB: $\frac{1}{\sqrt{2}}(1, -1, 0)^T$ and $\frac{1}{\sqrt{6}}(1, 1, -2)^T$.
$A = \begin{pmatrix} -2 & 1 \\ 1 & -2 \end{pmatrix}$. Eigenvalues: $\det(A - \lambda I) = (\lambda + 2)^2 - 1 = 0$. So $\lambda_1 = -1$ and $\lambda_2 = -3$. Both negative (negative definite ✓).
Eigenvectors: $\frac{1}{\sqrt{2}}(1, 1)^T$ for $\lambda = -1$ and $\frac{1}{\sqrt{2}}(1, -1)^T$ for $\lambda = -3$.
$Q = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, $Q^T A Q = \begin{pmatrix} -1 & 0 \\ 0 & -3 \end{pmatrix}$ ✓.
$P = v v^T$ where $\|v\| = 1$. Then $P$ has eigenvalues: $1$ (with multiplicity $1$, eigenspace $\operatorname{span}\{v\}$) and $0$ (with multiplicity $n - 1$, eigenspace $v^\perp$).
$Q^T P Q = \operatorname{diag}(1, 0, \dots, 0)$ in the basis $\{v, w_2, \dots, w_n\}$ (any ONB starting with $v$).
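A numpy check of a rank-one projection $P = v v^T$ with $\|v\| = 1$ (the particular unit vector is an illustrative choice):

```python
import numpy as np

# Projection P = v v^T with ||v|| = 1: eigenvalue 1 on span{v}, 0 on v-perp.
v = np.array([1.0, 2.0, 2.0]) / 3.0      # unit vector: (1 + 4 + 4) / 9 = 1
P = np.outer(v, v)
lams, Q = np.linalg.eigh(P)              # ascending eigenvalues
print(np.round(lams, 12))                # [0, 0, 1]
print(np.allclose(Q.T @ P @ Q, np.diag(lams)))   # True
```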
Why the theorem fails for non-symmetric matrices
$A = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}$: not symmetric. Eigenvalues $\lambda = 1, 2$, eigenvectors $(1, 0)^T$ and $(1, 1)^T$.
$(1, 0)^T \cdot (1, 1)^T = 1 \neq 0$. The eigenvectors are not orthogonal.
$A$ is diagonalizable (distinct eigenvalues), but not orthogonally diagonalizable. Orthogonal diagonalizability requires $A^T = A$, since $Q \Lambda Q^T$ is always symmetric.
$A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$: $A^T \neq A$, and $A$ is not even diagonalizable (single eigenvalue $\lambda = 1$ with $\dim \ker(A - I) = 1 < 2$). This cannot happen for symmetric matrices -- the spectral theorem guarantees full diagonalizability.
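Both failure modes are visible numerically; a sketch with two illustrative non-symmetric matrices:

```python
import numpy as np

# Failure mode 1: diagonalizable, but eigenvectors are not orthogonal.
A = np.array([[1.0, 1.0], [0.0, 2.0]])
w, V = np.linalg.eig(A)
dot = V[:, 0] @ V[:, 1]
print(sorted(w.real))        # distinct eigenvalues 1 and 2
print(dot)                   # nonzero: no orthogonal eigenbasis exists

# Failure mode 2: a Jordan block is not diagonalizable at all.
J = np.array([[1.0, 1.0], [0.0, 1.0]])
rank = np.linalg.matrix_rank(J - np.eye(2))
print(2 - rank)              # geometric multiplicity 1 < algebraic 2
```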
Summary
The Spectral Theorem admits two clean proofs:
Inductive proof: Peel off one eigenvector at a time. The key is that the orthogonal complement of an eigenspace is invariant under a symmetric operator, allowing the induction to proceed.
Schur decomposition proof: Start with the existence of the Schur form (upper triangular, unitarily similar to ). Symmetry forces the upper triangular matrix to be diagonal.
Both proofs use the same essential facts: real eigenvalues (Lemma 1) and orthogonality of eigenspaces (Lemma 2), both consequences of $A^T = A$. The result is the simplest form a matrix can take: a real diagonal matrix in an orthonormal eigenbasis.