
Proof: The Spectral Theorem

We prove that every real symmetric matrix can be orthogonally diagonalized, establishing the spectral theorem through induction on dimension.

The Spectral Theorem for Symmetric Matrices

Theorem: Let $A$ be a real $n \times n$ symmetric matrix. Then $A$ has an orthonormal basis of eigenvectors and can be written as $A = Q \Lambda Q^T$ where $Q$ is orthogonal.

Proof by induction on $n$:

Base case ($n = 1$): Any $1 \times 1$ matrix $A = [a]$ is already diagonal, with eigenvector $1$ and eigenvalue $a$.

Inductive step: Assume the theorem holds for $(n-1) \times (n-1)$ symmetric matrices, and let $A$ be an $n \times n$ symmetric matrix.

Step 1: Existence of a real eigenvalue and eigenvector.

Since $A$ is real and symmetric, the characteristic polynomial $p_A(\lambda) = \det(A - \lambda I)$ has real coefficients. Over $\mathbb{C}$, it has at least one root $\lambda_1$.

As shown earlier, every eigenvalue of a real symmetric matrix is real, so $\lambda_1 \in \mathbb{R}$. Let $\mathbf{v}_1$ be a corresponding eigenvector, normalized so that $\|\mathbf{v}_1\| = 1$.
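For completeness, the realness argument referenced here is short. For a (possibly complex) eigenpair $A\mathbf{v} = \lambda\mathbf{v}$ with $\mathbf{v} \neq \mathbf{0}$, compute $\overline{\mathbf{v}}^{\,T} A \mathbf{v}$ two ways, using $A = A^T$ with real entries:

```latex
\overline{\mathbf{v}}^{\,T} A \mathbf{v}
  = \overline{\mathbf{v}}^{\,T} (\lambda \mathbf{v})
  = \lambda \, \|\mathbf{v}\|^2,
\qquad
\overline{\mathbf{v}}^{\,T} A \mathbf{v}
  = \left( \overline{A \mathbf{v}} \right)^{T} \mathbf{v}
  = \overline{\lambda} \, \overline{\mathbf{v}}^{\,T} \mathbf{v}
  = \overline{\lambda} \, \|\mathbf{v}\|^2
```

Since $\|\mathbf{v}\|^2 \neq 0$, this forces $\lambda = \overline{\lambda}$, i.e. $\lambda \in \mathbb{R}$.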

Step 2: Extend to orthonormal basis.

Extend $\{\mathbf{v}_1\}$ to an orthonormal basis $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n\}$ of $\mathbb{R}^n$ using Gram-Schmidt.

Form the orthogonal matrix $Q_1 = [\mathbf{v}_1 \mid \mathbf{v}_2 \mid \cdots \mid \mathbf{v}_n]$.
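This extension step can be sketched numerically. The sketch below (NumPy; the vector $\mathbf{v}_1$ is an arbitrary illustrative unit vector) uses `np.linalg.qr`, which performs the same orthogonalization as Gram-Schmidt, on a matrix whose first column is $\mathbf{v}_1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# An arbitrary unit vector v1 to be extended to an orthonormal basis.
v1 = rng.standard_normal(n)
v1 /= np.linalg.norm(v1)

# Put v1 first, pad with random columns, then orthonormalize via QR
# (QR orthogonalizes the columns left to right, like Gram-Schmidt).
M = np.column_stack([v1, rng.standard_normal((n, n - 1))])
Q1, _ = np.linalg.qr(M)

# QR may flip the sign of the first column; fix it so column 0 equals v1.
if Q1[:, 0] @ v1 < 0:
    Q1[:, 0] = -Q1[:, 0]

print(np.allclose(Q1[:, 0], v1))          # first column is v1
print(np.allclose(Q1.T @ Q1, np.eye(n)))  # Q1 is orthogonal
```

Any padding columns that make $M$ invertible would do; random columns make that true with probability one.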

Step 3: Block structure in new basis.

Compute $Q_1^T A Q_1$. The first column of $A Q_1$ is $A \mathbf{v}_1 = \lambda_1 \mathbf{v}_1$.

Therefore, the first column of $Q_1^T A Q_1$ is:

$$Q_1^T (\lambda_1 \mathbf{v}_1) = \lambda_1 Q_1^T \mathbf{v}_1 = \lambda_1 \mathbf{e}_1 = \begin{pmatrix} \lambda_1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}$$

By symmetry of $Q_1^T A Q_1$ (since $(Q_1^T A Q_1)^T = Q_1^T A^T Q_1 = Q_1^T A Q_1$), the first row is also $[\lambda_1, 0, \ldots, 0]$.

Thus:

$$Q_1^T A Q_1 = \begin{bmatrix} \lambda_1 & \mathbf{0}^T \\ \mathbf{0} & B \end{bmatrix}$$

where $B$ is $(n-1) \times (n-1)$ and symmetric (it is the trailing principal submatrix of the symmetric matrix $Q_1^T A Q_1$).
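The block structure derived in Step 3 can be checked numerically. In this sketch (the matrix $A$ is a random symmetric example, and the eigenpair is taken from `np.linalg.eigh` rather than from the characteristic polynomial), $T = Q_1^T A Q_1$ should have first row and column $(\lambda_1, 0, \ldots, 0)$ and a symmetric trailing block $B$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# A random symmetric matrix.
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

# One eigenpair (lambda1, v1) of A.
eigvals, eigvecs = np.linalg.eigh(A)
lam1, v1 = eigvals[0], eigvecs[:, 0]

# Extend v1 to an orthonormal basis: columns of Q1, first column v1.
M = np.column_stack([v1, rng.standard_normal((n, n - 1))])
Q1, _ = np.linalg.qr(M)
if Q1[:, 0] @ v1 < 0:
    Q1[:, 0] = -Q1[:, 0]

T = Q1.T @ A @ Q1
B = T[1:, 1:]

print(np.allclose(T[0, 0], lam1))   # top-left entry is lambda1
print(np.allclose(T[0, 1:], 0))     # first row is zero past the corner
print(np.allclose(T[1:, 0], 0))     # first column is zero below the corner
print(np.allclose(B, B.T))          # trailing block is symmetric
```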

Step 4: Apply induction to BB.

By the inductive hypothesis, $B$ has an orthonormal basis of eigenvectors: there exists an orthogonal $(n-1) \times (n-1)$ matrix $R$ such that:

$$R^T B R = \Lambda_2 = \operatorname{diag}(\lambda_2, \ldots, \lambda_n)$$

Step 5: Combine the transformations.

Define the $n \times n$ orthogonal matrix:

$$Q_2 = \begin{bmatrix} 1 & \mathbf{0}^T \\ \mathbf{0} & R \end{bmatrix}$$

Then:

$$Q_2^T (Q_1^T A Q_1) Q_2 = \begin{bmatrix} 1 & \mathbf{0}^T \\ \mathbf{0} & R^T \end{bmatrix} \begin{bmatrix} \lambda_1 & \mathbf{0}^T \\ \mathbf{0} & B \end{bmatrix} \begin{bmatrix} 1 & \mathbf{0}^T \\ \mathbf{0} & R \end{bmatrix} = \begin{bmatrix} \lambda_1 & \mathbf{0}^T \\ \mathbf{0} & R^T B R \end{bmatrix} = \begin{bmatrix} \lambda_1 & \mathbf{0}^T \\ \mathbf{0} & \Lambda_2 \end{bmatrix} = \Lambda$$

Step 6: Construct the full orthogonal diagonalization.

Let $Q = Q_1 Q_2$. Then $Q$ is orthogonal (a product of orthogonal matrices is orthogonal), and:

$$Q^T A Q = (Q_1 Q_2)^T A (Q_1 Q_2) = Q_2^T Q_1^T A Q_1 Q_2 = \Lambda$$

Thus $A = Q \Lambda Q^T$.
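The induction is constructive, so Steps 1–6 translate directly into a recursion: peel off one eigenpair, recurse on the trailing block $B$, and assemble $Q = Q_1 Q_2$. A sketch under stated assumptions (`spectral_decompose` is a name chosen here, and `np.linalg.eigh` stands in for Step 1's existence argument; this is illustrative, not production numerics):

```python
import numpy as np

def spectral_decompose(A):
    """Return (Q, lams) with Q orthogonal and A = Q @ diag(lams) @ Q.T,
    following the proof's recursion on the trailing block B."""
    n = A.shape[0]
    if n == 1:                                  # base case: 1x1 is diagonal
        return np.eye(1), A[:, 0].copy()

    # Step 1: one real eigenpair (obtained from eigh for simplicity).
    w, V = np.linalg.eigh(A)
    lam1, v1 = w[0], V[:, 0]

    # Step 2: extend v1 to an orthonormal basis -> Q1 with first column v1.
    M = np.column_stack([v1, np.random.default_rng(0).standard_normal((n, n - 1))])
    Q1, _ = np.linalg.qr(M)
    if Q1[:, 0] @ v1 < 0:
        Q1[:, 0] = -Q1[:, 0]

    # Step 3: block structure; B is the symmetric trailing block.
    T = Q1.T @ A @ Q1
    B = (T[1:, 1:] + T[1:, 1:].T) / 2           # symmetrize against round-off

    # Steps 4-6: recurse on B, embed R into Q2, and combine.
    R, lams_rest = spectral_decompose(B)
    Q2 = np.eye(n)
    Q2[1:, 1:] = R
    Q = Q1 @ Q2
    return Q, np.concatenate([[lam1], lams_rest])

# Usage: verify A = Q @ diag(lams) @ Q.T on a random symmetric matrix.
rng = np.random.default_rng(42)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2
Q, lams = spectral_decompose(A)
print(np.allclose(Q @ np.diag(lams) @ Q.T, A))  # reconstruction holds
print(np.allclose(Q.T @ Q, np.eye(5)))          # Q is orthogonal
```

Each recursive call mirrors one pass of the inductive step, so the recursion depth is exactly $n - 1$, matching the induction on dimension.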

Step 7: Orthonormality of eigenvectors.

The columns of $Q$ are orthonormal (since $Q$ is orthogonal) and are eigenvectors of $A$ with eigenvalues $\lambda_1, \ldots, \lambda_n$ (the diagonal entries of $\Lambda$).

By induction, the spectral theorem holds for all $n$. ∎

Remark

This proof elegantly uses induction: we peel off one eigenvector at a time, reducing to a smaller problem. The key insight is that the orthogonal complement of an eigenvector is $A$-invariant for symmetric matrices, allowing us to apply induction to a submatrix. This recursive structure reveals why orthonormal eigenvector bases exist: each eigenvector can be chosen orthogonal to the previous ones.
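The invariance claim in the remark follows in one line: if $A = A^T$, $A\mathbf{v} = \lambda\mathbf{v}$, and $\mathbf{w} \perp \mathbf{v}$, then $\mathbf{v}^T A \mathbf{w} = (A\mathbf{v})^T \mathbf{w} = \lambda\, \mathbf{v}^T \mathbf{w} = 0$, so $A\mathbf{w} \perp \mathbf{v}$ as well. A small numerical sketch (random symmetric example):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2                     # symmetric

_, V = np.linalg.eigh(A)
v = V[:, 0]                           # a unit eigenvector of A

x = rng.standard_normal(5)
w = x - (x @ v) * v                   # project x onto the complement of v

print(np.isclose(w @ v, 0.0))         # w is orthogonal to v
print(np.isclose((A @ w) @ v, 0.0))   # so is A @ w: the complement is A-invariant
```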