
Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors reveal the fundamental directions and scaling factors of linear transformations. These concepts are central to understanding dynamical systems, differential equations, and data analysis.

Definition: Eigenvalue and Eigenvector

Let $T: V \to V$ be a linear operator on a vector space $V$ (or let $A$ be an $n \times n$ matrix). A nonzero vector $\mathbf{v} \in V$ is an eigenvector of $T$ (or $A$) if there exists a scalar $\lambda$ such that:

$$T(\mathbf{v}) = \lambda\mathbf{v} \quad \text{or} \quad A\mathbf{v} = \lambda\mathbf{v}$$

The scalar $\lambda$ is called an eigenvalue of $T$ (or $A$), and $\mathbf{v}$ is an eigenvector corresponding to $\lambda$.

The set of all eigenvalues is the spectrum of $T$ (or $A$).

Geometrically, eigenvectors are directions that are preserved by the transformation—they're only scaled, not rotated. The eigenvalue tells us the scale factor.

Example: Computing Eigenvectors

For $A = \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix}$, find eigenvalues and eigenvectors.

The characteristic equation is $\det(A - \lambda I) = 0$:

$$\det\begin{bmatrix} 3-\lambda & 1 \\ 0 & 2-\lambda \end{bmatrix} = (3-\lambda)(2-\lambda) = 0$$

Eigenvalues: $\lambda_1 = 3$, $\lambda_2 = 2$.

For $\lambda_1 = 3$: Solve $(A - 3I)\mathbf{v} = \mathbf{0}$:

$$\begin{bmatrix} 0 & 1 \\ 0 & -1 \end{bmatrix}\begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \mathbf{0} \implies v_2 = 0$$

Eigenvector: $\mathbf{v}_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ (and any scalar multiple).

For $\lambda_2 = 2$: Solving $(A - 2I)\mathbf{v} = \mathbf{0}$ gives the single equation $v_1 + v_2 = 0$, so $\mathbf{v}_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$.
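The example above can be checked numerically. A minimal sketch using NumPy's `np.linalg.eig`, which returns the eigenvalues and unit-norm eigenvectors (as columns):

```python
import numpy as np

# The matrix from the example above.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# eig returns a vector of eigenvalues and a matrix whose
# columns are the corresponding (unit-norm) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigenvalues))  # [2. 3.]
```

Note that `eig` normalizes eigenvectors to unit length, so they may differ from hand-computed ones by a scalar multiple, which is consistent with the definition.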

Definition: Eigenspace

For an eigenvalue $\lambda$ of a linear operator $T$ (or matrix $A$), the eigenspace $E_\lambda$ is:

$$E_\lambda = \ker(T - \lambda I) = \{\mathbf{v} \in V : T(\mathbf{v}) = \lambda\mathbf{v}\}$$

For matrices: $E_\lambda = \ker(A - \lambda I) = \{\mathbf{v} : A\mathbf{v} = \lambda\mathbf{v}\}$

The eigenspace is a subspace containing all eigenvectors for $\lambda$ plus the zero vector. Its dimension is the geometric multiplicity of $\lambda$.
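Since $E_\lambda = \ker(A - \lambda I)$, the rank-nullity theorem gives the geometric multiplicity as $n - \operatorname{rank}(A - \lambda I)$. A small sketch (the function name and tolerance are illustrative choices, not a standard API):

```python
import numpy as np

def geometric_multiplicity(A, lam, tol=1e-10):
    """dim ker(A - lam*I) = n - rank(A - lam*I), by rank-nullity."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

# The identity matrix: eigenvalue 1 has a 2-dimensional eigenspace.
print(geometric_multiplicity(np.eye(2), 1.0))  # 2

# The example matrix: each eigenvalue has a 1-dimensional eigenspace.
A = np.array([[3.0, 1.0], [0.0, 2.0]])
print(geometric_multiplicity(A, 3.0))  # 1
```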

Definition: Characteristic Polynomial

The characteristic polynomial of an $n \times n$ matrix $A$ is:

$$p_A(\lambda) = \det(A - \lambda I)$$

This is a polynomial of degree $n$ in $\lambda$. The eigenvalues of $A$ are precisely the roots of $p_A(\lambda) = 0$ (the characteristic equation).
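For the example matrix, the characteristic polynomial and its roots can be computed directly. One caveat: NumPy's `np.poly` uses the convention $\det(\lambda I - A)$, which differs from $\det(A - \lambda I)$ only by a factor of $(-1)^n$ and so has the same roots:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Coefficients of det(lambda*I - A), highest degree first:
# lambda^2 - 5*lambda + 6
coeffs = np.poly(A)

# The eigenvalues are the roots of the characteristic polynomial.
eigs = np.roots(coeffs)

print(coeffs, np.sort(eigs))  # [ 1. -5.  6.] [2. 3.]
```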

Remark

Finding eigenvalues reduces to solving a polynomial equation, which for $n \geq 5$ has no general solution formula in radicals (by the Abel–Ruffini theorem). However, numerical methods efficiently compute eigenvalues for large matrices. The fundamental theorem of algebra guarantees that an $n \times n$ matrix over $\mathbb{C}$ has exactly $n$ eigenvalues (counted with multiplicity).
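One of the simplest such numerical methods is power iteration: repeatedly applying $A$ to a random vector and renormalizing converges (when one eigenvalue strictly dominates in magnitude) to the dominant eigenvector, with the Rayleigh quotient $\mathbf{v}^\top A \mathbf{v}$ estimating the eigenvalue. A minimal sketch (the function name and iteration count are illustrative):

```python
import numpy as np

def power_iteration(A, num_iters=500, seed=0):
    """Estimate the dominant eigenvalue/eigenvector pair of A."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    for _ in range(num_iters):
        v = A @ v                  # apply the transformation
        v /= np.linalg.norm(v)     # renormalize to unit length
    lam = v @ A @ v                # Rayleigh quotient (v has unit norm)
    return lam, v

A = np.array([[3.0, 1.0], [0.0, 2.0]])
lam, v = power_iteration(A)
print(round(lam, 6))  # 3.0
```

For this matrix the method converges to $\lambda = 3$ because $|3| > |2|$; the convergence rate is governed by the ratio $|\lambda_2/\lambda_1|$. Production libraries use more robust algorithms (e.g. the QR algorithm), but the idea above underlies many of them.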