Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors reveal the fundamental directions and scaling factors of linear transformations. These concepts are central to understanding dynamical systems, differential equations, and data analysis.
Let $T$ be a linear operator on a vector space $V$ (or let $A$ be an $n \times n$ matrix). A nonzero vector $\mathbf{v}$ is an eigenvector of $T$ (or $A$) if there exists a scalar $\lambda$ such that:
$$T(\mathbf{v}) = \lambda \mathbf{v} \qquad (\text{or } A\mathbf{v} = \lambda \mathbf{v})$$
The scalar $\lambda$ is called an eigenvalue of $T$ (or $A$), and $\mathbf{v}$ is an eigenvector corresponding to $\lambda$.
The set of all eigenvalues is the spectrum of $T$ (or $A$).
Geometrically, eigenvectors are directions that are preserved by the transformation—they're only scaled, not rotated. The eigenvalue tells us the scale factor.
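This "scaled, not rotated" picture can be checked numerically. Below is a minimal NumPy sketch; the symmetric matrix and the two test directions are my own illustrative choices, not an example from the text:

```python
import numpy as np

# Illustrative matrix of my own choosing (not the text's example).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v = np.array([1.0, 1.0])   # an invariant direction of A
w = np.array([1.0, 0.0])   # a generic direction, for comparison

# A v = [3, 3] = 3 v: the direction is preserved, scaled by the eigenvalue 3.
assert np.allclose(A @ v, 3.0 * v)

# A w = [2, 1], which is not a scalar multiple of w (the 2x2 determinant
# of the pair is nonzero), so w is not an eigenvector.
Aw = A @ w
assert abs(Aw[0] * w[1] - Aw[1] * w[0]) > 1e-9
```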
For a $2 \times 2$ matrix $A$, the computation runs as follows.
The characteristic equation is $\det(A - \lambda I) = 0$:
Eigenvalues: the roots $\lambda_1$ and $\lambda_2$ of this equation.
For $\lambda_1$: Solve $(A - \lambda_1 I)\mathbf{v} = \mathbf{0}$:
Eigenvector: any nonzero solution $\mathbf{v}_1$ (and any scalar multiple of it).
For $\lambda_2$: A similar calculation gives $\mathbf{v}_2$.
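The procedure above can be carried out numerically. Since the worked example's matrix entries did not survive in the text, the matrix below is a stand-in chosen for illustration only; `np.linalg.eig` plays the role of the hand calculation:

```python
import numpy as np

# Stand-in matrix, chosen for illustration (not the text's original example).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic equation: det(A - lam*I) = lam^2 - 7*lam + 10
#                                         = (lam - 5)(lam - 2) = 0,
# so the eigenvalues are 5 and 2.
eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are unit eigenvectors
assert np.allclose(np.sort(eigvals), [2.0, 5.0])

# Each eigenpair satisfies A v = lam v:
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```

Solving $(A - 5I)\mathbf{v} = \mathbf{0}$ by hand for this matrix gives $\mathbf{v}_1 = (1, 1)$, and $\lambda_2 = 2$ gives $\mathbf{v}_2 = (1, -2)$; the columns returned by `eig` are these directions, normalized to unit length.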
For an eigenvalue $\lambda$ of a linear operator $T$ (or matrix $A$), the eigenspace is:
$$E_\lambda = \{\mathbf{v} \in V : T(\mathbf{v}) = \lambda \mathbf{v}\}$$
For matrices:
$$E_\lambda = \{\mathbf{v} \in \mathbb{R}^n : A\mathbf{v} = \lambda \mathbf{v}\} = \ker(A - \lambda I)$$
The eigenspace $E_\lambda$ is a subspace containing all eigenvectors for $\lambda$ plus the zero vector. Its dimension is the geometric multiplicity of $\lambda$.
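Since $E_\lambda = \ker(A - \lambda I)$, the geometric multiplicity is the nullity of $A - \lambda I$, which rank–nullity gives as $n - \operatorname{rank}(A - \lambda I)$. A short NumPy sketch (both test matrices are my own illustrative choices):

```python
import numpy as np

def geometric_multiplicity(A, lam, tol=1e-9):
    """Dimension of the eigenspace ker(A - lam*I), via rank-nullity."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

# A defective matrix: lam = 2 is a double root of the characteristic
# polynomial, but its eigenspace is only one-dimensional.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
assert geometric_multiplicity(J, 2.0) == 1

# The scaled identity 2I: every nonzero vector is an eigenvector for
# lam = 2, so the eigenspace is all of R^2.
D = 2.0 * np.eye(2)
assert geometric_multiplicity(D, 2.0) == 2
```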
The characteristic polynomial of an $n \times n$ matrix $A$ is:
$$p_A(\lambda) = \det(A - \lambda I)$$
This is a polynomial of degree $n$ in $\lambda$. The eigenvalues of $A$ are precisely the roots of $p_A(\lambda) = 0$ (the characteristic equation).
Finding eigenvalues reduces to solving a polynomial equation, which for degree $n \geq 5$ has no general formula in radicals. However, numerical methods efficiently compute eigenvalues for large matrices. The fundamental theorem of algebra guarantees that an $n \times n$ matrix over $\mathbb{C}$ has exactly $n$ eigenvalues (counted with multiplicity).
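As a sketch of that remark, the two routes to the eigenvalues can be compared directly: root-finding on the characteristic polynomial versus a numerical eigensolver. The random $6 \times 6$ matrix is an assumption for illustration (degree 6, so no radical formula applies):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))   # degree-6 characteristic polynomial

coeffs = np.poly(A)               # characteristic-polynomial coefficients
lam_roots = np.roots(coeffs)      # root-finding on that polynomial
lam_eig = np.linalg.eig(A)[0]     # direct eigensolver (the practical route)

# Both give the same 6 eigenvalues (with multiplicity), possibly complex
# and in a different order, so match each root to its nearest eigenvalue.
assert len(lam_roots) == 6
for r in lam_roots:
    assert np.min(np.abs(lam_eig - r)) < 1e-6
```

In practice libraries never form the characteristic polynomial for large matrices; its roots are numerically sensitive to the coefficients, and iterative eigensolvers work on the matrix itself.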