Orthogonality
Orthogonality is the generalization of perpendicularity to abstract inner product spaces. Two vectors are orthogonal when their inner product vanishes. This simple condition has profound consequences: orthogonal sets are always linearly independent, orthogonal decompositions split spaces into complementary pieces, and orthogonal complements provide canonical complements to subspaces.
Definition
Two vectors $u, v$ in an inner product space $V$ are orthogonal (written $u \perp v$) if:
$$\langle u, v \rangle = 0.$$
A set of vectors $\{v_1, \dots, v_k\}$ is an orthogonal set if $\langle v_i, v_j \rangle = 0$ for all $i \neq j$.
In $\mathbb{R}^n$ with the dot product, the standard basis is orthogonal: $e_i \cdot e_j = \delta_{ij}$ (the Kronecker delta).
In $\mathbb{R}^3$: $e_1 \cdot e_2 = 0$, $e_1 \cdot e_3 = 0$, $e_2 \cdot e_3 = 0$.
In $\mathbb{R}^2$: $u = (1, 2)$ and $v = (2, -1)$.
$u \cdot v = 1 \cdot 2 + 2 \cdot (-1) = 0$, so $u \perp v$.
But $u$ and $v$ are not standard basis vectors -- orthogonality just means their dot product vanishes.
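As a quick numerical sanity check, the orthogonality condition is a one-line computation. This sketch uses the dot product on $\mathbb{R}^n$; the helper names and the test vectors are illustrative choices, not from the text.

```python
# Check orthogonality of two vectors via the dot product.
# The vectors u = (1, 2) and v = (2, -1) are illustrative choices.

def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal(u, v, tol=1e-12):
    """Two vectors are orthogonal when their inner product vanishes."""
    return abs(dot(u, v)) <= tol

u = (1, 2)
v = (2, -1)
print(dot(u, v))            # 0
print(is_orthogonal(u, v))  # True
```

The tolerance parameter matters only for floating-point inputs; with integer coordinates the dot product is exact.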
On $C[-\pi, \pi]$ with $\langle f, g \rangle = \int_{-\pi}^{\pi} f(x) g(x)\, dx$:
$\langle \sin(mx), \sin(nx) \rangle = 0$ for $m \neq n$ (by orthogonality of Fourier modes).
$\langle \sin(mx), \cos(nx) \rangle = 0$ for all $m, n$ (sine is odd, cosine is even).
$\langle \cos(mx), \cos(nx) \rangle = 0$ for $m \neq n$.
This orthogonality is the foundation of Fourier analysis.
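The Fourier orthogonality relations can be verified numerically by approximating the integral inner product. This is a sketch using the midpoint rule; the function `inner` and the step count are my own illustrative choices.

```python
import math

# Approximate <f, g> = integral of f(x) g(x) dx over [-pi, pi]
# with the composite midpoint rule (n is an arbitrary choice).

def inner(f, g, n=20000):
    a, b = -math.pi, math.pi
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

s1 = lambda x: math.sin(1 * x)
s2 = lambda x: math.sin(2 * x)
c3 = lambda x: math.cos(3 * x)

print(inner(s1, s2))  # ~0: sin(mx) and sin(nx) orthogonal for m != n
print(inner(s1, c3))  # ~0: sine and cosine always orthogonal
print(inner(s1, s1))  # ~pi: a function is not orthogonal to itself
```

The nonzero value $\langle \sin(mx), \sin(mx) \rangle = \pi$ is what makes normalization possible when building an orthonormal Fourier basis.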
Properties of orthogonal sets
If $\{v_1, \dots, v_k\}$ is an orthogonal set of nonzero vectors, then $v_1, \dots, v_k$ are linearly independent.
Suppose $c_1 v_1 + \cdots + c_k v_k = 0$. Take the inner product with $v_i$:
$$0 = \Big\langle \sum_{j=1}^{k} c_j v_j,\ v_i \Big\rangle = \sum_{j=1}^{k} c_j \langle v_j, v_i \rangle = c_i \|v_i\|^2.$$
Since $v_i \neq 0$, $\|v_i\|^2 > 0$, so $c_i = 0$. This holds for all $i$.
$v_1 = (1, 1, 0)$, $v_2 = (1, -1, 0)$, $v_3 = (0, 0, 1)$ in $\mathbb{R}^3$.
Check pairwise orthogonality: $v_1 \cdot v_2 = 1 - 1 = 0$, $v_1 \cdot v_3 = 0$, $v_2 \cdot v_3 = 0$ ✓.
These are automatically linearly independent and form a basis of $\mathbb{R}^3$.
If $u \perp v$, then $\|u + v\|^2 = \|u\|^2 + \|v\|^2$.
More generally, if $v_1, \dots, v_k$ are pairwise orthogonal:
$$\|v_1 + \cdots + v_k\|^2 = \|v_1\|^2 + \cdots + \|v_k\|^2.$$
$u = (3, 0)$ and $v = (0, 4)$: $\|u + v\|^2 = \|(3, 4)\|^2 = 25$, $\|u\|^2 + \|v\|^2 = 9 + 16 = 25$ ✓.
This is the classical Pythagorean theorem for right triangles.
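The Pythagorean identity is easy to confirm mechanically. A minimal sketch, using an illustrative orthogonal pair such as $(3,0)$ and $(0,4)$:

```python
# Verify ||u + v||^2 = ||u||^2 + ||v||^2 for an orthogonal pair.
# The vectors u = (3, 0), v = (0, 4) are illustrative choices.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm_sq(u):
    return dot(u, u)

u = (3, 0)
v = (0, 4)
w = tuple(a + b for a, b in zip(u, v))   # u + v = (3, 4)

assert dot(u, v) == 0                    # u is orthogonal to v
print(norm_sq(w), norm_sq(u) + norm_sq(v))  # 25 25
```

For non-orthogonal vectors the identity fails, because the cross term $2\langle u, v \rangle$ in the expansion of $\|u+v\|^2$ no longer vanishes.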
Orthogonal complement
For a subset $S \subseteq V$, the orthogonal complement of $S$ is:
$$S^\perp = \{\, v \in V : \langle v, s \rangle = 0 \text{ for all } s \in S \,\}.$$
$S^\perp$ is always a subspace of $V$, even if $S$ is not.
Let $W = \operatorname{span}\{(1, 0, 0)\}$ (the $x$-axis) in $\mathbb{R}^3$.
$W^\perp = \{(0, y, z) : y, z \in \mathbb{R}\}$ (the $yz$-plane).
$\dim W + \dim W^\perp = 1 + 2 = 3 = \dim \mathbb{R}^3$.
$W = \operatorname{span}\{(1, 1, 1)\}$ in $\mathbb{R}^3$.
$W^\perp = \operatorname{span}\{(1, -1, 0), (1, 0, -1)\}$.
Check: $(1, 1, 1) \cdot (1, -1, 0) = 0$ ✓ and $(1, 1, 1) \cdot (1, 0, -1) = 0$ ✓.
The orthogonal complement of the row space of $A$ is the null space of $A$:
$$(\operatorname{Row} A)^\perp = \operatorname{Null} A.$$
For $A = \begin{pmatrix} 1 & 2 & 3 \end{pmatrix}$: $\operatorname{Row} A = \operatorname{span}\{(1, 2, 3)\}$ is a $1$-dimensional subspace of $\mathbb{R}^3$.
$\operatorname{Null} A = \{(x, y, z) : x + 2y + 3z = 0\}$, which has dimension $2$. And $1 + 2 = 3$ ✓.
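The relation $(\operatorname{Row} A)^\perp = \operatorname{Null} A$ can be checked directly: every row must be orthogonal to every null-space vector. A sketch, assuming the illustrative $1 \times 3$ matrix $A = (1\ 2\ 3)$ and one particular choice of null-space basis:

```python
# The row (1, 2, 3) must be orthogonal to any basis of
# Null A = {(x, y, z) : x + 2y + 3z = 0}.
# The null-space basis below is one illustrative choice.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

row = (1, 2, 3)
null_basis = [(-2, 1, 0), (-3, 0, 1)]   # both satisfy x + 2y + 3z = 0

for n in null_basis:
    print(dot(row, n))   # 0
```

Since the dot product is bilinear, orthogonality to the two basis vectors extends to every vector of the null space.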
Fundamental properties of orthogonal complements
Let $V$ be a finite-dimensional inner product space and $W \subseteq V$ a subspace.
- $V = W \oplus W^\perp$ (orthogonal direct sum).
- $\dim W + \dim W^\perp = \dim V$.
- $(W^\perp)^\perp = W$.
- $W \cap W^\perp = \{0\}$.
- $\{0\}^\perp = V$ and $V^\perp = \{0\}$.
$W = \operatorname{span}\{(1, 0)\}$ in $\mathbb{R}^2$: $W^\perp = \operatorname{span}\{(0, 1)\}$, $(W^\perp)^\perp = \operatorname{span}\{(1, 0)\} = W$ ✓.
For $A \in \mathbb{R}^{m \times n}$ of rank $r$, the four fundamental subspaces are:
- $\operatorname{Col} A \subseteq \mathbb{R}^m$ (column space), dimension $r$.
- $\operatorname{Null} A^T \subseteq \mathbb{R}^m$ (left null space), dimension $m - r$.
- $\operatorname{Row} A \subseteq \mathbb{R}^n$ (row space), dimension $r$.
- $\operatorname{Null} A \subseteq \mathbb{R}^n$ (null space), dimension $n - r$.
The orthogonal complement relationships: $(\operatorname{Row} A)^\perp = \operatorname{Null} A$ and $(\operatorname{Col} A)^\perp = \operatorname{Null} A^T$.
For $A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$: $\operatorname{Row} A = \operatorname{span}\{(1, 2)\}$, $\operatorname{Null} A = \operatorname{span}\{(2, -1)\}$, $\operatorname{Col} A = \operatorname{span}\{(1, 2)\}$, and $\operatorname{Null} A^T = \operatorname{span}\{(2, -1)\}$.
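The four-subspace picture can be verified concretely on a small rank-1 matrix. A sketch assuming the illustrative symmetric matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$, for which $\operatorname{Col} A = \operatorname{Row} A$ and $\operatorname{Null} A^T = \operatorname{Null} A$:

```python
# Rank-1 illustrative matrix A = [[1, 2], [2, 4]]:
# Row A = span{(1, 2)}, Null A = span{(2, -1)}; A is symmetric,
# so the column-space picture coincides with the row-space picture.

def matvec(A, x):
    return tuple(sum(a * b for a, b in zip(row, x)) for row in A)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [(1, 2), (2, 4)]
row_vec = (1, 2)      # spans Row A
null_vec = (2, -1)    # spans Null A

print(matvec(A, null_vec))    # (0, 0): null_vec lies in Null A
print(dot(row_vec, null_vec)) # 0: Row A is orthogonal to Null A
```

For a non-symmetric matrix the same check would be repeated with $A^T$ to confirm $(\operatorname{Col} A)^\perp = \operatorname{Null} A^T$.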
Orthogonal decomposition
If $W$ is a subspace of a finite-dimensional inner product space $V$, then every vector $v \in V$ can be uniquely written as:
$$v = w + w', \qquad w \in W,\ w' \in W^\perp.$$
The component $w$ is the orthogonal projection of $v$ onto $W$, written $\operatorname{proj}_W(v)$.
$W = \operatorname{span}\{(1, 1)\}$ in $\mathbb{R}^2$, $v = (3, 1)$.
$\operatorname{proj}_W(v) = \dfrac{v \cdot (1, 1)}{(1, 1) \cdot (1, 1)}\,(1, 1) = \dfrac{4}{2}\,(1, 1) = (2, 2)$.
$v - \operatorname{proj}_W(v) = (3, 1) - (2, 2) = (1, -1)$.
Check: $(2, 2) \cdot (1, -1) = 0$ ✓ and $(2, 2) + (1, -1) = (3, 1) = v$ ✓.
$W = \{(x, y, 0)\}$ (the $xy$-plane) in $\mathbb{R}^3$, $v = (1, 2, 3)$.
$\operatorname{proj}_W(v) = (1, 2, 0)$, $v - \operatorname{proj}_W(v) = (0, 0, 3) \in W^\perp$ ✓.
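Projection onto a line is the simplest case of the decomposition, computed by the formula $\operatorname{proj}_u(v) = \frac{v \cdot u}{u \cdot u}\, u$. A minimal sketch; the function name and the test vectors are illustrative choices:

```python
# Orthogonal projection of v onto the line spanned by u:
# proj_u(v) = ((v . u) / (u . u)) u. Vectors are illustrative choices.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj_onto_line(v, u):
    c = dot(v, u) / dot(u, u)
    return tuple(c * a for a in u)

u = (1.0, 1.0)
v = (3.0, 1.0)
p = proj_onto_line(v, u)                 # projection component, in W
r = tuple(a - b for a, b in zip(v, p))   # residual component, in W-perp

print(p, r)        # (2.0, 2.0) (1.0, -1.0)
print(dot(p, r))   # 0.0: the two components are orthogonal
```

For a higher-dimensional $W$, one sums such terms over an orthogonal basis of $W$, which is exactly what the polynomial example below the plane example does.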
$V = P_2$ (polynomials of degree $\le 2$) with $\langle f, g \rangle = \int_{-1}^{1} f(x) g(x)\, dx$.
Let $W = \operatorname{span}\{1, x\}$ (an orthogonal basis, since $\langle 1, x \rangle = 0$). The projection of $f(x) = x^2$ onto $W$:
$$\operatorname{proj}_W(x^2) = \frac{\langle x^2, 1 \rangle}{\langle 1, 1 \rangle} \cdot 1 + \frac{\langle x^2, x \rangle}{\langle x, x \rangle} \cdot x = \frac{2/3}{2} + 0 = \frac{1}{3}.$$
The orthogonal component is $x^2 - \frac{1}{3}$, and $\langle x^2 - \frac{1}{3}, 1 \rangle = \langle x^2 - \frac{1}{3}, x \rangle = 0$ ✓.
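The orthogonality of the residual $x^2 - \frac{1}{3}$ to both $1$ and $x$ can be confirmed by numerical integration. A sketch using the midpoint rule on $[-1, 1]$; the helper name and step count are my own choices:

```python
# Check numerically that x^2 - 1/3 is orthogonal to 1 and to x on
# [-1, 1] under <f, g> = integral of f g dx (midpoint rule).

def inner(f, g, n=20000):
    a, b = -1.0, 1.0
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

residual = lambda x: x * x - 1.0 / 3.0

print(inner(residual, lambda x: 1.0))  # ~0
print(inner(residual, lambda x: x))    # ~0
```

Up to a constant factor, $x^2 - \frac{1}{3}$ is the degree-2 Legendre polynomial, which is why it is orthogonal to all lower-degree polynomials on $[-1, 1]$.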
Orthogonal matrices
A real $n \times n$ matrix $Q$ is orthogonal if $Q^T Q = I$, equivalently $Q^{-1} = Q^T$. The columns of $Q$ form an orthonormal basis of $\mathbb{R}^n$.
For complex matrices, the analogous notion is unitary: $U^* U = I$, where $U^* = \overline{U}^T$ is the conjugate transpose.
$Q = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$ (rotation by $\theta$). Then $Q^T Q = I$ ✓ (the columns $(\cos\theta, \sin\theta)$ and $(-\sin\theta, \cos\theta)$ are orthonormal).
Orthogonal matrices preserve inner products: $(Qu) \cdot (Qv) = u^T Q^T Q v = u \cdot v$.
Reflection across the line through the origin at angle $\theta/2$: $R = \begin{pmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{pmatrix}$. Then $R^T R = I$ and $\det R = -1$ (orthogonal but not a rotation).
Reflection across the line $y = x$: $R = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$, $R^T R = I$ ✓.
$Q = \begin{pmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}$. One can verify $Q^T Q = I$ and $\det Q = 1$, so $Q$ is a rotation in $\mathbb{R}^3$. The axis of rotation is the eigenvector for $\lambda = 1$: here $(0, 0, 1)$, the $z$-axis.
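Both defining properties, $Q^T Q = I$ and preservation of dot products, can be checked numerically. A sketch for a $2 \times 2$ rotation; the angle and test vectors are arbitrary illustrative choices:

```python
import math

# Check Q^T Q = I and (Qu).(Qv) = u.v for a rotation matrix.
# theta and the vectors u, v are arbitrary illustrative choices.

def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

theta = 0.7
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

QtQ = matmul(transpose(Q), Q)
print(QtQ)  # ~identity (entries match I up to float rounding)

u, v = [1.0, 2.0], [3.0, -1.0]
print(dot(matvec(Q, u), matvec(Q, v)), dot(u, v))  # equal: both ~1.0
```

The same check applied to the reflection matrices above would also pass, since reflections are orthogonal too; only the determinant distinguishes them from rotations.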
Summary
Orthogonality provides the geometric backbone of inner product spaces:
- Orthogonal sets are automatically linearly independent.
- Every finite-dimensional inner product space decomposes as $V = W \oplus W^\perp$ for any subspace $W$.
- The four fundamental subspaces of a matrix are related by orthogonal complement.
- Orthogonal (unitary) matrices preserve inner products and are the "symmetries" of the inner product.
- The Gram--Schmidt process converts any basis to an orthogonal one, and orthogonal projections solve least-squares problems.