Linear Algebra

Eigenvalues & Eigenvectors

Discovering the special directions that remain unchanged under linear transformations.

Concept Overview

An eigenvector of a linear transformation is a nonzero vector whose direction remains unchanged (or exactly reversed) when the transformation is applied. The transformation only scales the vector by a factor called the eigenvalue. These special directions reveal the fundamental geometry of linear transformations and appear throughout mathematics, physics, and data science — from quantum mechanics to Google's PageRank algorithm.

Mathematical Definition

Given a square matrix A, a nonzero vector v is an eigenvector with eigenvalue λ if:

Av = λv
Equivalently:
(A − λI)v = 0
This has nonzero solutions when:
det(A − λI) = 0 (characteristic equation)

For a 2×2 matrix, the characteristic equation is a quadratic:

A = [a b; c d]
λ² − (a+d)λ + (ad−bc) = 0
λ² − tr(A)·λ + det(A) = 0
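
As a quick numerical check, the quadratic above can be solved directly and compared against a general eigenvalue solver. A minimal NumPy sketch (the matrix here is an arbitrary illustrative choice, not one of the module's presets):

```python
import numpy as np

# Illustrative 2x2 matrix (any real 2x2 works here).
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

tr = np.trace(A)           # a + d
det = np.linalg.det(A)     # ad - bc

# Roots of lambda^2 - tr*lambda + det = 0.
roots = np.sort(np.roots([1.0, -tr, det]))

print(roots)                           # eigenvalues from the characteristic equation
print(np.sort(np.linalg.eigvals(A)))   # the same values from the general solver
```

Both approaches give the eigenvalues 2 and 3 for this matrix; for larger matrices the characteristic polynomial becomes numerically unreliable, which is why practical solvers avoid it.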

Key Concepts

Geometric Interpretation

In the interactive visualization, eigenvectors are the directions along which the transformation acts as pure scaling. The unit circle transforms into an ellipse, and the eigenvectors align with its principal axes (when eigenvalues are real). The eigenvalue tells you the scale factor:

  • λ > 1: The vector is stretched along that direction
  • 0 < λ < 1: The vector is compressed
  • λ < 0: The vector is reversed and scaled
  • λ = 0: The vector is collapsed to the origin (singular matrix)
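
The cases above can be verified directly from Av = λv. A small sketch using an illustrative diagonal matrix, chosen so the eigenvalues sit on the diagonal and two of the cases appear:

```python
import numpy as np

# Illustrative diagonal matrix: lambda = 2 stretches along x,
# lambda = -0.5 reverses and compresses along y.
A = np.array([[2.0, 0.0],
              [0.0, -0.5]])

ex = np.array([1.0, 0.0])   # eigenvector for lambda = 2
ey = np.array([0.0, 1.0])   # eigenvector for lambda = -0.5

# A v = lambda v in each eigen-direction: pure scaling, no change of direction.
assert np.allclose(A @ ex, 2.0 * ex)
assert np.allclose(A @ ey, -0.5 * ey)
```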

Real vs. Complex Eigenvalues

The discriminant Δ = tr(A)² − 4·det(A) of the characteristic equation determines the nature of the eigenvalues:

  • Δ > 0 (two distinct real): Two independent eigenvector directions exist. The transformation stretches or compresses along these directions.
  • Δ = 0 (repeated real): May have one or two independent eigenvectors. Geometrically, the transformation has a single preferred direction.
  • Δ < 0 (complex conjugate pair): No real eigenvectors exist — the transformation involves rotation. Try the "Rotate 90°" preset to see this.
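
These three cases can be told apart from tr(A) and det(A) alone, without solving for the eigenvalues. A sketch with illustrative example matrices, including a shear and the 90° rotation mentioned above:

```python
import numpy as np

def classify_eigenvalues(A):
    """Classify the eigenvalues of a 2x2 matrix via the discriminant
    of lambda^2 - tr(A)*lambda + det(A) = 0."""
    tr, det = np.trace(A), np.linalg.det(A)
    disc = tr**2 - 4 * det
    if np.isclose(disc, 0):
        return "repeated real"
    elif disc > 0:
        return "two distinct real"
    return "complex conjugate pair"

print(classify_eigenvalues(np.array([[2, 0], [0, 3]])))   # two distinct real
print(classify_eigenvalues(np.array([[1, 1], [0, 1]])))   # repeated real (a shear)
print(classify_eigenvalues(np.array([[0, -1], [1, 0]])))  # complex conjugate pair (rotation)
```

The `np.isclose` check comes first so that floating-point noise around Δ = 0 does not misclassify a repeated eigenvalue as two distinct ones.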

The Spectral Theorem

For symmetric matrices (A = Aᵀ), eigenvalues are always real and eigenvectors are orthogonal. This is the spectral theorem — one of the most important results in linear algebra. Try the "Symmetric" preset to see orthogonal eigenvectors. This property is the foundation of Principal Component Analysis (PCA) and many decomposition methods.
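
Both guarantees can be seen numerically with an arbitrary symmetric example matrix; `np.linalg.eigh` is NumPy's solver specialized for symmetric matrices, and it returns real eigenvalues with orthonormal eigenvectors:

```python
import numpy as np

# Illustrative symmetric matrix (A equals its own transpose).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eigh(A)   # specialized symmetric/Hermitian solver

print(vals)   # real eigenvalues, in ascending order: [1. 3.]
# Columns of vecs are orthonormal, so vecs.T @ vecs is the identity.
assert np.allclose(vecs.T @ vecs, np.eye(2))
# And each column satisfies A v_i = lambda_i v_i.
assert np.allclose(A @ vecs, vecs * vals)
```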

Eigendecomposition

If a matrix has n linearly independent eigenvectors, it can be decomposed as:

A = PΛP⁻¹
where:
P = matrix of eigenvectors as columns
Λ = diagonal matrix of eigenvalues

This decomposition makes computing matrix powers trivial: Aⁿ = PΛⁿP⁻¹, since raising a diagonal matrix to a power just raises each diagonal entry. This is used extensively in solving systems of differential equations and analyzing dynamical systems.
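
A quick numerical check of the power formula, using an arbitrary diagonalizable example matrix (eigenvalues 2 and 5):

```python
import numpy as np

# Illustrative diagonalizable matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

vals, P = np.linalg.eig(A)   # columns of P are eigenvectors
n = 5

# A^n = P Lambda^n P^{-1}: only the diagonal entries are raised to n.
A_pow = P @ np.diag(vals**n) @ np.linalg.inv(P)

# Matches direct repeated multiplication.
assert np.allclose(A_pow, np.linalg.matrix_power(A, n))
```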

Historical Context

The concept of eigenvalues emerged in the 18th century through the work of Euler (1748) on rotational motion, but the modern formulation crystallized with Cauchy (1829) and Sylvester, who coined the term "matrix" in 1850. The word "eigen" comes from German, meaning "own" or "characteristic" — eigenvectors are the matrix's own special directions.

David Hilbert's work on infinite-dimensional eigenvalue problems in the early 1900s laid the mathematical foundation for quantum mechanics, where observables are operators and measurement outcomes are their eigenvalues. Today, eigenvalue computation is one of the most important problems in numerical linear algebra.

Real-world Applications

  • Principal Component Analysis (PCA): Eigenvectors of the covariance matrix identify the directions of maximum variance in data, enabling dimensionality reduction.
  • Google PageRank: Web page importance is the dominant eigenvector of the link matrix — the eigenvector corresponding to eigenvalue 1.
  • Quantum mechanics: Observable quantities correspond to eigenvalues of Hermitian operators. Measurement collapses a quantum state onto an eigenstate.
  • Structural engineering: Natural vibration frequencies of structures are eigenvalues of the stiffness-mass matrix system. Resonance occurs when forcing frequency matches an eigenvalue.
  • Stability analysis: The eigenvalues of the Jacobian matrix determine whether equilibrium points of dynamical systems are stable, unstable, or oscillatory.
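
As one concrete illustration, the PageRank item above reduces to power iteration: repeatedly multiplying by the link matrix converges to its dominant eigenvector. The sketch below uses a hypothetical three-page web and the standard damping idea, not Google's actual implementation:

```python
import numpy as np

def pagerank(M, d=0.85, iters=100):
    """Power iteration for the dominant eigenvector of the damped link
    matrix -- a sketch of the PageRank idea, not a production algorithm."""
    n = M.shape[0]
    v = np.ones(n) / n
    for _ in range(iters):
        v = d * (M @ v) + (1 - d) / n   # damping keeps the iteration convergent
    return v

# Hypothetical 3-page web: pages 1 and 2 each link only to page 0;
# page 0 splits its links between pages 1 and 2.
# Column j is where page j's links point; each column sums to 1.
M = np.array([[0.0, 1.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 0.0, 0.0]])

rank = pagerank(M)
print(rank)   # page 0, which receives the most links, gets the highest rank
```

Without the damping term this particular graph would cycle with period 2 rather than converge; damping shrinks every non-dominant eigenvalue, guaranteeing a unique stationary ranking.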

Related Concepts

  • Linear Transformations — eigenvectors reveal the fundamental structure of any linear transformation; they are the directions where the transformation acts most simply
  • Harmonic Oscillation — normal modes of coupled oscillators are eigenvectors of the system matrix, with eigenvalues giving squared natural frequencies
  • K-Means Clustering — PCA (an eigenvalue method) is often used to reduce dimensionality before clustering
  • Gradient Descent — the eigenvalues of the Hessian matrix determine the curvature of the loss landscape and the optimal learning rate

Experience it interactively

Adjust parameters, observe in real time, and build deep intuition with Riano’s interactive Eigenvalues & Eigenvectors module.

Try Eigenvalues & Eigenvectors on Riano →