Linear Algebra

SVD Decomposition

Visualize Singular Value Decomposition as a sequence of rotation, scaling, and rotation transformations.

Singular Value Decomposition (SVD)

Concept Overview

The Singular Value Decomposition (SVD) is one of the most fundamental and widely used matrix factorizations in linear algebra. It states that any linear transformation can be decomposed into three distinct geometric steps: a rotation (or reflection), a coordinate-by-coordinate scaling, and a final rotation (or reflection). Unlike diagonalization, which requires a matrix to be square and not defective, the SVD exists for any rectangular matrix, making it incredibly robust.

Mathematical Definition

Any m×n matrix A can be factored as:

A = U Σ Vᵀ

Where:

  • U is an m×m orthogonal matrix (UᵀU = I)
  • Σ is an m×n diagonal matrix with non-negative real numbers on the diagonal
  • V is an n×n orthogonal matrix (VᵀV = I)
  • Vᵀ is the transpose of V

Geometrically, applying A to a vector x corresponds to applying these three operations in sequence (from right to left):

  1. Vᵀ rotates or reflects the input vector space.
  2. Σ stretches or compresses the rotated space along the coordinate axes. The stretching factors are the singular values.
  3. U rotates or reflects the scaled space into its final position.
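The factorization and its properties can be checked numerically. The sketch below uses NumPy with an arbitrary example matrix (the specific values are illustrative, not from the text): it computes U, Σ, and Vᵀ, rebuilds A from the three factors, and verifies that U and V are orthogonal.

```python
import numpy as np

# An arbitrary 3x2 example matrix (m = 3, n = 2).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# full_matrices=True returns U as m x m and Vt as n x n, matching the definition.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n "diagonal" matrix Sigma from the vector of singular values.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# A equals U @ Sigma @ Vt, up to floating-point round-off.
assert np.allclose(A, U @ Sigma @ Vt)

# U and V are orthogonal: U^T U = I and V^T V = I.
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(2))
```

Applying `A @ x` is then equivalent to `U @ (Sigma @ (Vt @ x))`, which is exactly the rotate–scale–rotate sequence described above.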

Key Concepts

Singular Values

The diagonal entries of Σ, denoted σᵢ, are called the singular values of A. By convention, they are listed in descending order: σ₁ ≥ σ₂ ≥ ... ≥ 0. The singular values represent the magnitudes of the principal semi-axes of the hyperellipsoid formed by transforming the unit hypersphere with A. They provide deep insight into the rank and condition number of the matrix. The number of non-zero singular values exactly equals the rank of A.
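The rank connection is easy to demonstrate. In the sketch below (the matrix is a made-up example whose third row is the sum of the first two), counting the singular values above a small tolerance recovers the rank, and the ratio of the largest to the smallest non-zero singular value gives the condition number on the matrix's row space:

```python
import numpy as np

# A rank-deficient 3x3 matrix: row 3 = row 1 + row 2, so rank(A) = 2.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

# compute_uv=False returns only the singular values, in descending order.
s = np.linalg.svd(A, compute_uv=False)

# Count singular values above a small tolerance: this equals rank(A).
tol = 1e-10
rank = int(np.sum(s > tol))
assert rank == np.linalg.matrix_rank(A) == 2

# Ratio of largest to smallest non-zero singular value.
cond = s[0] / s[rank - 1]
```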

Left and Right Singular Vectors

The columns of U are the left singular vectors, and the columns of V are the right singular vectors. They form orthonormal bases for the codomain and domain of the transformation, respectively.

  • The right singular vectors (columns of V) are eigenvectors of the symmetric matrix AᵀA.
  • The left singular vectors (columns of U) are eigenvectors of the symmetric matrix AAᵀ.
  • The non-zero singular values of A are the square roots of the non-zero eigenvalues of both AᵀA and AAᵀ.
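These relationships can be verified directly. In this sketch (again with an arbitrary illustrative matrix), the eigenvalues of AᵀA match the squared singular values, and AAᵀ has the same non-zero eigenvalues plus an extra zero, since A here has more rows than columns:

```python
import numpy as np

# An arbitrary 3x2 example matrix.
A = np.array([[2.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

s = np.linalg.svd(A, compute_uv=False)  # singular values, descending

# Eigenvalues of A^T A, sorted descending, equal the squared singular values.
eig_small = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(eig_small, s**2)

# A A^T shares those non-zero eigenvalues; the remaining one is zero
# because A is 3x2 and has at most 2 non-zero singular values.
eig_big = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]
assert np.allclose(eig_big[:2], s**2)
assert abs(eig_big[2]) < 1e-10
```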

Historical Context

The SVD has a rich history that developed separately in different fields. It was originally discovered by differential geometers Eugenio Beltrami and Camille Jordan in the 1870s while studying bilinear forms. Later, James Joseph Sylvester (1889), Erhard Schmidt (1907), and Hermann Weyl (1912) independently rediscovered and generalized it.

Despite its early theoretical discovery, the SVD did not become a widespread computational tool until the 1960s. Gene Golub and William Kahan developed the first stable algorithm to compute the SVD directly (without forming AᵀA, which squares the condition number and is therefore numerically unstable), revolutionizing numerical linear algebra and enabling its widespread application in the computer age.

Real-world Applications

  • Principal Component Analysis (PCA): SVD is the preferred numerical method for computing PCA. By centering the data (subtracting the column means) and computing its SVD, the right singular vectors give the principal components, and the squared singular values are proportional to the variance captured by each component.
  • Data Compression: The truncated SVD provides the best low-rank approximation of a matrix (Eckart-Young-Mirsky Theorem). This is used heavily in image compression and signal processing.
  • Recommender Systems: SVD is a fundamental technique in collaborative filtering (such as the algorithm that won the Netflix Prize), used to predict user ratings by uncovering latent factors in an incomplete matrix of user-item interactions.
  • Pseudoinverse and Least Squares: SVD provides a robust way to compute the Moore-Penrose pseudoinverse of any matrix, enabling the solution of least-squares optimization problems even for singular or ill-conditioned matrices.
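Two of these applications fit in a few lines of NumPy. The sketch below (using a random matrix purely for illustration) forms the best rank-k approximation by keeping only the k largest singular values, checks the Eckart-Young-Mirsky error bound, and builds the Moore-Penrose pseudoinverse by inverting the non-zero singular values:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))  # a random full-rank example matrix

# "Thin" SVD: U is 6x4, Vt is 4x4.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Truncated SVD: keep the k largest singular values and their vectors.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Eckart-Young-Mirsky: the spectral-norm error of the best rank-k
# approximation equals the (k+1)-th singular value.
err = np.linalg.norm(A - A_k, ord=2)
assert np.isclose(err, s[k])

# Moore-Penrose pseudoinverse via SVD: invert the non-zero singular values.
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T
assert np.allclose(A_pinv, np.linalg.pinv(A))
```

Solving a least-squares problem `min ||Ax - b||` then reduces to `x = A_pinv @ b`, which remains well-defined even when A is singular or ill-conditioned (small singular values are simply dropped rather than inverted).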

Related Concepts

  • Diagonalization — a similar but more restrictive factorization A = PDP⁻¹
  • Eigenvalues and Eigenvectors — fundamental properties intricately linked to singular values
  • Orthogonal Projections — concepts underlying the geometry of the orthogonal matrices U and V

Experience it interactively

Adjust parameters, observe in real time, and build deep intuition with Riano’s interactive SVD Decomposition module.

