Matrix Multiplication

Visualize how 2x2 matrices transform space and combine through multiplication.

Concept Overview

Matrix multiplication is a fundamental operation in linear algebra that corresponds to the composition of linear transformations. When we multiply two matrices, A and B, to get a new matrix C (C = AB), applying the transformation C to a vector is mathematically equivalent to first applying transformation B, and then applying transformation A to the result. This concept is crucial for understanding how complex transformations can be built from simpler ones in computer graphics, machine learning, and physics.
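The composition idea can be checked numerically. A minimal sketch (the specific matrices are illustrative, not from the text): B scales the x-axis by 2, A rotates 90° counterclockwise, and applying C = AB in one step matches applying B first, then A.

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotate 90° counterclockwise
B = np.array([[2.0, 0.0],
              [0.0, 1.0]])    # scale x by 2

C = A @ B                     # C = AB: "apply B first, then A"

v = np.array([1.0, 1.0])
# One application of C equals B followed by A.
print(C @ v, A @ (B @ v))    # both give [-1.  2.]
```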

Mathematical Definition

For a 2x2 matrix A and a 2x2 matrix B, their product C = AB is also a 2x2 matrix. The element in the i-th row and j-th column of C is the dot product of the i-th row of A and the j-th column of B.

A = [ a  b ]
    [ c  d ]

B = [ e  f ]
    [ g  h ]

C = A × B
  = [ a*e + b*g   a*f + b*h ]
    [ c*e + d*g   c*f + d*h ]

In index notation, for matrices of appropriate dimensions, an element of the product matrix C is defined as:

C_{i,j} = Σ_{k=1}^{n} A_{i,k} B_{k,j}
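The entry-by-entry formula above translates directly into code. A small sketch with no libraries, using the same a–h labels as the 2x2 layout:

```python
# 2x2 matrix product, entry by entry, following the formula above.
def matmul2(A, B):
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    return [[a*e + b*g, a*f + b*h],
            [c*e + d*g, c*f + d*h]]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul2(A, B))  # [[19, 22], [43, 50]]
```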

Key Concepts

  • Non-Commutativity

    In general, matrix multiplication is not commutative. This means that A × B ≠ B × A. Geometrically, applying transformation B then A often results in a different final state than applying A then B.
  • Associativity

    Matrix multiplication is associative: (A × B) × C = A × (B × C). This allows us to group transformations without changing the final result.
  • Identity Matrix

    The identity matrix I acts as a multiplicative identity, meaning A × I = I × A = A. It corresponds to a transformation that does nothing (leaves space unchanged).
  • Dimensions Rule

    To multiply matrix A by matrix B, the number of columns in A must equal the number of rows in B. If A is an m × n matrix and B is an n × p matrix, the product AB is an m × p matrix.
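The three algebraic properties above can all be verified on small examples. A sketch with arbitrary illustrative matrices:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
C = np.array([[2, 0], [0, 3]])
I = np.eye(2, dtype=int)

# Non-commutativity: AB and BA generally differ.
print(np.array_equal(A @ B, B @ A))               # False

# Associativity: grouping does not change the result.
print(np.array_equal((A @ B) @ C, A @ (B @ C)))   # True

# Identity: I leaves any matrix unchanged.
print(np.array_equal(A @ I, A), np.array_equal(I @ A, A))  # True True
```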

Historical Context

The formalization of matrix multiplication is largely credited to the French mathematician Jacques Binet, who described the composition rule in 1812, and the British mathematician Arthur Cayley, who developed matrix algebra in the mid-19th century. Cayley introduced matrix multiplication to represent the composition of linear substitutions, most fully in his 1858 Memoir on the Theory of Matrices. This algebraic shorthand drastically simplified the notation required for systems of linear equations and laid the groundwork for abstract algebra and modern linear algebra.

Real-world Applications

  • Computer Graphics: Used extensively to apply 2D and 3D transformations (rotation, scaling, and, via homogeneous coordinates, translation) to objects. Multiple transformations are combined into a single matrix via multiplication to optimize rendering pipelines.
  • Machine Learning: Neural networks rely heavily on matrix multiplication to compute the weighted sum of inputs across many layers efficiently, often accelerated by GPUs.
  • Quantum Mechanics: Quantum states are represented as vectors, and observable quantities and time evolution are represented as operators (matrices). The outcome of successive measurements involves matrix multiplication.
  • Cryptography: Algorithms like the Hill cipher use matrix multiplication for encoding and decoding messages.

Related Concepts

  • Determinant & Area: The determinant of a product matrix (det(AB)) equals the product of their determinants (det(A) * det(B)), representing how the area scaling factors combine.
  • Vector Spaces: Matrices act as linear maps between vector spaces.
  • Systems of Linear Equations: Represented compactly as Ax = b, where A is the coefficient matrix, x is the variable vector, and b is the constant vector.
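The determinant product rule mentioned above is easy to confirm numerically. A sketch with illustrative matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # det(A) = -2
B = np.array([[0.0, 1.0], [2.0, 5.0]])  # det(B) = -2

# det(AB) equals det(A) * det(B): the area scaling factors multiply.
print(np.linalg.det(A @ B))                    # 4.0 (up to floating point)
print(np.linalg.det(A) * np.linalg.det(B))     # 4.0
```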

Experience it interactively

Adjust parameters, observe in real time, and build deep intuition with Riano’s interactive Matrix Multiplication module.

Try Matrix Multiplication on Riano →