Tensor Operations
Visualize tensor contractions and transformations on multi-dimensional matrices.
Concept Overview
Tensors are multi-dimensional arrays that generalize scalars, vectors, and matrices. Operations on tensors form the mathematical backbone of modern physics, engineering, and deep learning. Tensor operations, such as tensor contraction and the tensor product, allow us to manipulate and combine these multi-dimensional objects algebraically, mapping multi-linear relationships between vector spaces.
Mathematical Definition
A tensor's rank (or order) is the number of indices needed to address one of its components. A scalar is a rank-0 tensor, a vector is rank-1, and a matrix is rank-2. A tensor of rank n in an m-dimensional space has m^n components.
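A quick sketch of the component count, using NumPy (sizes here are illustrative, not from the text):

```python
import numpy as np

# A rank-3 tensor in a 3-dimensional space has 3**3 = 27 components.
cube = np.zeros((3, 3, 3))
print(cube.ndim)   # rank (number of indices): 3
print(cube.size)   # total components: 27
```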
The tensor product (outer product) of two vectors u and v creates a rank-2 tensor (matrix) T whose elements are defined as:

T_ij = u_i v_j
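The outer product above can be sketched in NumPy; the vectors u and v are illustrative values, not from the text:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

# T_ij = u_i * v_j: each element of the rank-2 result pairs one
# component of u with one component of v.
T = np.outer(u, v)

print(T.shape)   # (3, 2): rank 1 + 1 = 2
print(T[2, 1])   # u[2] * v[1] = 3.0 * 5.0 = 15.0
```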
Tensor contraction is a generalization of the dot product and matrix multiplication: one or more pairs of repeated indices are summed over. For a rank-3 tensor A and a rank-2 tensor (matrix) B, contracting the second index of A against the first index of B produces a new rank-3 tensor C:

C_ikl = Σ_j A_ijk B_jl
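This contraction can be expressed directly with NumPy's einsum, which spells out the index pattern; the shapes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3, 4))   # indices i, j, k
B = rng.standard_normal((3, 5))      # indices j, l

# C_ikl = sum over j of A_ijk * B_jl.
# The repeated index j is summed; i, k, l survive.
C = np.einsum('ijk,jl->ikl', A, B)

print(C.shape)  # (2, 4, 5): rank 3 + 2 - 2 = 3
```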
Key Concepts
- Rank and Dimensions: The rank (order) specifies how many indices are needed to identify a single component. For example, a 3D image might be represented as a rank-3 tensor (height, width, color channels).
- Tensor Product: Combines two tensors of rank n and m into a new tensor of rank n + m. It is non-commutative but associative.
- Tensor Contraction: Reduces the rank of a tensor by 2 by summing over a pair of matching indices. The trace of a matrix (sum of diagonal elements) is a simple contraction of a rank-2 tensor to a scalar (rank-0).
- Slices and Projections: Higher-rank tensors can be visualized or operated on by taking lower-rank "slices" (e.g., treating a rank-3 tensor as an array of matrices).
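Two of the concepts above, the trace as a contraction and slicing a rank-3 tensor into matrices, can be sketched as follows (values are illustrative):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Trace: contract the two indices of M against each other (sum of M_ii),
# reducing a rank-2 tensor to a rank-0 scalar.
print(np.trace(M))             # 1 + 4 = 5.0
print(np.einsum('ii->', M))    # same contraction, spelled out as indices

# A rank-3 tensor viewed as a stack of rank-2 slices.
stack = np.arange(24.0).reshape(2, 3, 4)
for sheet in stack:            # each slice is a 3x4 matrix
    print(sheet.shape)
```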
Historical Context
The word "tensor" was introduced by Woldemar Voigt in 1898 to describe stresses and strains in crystals. The mathematics of tensor calculus was fully developed by Gregorio Ricci-Curbastro and his student Tullio Levi-Civita around 1890 under the title "absolute differential calculus." This framework became famous when Albert Einstein used it to formulate his General Theory of Relativity in 1915, relying heavily on tensor operations to describe the curvature of spacetime.
Real-world Applications
- Deep Learning: Neural networks process data in the form of multidimensional tensors. Operations like convolution and dense layer transformations are executed via massive parallel tensor contractions on GPUs and TPUs.
- Physics (Relativity & Fluid Dynamics): Tensors like the metric tensor, stress-energy tensor, and electromagnetic tensor are used to mathematically model forces, energy flow, and the geometry of spacetime.
- Computer Vision: Images and videos are stored and manipulated as rank-3 (RGB image) or rank-4 (video batch) tensors.
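The image and video ranks mentioned above can be made concrete with NumPy; the pixel dimensions here are illustrative:

```python
import numpy as np

image = np.zeros((480, 640, 3))       # rank-3: height, width, RGB channels
video = np.zeros((30, 480, 640, 3))   # rank-4: frames, height, width, channels

print(image.ndim, video.ndim)  # 3 4
```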
Related Concepts
- Matrix Multiplication: The most common form of a tensor contraction between two rank-2 tensors.
- Vector Spaces: The fundamental mathematical structures from which tensor spaces are built using tensor products.
- Dot Product & Projection: The simplest form of tensor contraction, reducing two rank-1 tensors (vectors) to a rank-0 tensor (scalar).
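The dot product as the simplest contraction can be sketched in a couple of lines (values are illustrative):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Sum over the single shared index i: two rank-1 tensors contract
# to a rank-0 scalar.
print(np.dot(u, v))              # 4 + 10 + 18 = 32.0
print(np.einsum('i,i->', u, v))  # the same contraction in index form
```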
Experience it interactively
Adjust parameters, observe in real time, and build deep intuition with Riano’s interactive Tensor Operations module.
Try Tensor Operations on Riano →