Linear algebra is the mathematics of linear transformations and systems of linear equations. At its heart lies the matrix, a rectangular array of numbers that can represent a system of equations, a geometric transformation, a dataset, or a quantum state. Understanding matrix operations is essential for physics, computer science, data science, and virtually every engineering discipline.
Matrix addition and subtraction work element by element, requiring matrices of the same dimensions. Scalar multiplication scales every element. Matrix multiplication is more interesting: to multiply A times B, each element of the result is the dot product of a row from A with a column from B. This is not commutative (AB does not generally equal BA), which surprises many students. The dimensions must also align: an m x n matrix can multiply an n x p matrix to produce an m x p result.
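To make the row-times-column rule concrete, here is a minimal sketch in plain Python (the function name `mat_mul` is just an illustration, not part of any particular library):

```python
def mat_mul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B (lists of rows)."""
    m, n, p = len(A), len(B), len(B[0])
    # Each entry of the result is the dot product of a row of A
    # with a column of B.
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))  # [[2, 1], [4, 3]]
print(mat_mul(B, A))  # [[3, 4], [1, 2]] -- AB != BA
```

Running both orders on the same pair of matrices shows the non-commutativity directly: multiplying by B on the right swaps A's columns, while multiplying on the left swaps its rows.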
The determinant of a square matrix is a single number that encodes important information. For a 2x2 matrix with entries a, b, c, d, the determinant is ad - bc. For larger matrices, expansion by cofactors reduces the problem recursively. A zero determinant means the matrix is singular (non-invertible), which corresponds to a system of equations having either no solution or infinitely many solutions.
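The recursive cofactor expansion can be sketched in a few lines; this illustrative `det` function expands along the first row, with the 2x2 formula ad - bc as the base case:

```python
def det(M):
    """Determinant of a square matrix via cofactor expansion
    along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    if n == 2:
        return M[0][0] * M[1][1] - M[0][1] * M[1][0]  # ad - bc
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, then recurse.
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))  # -2
print(det([[1, 2], [2, 4]]))  # 0: second row is twice the first (singular)
```

The singular example makes the geometric point: when one row is a multiple of another, the determinant collapses to zero and the matrix cannot be inverted.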
The inverse of a matrix A, written A^(-1), satisfies AA^(-1) = A^(-1)A = I (the identity matrix). For a 2x2 matrix, there is a simple formula: swap the diagonal entries, negate the off-diagonal entries, and divide by the determinant. For larger matrices, methods like Gaussian elimination or the adjugate matrix are used. Our matrix calculator handles these computations for you.
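The 2x2 formula above translates almost word for word into code; this hypothetical `inverse_2x2` swaps the diagonal, negates the off-diagonal, and divides by ad - bc:

```python
def inverse_2x2(M):
    """Inverse of a 2x2 matrix [[a, b], [c, d]]:
    swap the diagonal, negate the off-diagonal, divide by det."""
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular, no inverse exists")
    return [[d / det, -b / det],
            [-c / det, a / det]]

A = [[4, 7], [2, 6]]  # det = 10
print(inverse_2x2(A))  # [[0.6, -0.7], [-0.2, 0.4]]
```

Multiplying A by this result by hand recovers the identity matrix, which is the defining check: AA^(-1) = I.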
Eigenvalues and eigenvectors are among the most profound concepts in linear algebra. An eigenvector of a matrix A is a nonzero vector v such that Av = lambda*v, where lambda is the eigenvalue. Geometrically, an eigenvector is a direction that the matrix stretches or compresses without rotating. Finding eigenvalues requires solving det(A - lambda*I) = 0, which gives the characteristic polynomial. Our eigenvalue calculator computes these for 2x2 and 3x3 matrices.
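For a 2x2 matrix, det(A - lambda*I) = 0 expands to the quadratic lambda^2 - tr(A)*lambda + det(A) = 0, so the eigenvalues come straight from the quadratic formula. A minimal sketch (handling only the real-eigenvalue case, for simplicity):

```python
import math

def eigenvalues_2x2(M):
    """Real eigenvalues of a 2x2 matrix from the characteristic
    polynomial: lambda^2 - tr(A)*lambda + det(A) = 0."""
    (a, b), (c, d) = M
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        raise ValueError("eigenvalues are complex")
    r = math.sqrt(disc)
    return sorted([(tr - r) / 2, (tr + r) / 2])

print(eigenvalues_2x2([[2, 1], [1, 2]]))  # [1.0, 3.0]
```

For this example matrix, the vector (1, 1) is an eigenvector for lambda = 3: the matrix maps it to (3, 3), a pure stretch with no rotation, exactly the geometric picture described above.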
The applications of linear algebra are staggering in their breadth. In computer graphics, every rotation, scaling, and translation is a matrix multiplication. In machine learning, the entire training process of linear regression and neural networks is expressed in matrix notation. In quantum mechanics, the state of a particle is a vector and observables are matrices. In Google’s PageRank algorithm, the ranking of web pages comes from finding the dominant eigenvector of a massive matrix.
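The dominant-eigenvector idea behind PageRank can be sketched with power iteration: repeatedly multiply a vector by the matrix and renormalize, and it converges toward the eigenvector of the largest-magnitude eigenvalue. This is a toy illustration, not Google's actual implementation:

```python
def dominant_eigenvector(A, iterations=100):
    """Power iteration: start from a uniform vector, repeatedly apply A
    and renormalize; the result approaches the dominant eigenvector."""
    n = len(A)
    v = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(abs(x) for x in w)
        v = [x / norm for x in w]
    return v

# For [[2, 1], [1, 2]] the dominant eigenvector is along (1, 1).
print(dominant_eigenvector([[2, 1], [1, 2]]))  # approximately [0.5, 0.5]
```

In PageRank the matrix encodes the web's link structure, and the entries of the converged vector are the page scores.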
The vector operations calculator handles dot products, cross products, magnitudes, and angles between vectors. These operations are building blocks for physics (force decomposition, torque calculation) and geometry (projection, distance from point to line).
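These four operations fit in a short, self-contained sketch using only the standard library (the function names here are illustrative):

```python
import math

def dot(u, v):
    """Dot product: sum of componentwise products."""
    return sum(a * b for a, b in zip(u, v))

def magnitude(v):
    """Euclidean length: sqrt of the dot product of v with itself."""
    return math.sqrt(dot(v, v))

def cross(u, v):
    """Cross product of two 3D vectors."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def angle(u, v):
    """Angle between vectors in radians, from cos(theta) = u.v / (|u||v|)."""
    return math.acos(dot(u, v) / (magnitude(u) * magnitude(v)))

print(cross([1, 0, 0], [0, 1, 0]))       # [0, 0, 1]
print(angle([1, 0], [0, 1]))             # pi/2: perpendicular vectors
```

The dot product drives projections and force decomposition (a zero dot product means perpendicular vectors), while the cross product yields a vector perpendicular to both inputs, which is exactly what torque calculations need.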
To build competence, practice multiplying matrices by hand first, then verify with our calculator. Work through finding determinants and inverses step by step. Once the mechanics feel natural, move on to eigenvalue problems and geometric interpretations.