NumPy Linear Algebra: Dot Products, Inverses, and Eigenvalues
Linear algebra is the math behind machine learning, computer graphics, physics simulations, and data science. Every time a game rotates a 3D object, a search engine ranks pages, or a neural network makes a prediction, matrix operations are doing the heavy lifting.
NumPy's linalg module gives you all the essential linear algebra tools: dot products, matrix multiplication, transposes, inverses, determinants, eigenvalues, and solvers for systems of equations. In this tutorial, you'll learn each one with practical examples.
What Is a Dot Product?
A dot product takes two arrays of the same length, multiplies corresponding elements, and sums the results. It's a single number that measures how "aligned" two vectors are. Think of it like a similarity score — two vectors pointing the same direction have a large positive dot product.
A real-world example: imagine you bought 3 apples at $1.50 each, 2 bananas at $0.75, and 5 oranges at $1.00. Your total cost is a dot product of quantities and prices: [3, 2, 5] . [1.50, 0.75, 1.00] = 4.50 + 1.50 + 5.00 = $11.00.
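Here's how that calculation might look in NumPy (the array names are just illustrative):

import numpy as np

# Quantities and prices from the grocery example above
quantities = np.array([3, 2, 5])
prices = np.array([1.50, 0.75, 1.00])

# Multiply corresponding elements, then sum
print(np.dot(quantities, prices))  # 11.0

# The @ operator does the same thing for 1D arrays
print(quantities @ prices)         # 11.0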
How Does Matrix Multiplication Work?
Matrix multiplication extends the dot product to 2D arrays. For two matrices A (shape m x n) and B (shape n x p), each element in the result is the dot product of a row from A and a column from B. The result has shape (m x p).
Note that the * operator multiplies corresponding elements; it does not perform matrix multiplication:

import numpy as np
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print(A * B)
# [[ 5 12]
#  [21 32]]

For the true matrix product, use the @ operator:

import numpy as np
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print(A @ B)
# [[19 22]
#  [43 50]]

What Does Transposing a Matrix Do?
Transposing flips a matrix over its diagonal — rows become columns and columns become rows. A (2, 3) matrix becomes (3, 2). Think of it like rotating a spreadsheet so rows and columns swap.
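A quick sketch with an example (2, 3) array, using the .T attribute:

import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6]])   # shape (2, 3)

print(M.T)
# [[1 4]
#  [2 5]
#  [3 6]]
print(M.T.shape)  # (3, 2)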
What Are the Inverse and Determinant?
The inverse of a square matrix A is another matrix A_inv such that A @ A_inv equals the identity matrix (1s on the diagonal, 0s elsewhere). It's like dividing by a matrix. Not all matrices have inverses — only those with a non-zero determinant.
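A minimal sketch using np.linalg.inv() and np.linalg.det() (the example matrix is arbitrary):

import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])

print(np.linalg.det(A))   # 5.0 -> non-zero, so A has an inverse

A_inv = np.linalg.inv(A)

# A @ A_inv should be the identity matrix, up to floating-point rounding
print(np.allclose(A @ A_inv, np.eye(2)))   # True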
How Do You Solve a System of Equations?
A system of linear equations like:
2x + y = 5
x + 3y = 10
can be written as the matrix equation A @ x = b, where A holds the coefficients, x is the vector of unknowns, and b is the right-hand side. NumPy solves it with np.linalg.solve(A, b).
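For the system above, the setup might look like this:

import numpy as np

# 2x + y = 5
# x + 3y = 10
A = np.array([[2, 1],
              [1, 3]])
b = np.array([5, 10])

solution = np.linalg.solve(A, b)
print(solution)  # [1. 3.]  -> x = 1, y = 3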
Here's a practical example: a coffee shop sells lattes for $4 and muffins for $3. On Monday they made $47 from 14 items. How many lattes and muffins did they sell?
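One way to set it up (the variable names are just for illustration):

import numpy as np

# 4 * lattes + 3 * muffins = 47   (revenue in dollars)
# 1 * lattes + 1 * muffins = 14   (items sold)
A = np.array([[4, 3],
              [1, 1]])
b = np.array([47, 14])

lattes, muffins = np.linalg.solve(A, b)
print(lattes)   # 5.0
print(muffins)  # 9.0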
What Are Eigenvalues and Eigenvectors?
An eigenvector of a matrix A is a special vector that, when multiplied by A, only gets scaled (stretched or shrunk), not rotated. The scale factor is the eigenvalue. In math: A @ v = lambda * v, where v is the eigenvector and lambda is the eigenvalue.
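A short sketch using np.linalg.eig() (the example matrix is arbitrary):

import numpy as np

A = np.array([[4., 2.],
              [1., 3.]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # [5. 2.]  (order may vary)

# Each column of eigenvectors is one eigenvector.
# Check that A @ v equals lambda * v for the first pair:
v = eigenvectors[:, 0]
print(np.allclose(A @ v, eigenvalues[0] * v))   # True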
Eigenvalues power many algorithms: Principal Component Analysis (PCA) for dimensionality reduction, Google's PageRank algorithm, vibration analysis in engineering, and quantum mechanics. They reveal the "natural modes" of a system.
Practice Exercises
A customer buys quantities [4, 2, 6, 1] of items priced at [2.50, 1.75, 0.99, 12.00]. Use the @ operator to compute the total cost and print it.
Compute the matrix product of A and B, then print the result and its shape.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])

A store sells pens and notebooks. The prices are unknown.
Set up the coefficient matrix and the right-hand side vector, then use np.linalg.solve() to find the price of each item. Print the price of a pen and then the price of a notebook, each on a separate line, rounded to 2 decimal places.
What does the following code print? Remember the determinant formula for a 2x2 matrix: ad - bc.
import numpy as np
M = np.array([[3, 8],
              [4, 6]])
print(np.linalg.det(M))

Find and print the eigenvalues of the following matrix, sorted from largest to smallest:
A = np.array([[5, 1],
              [1, 5]])

Use np.sort() with [::-1] to reverse the order.