
Let us recall one last time the structure of this book:

  1. Solve the matrix equation Ax = b.
  2. Solve the matrix equation Ax = λx, where λ is a number.
  3. Approximately solve the matrix equation Ax = b.

We have now come to the third part.

Primary Goal

Approximately solve the matrix equation Ax = b.

Finding approximate solutions of equations generally requires computing the closest vector on a subspace to a given vector. This becomes an orthogonality problem: one needs to know which vectors are perpendicular to the subspace.
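In the simplest case the subspace is a line, the span of a single vector a. As a minimal sketch in plain Python (the helper names `dot` and `project_onto_line` are illustrative, not from the text), the closest point to b on span{a} is the orthogonal projection (a·b / a·a)a, and the residual b minus the projection is perpendicular to a:

```python
def dot(u, v):
    # Standard dot product of two vectors given as lists.
    return sum(x * y for x, y in zip(u, v))

def project_onto_line(b, a):
    # Closest point to b on the line span{a}: scale a by (a.b)/(a.a).
    c = dot(a, b) / dot(a, a)
    return [c * x for x in a]

b = [2.0, 3.0]
a = [1.0, 1.0]
p = project_onto_line(b, a)                     # [2.5, 2.5]
residual = [x - y for x, y in zip(b, p)]
# The residual is orthogonal to a, confirming p is the closest point.
print(p, dot(residual, a))
```

Projection onto a general subspace, treated in Section 7.3, works the same way once an orthogonal basis for the subspace is available.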

[Interactive figure: the closest point on a subspace to a given vector x.]

First we will define orthogonality and learn to find orthogonal complements of subspaces in Section 7.1 and Section 7.2. The core of this chapter is Section 7.3, in which we discuss the orthogonal projection of a vector onto a subspace; this is a method of calculating the closest vector on a subspace to a given vector. These calculations become easier in the presence of an orthogonal set, as we will see in Section 7.4. In Section 7.5 we will present the least-squares method of approximately solving systems of equations, and we will give applications to data modeling.


In data modeling, one often asks: “What line is my data supposed to lie on?” This question can be answered by a simple application of the least-squares method.
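As a hedged sketch of the idea (the function name `best_fit_line` is illustrative): fitting a line y = mx + c to data points by least squares amounts to solving the 2×2 normal equations AᵀA[m, c]ᵀ = Aᵀy, where each row of A is [xᵢ, 1]. In plain Python, solving that small system by Cramer's rule:

```python
def best_fit_line(xs, ys):
    # Least-squares line y = m*x + c through the points (xs[i], ys[i]).
    # Builds and solves the normal equations
    #   [sxx sx] [m]   [sxy]
    #   [sx  n ] [c] = [sy ]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = sxx * n - sx * sx          # nonzero when the xs are not all equal
    m = (sxy * n - sx * sy) / det
    c = (sxx * sy - sx * sxy) / det
    return m, c

# Data that nearly lies on the line y = x:
m, c = best_fit_line([0, 1, 2, 3], [0.1, 0.9, 2.1, 2.9])
print(m, c)  # slope close to 1, intercept close to 0
```

The same normal-equations idea extends to best-fit parabolas, ellipses, and other models, as discussed in Section 7.5.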


Gauss invented the method of least squares to find a best-fit ellipse: he correctly predicted the (elliptical) orbit of the asteroid Ceres as it passed behind the sun in 1801.