
Chapter 6: Orthogonality

Let us recall one last time the structure of this book:

  1. Solve the matrix equation Ax = b.
  2. Solve the matrix equation Ax = λx, where λ is a number.
  3. Approximately solve the matrix equation Ax = b.

We have now come to the third part.

Primary Goal

Approximately solve the matrix equation Ax = b.

Finding approximate solutions of equations generally requires computing the closest vector on a subspace to a given vector. This becomes an orthogonality problem: one needs to know which vectors are perpendicular to the subspace.

[Figure: the closest point on a subspace to a given vector x]

First we will define orthogonality and learn to find orthogonal complements of subspaces in Section 6.1 and Section 6.2. The core of this chapter is Section 6.3, in which we discuss the orthogonal projection of a vector onto a subspace; this is a method of calculating the closest vector on a subspace to a given vector. In Section 6.5 we will present the least-squares method of approximately solving systems of equations, and we will give applications to data modeling.
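To preview the two key formulas developed in those sections: the closest vector to a vector b on a subspace W is its orthogonal projection onto W, characterized by the fact that the difference between b and its projection is perpendicular to W; and the least-squares solutions x̂ of Ax = b are exactly the solutions of the (always consistent) normal equation A^T A x̂ = A^T b.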

Example

In data modeling, one often asks: “what line is my data supposed to lie on?” This can be solved using a simple application of the least-squares method.
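For instance, here is a minimal computational sketch (not from the text; it uses NumPy and made-up data points) of finding the best-fit line y = C + Dx through a handful of points by solving an inconsistent system in the least-squares sense:

    import numpy as np

    # Hypothetical data points; we seek the line y = C + D*x that best fits them.
    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([1.1, 1.9, 3.2, 3.8])

    # The system A @ [C, D] = y is inconsistent, so we solve it approximately:
    # np.linalg.lstsq returns the least-squares solution, i.e. the coefficients
    # of the closest vector to y in the column space of A.
    A = np.column_stack([np.ones_like(x), x])
    (C, D), *_ = np.linalg.lstsq(A, y, rcond=None)
    print(f"best-fit line: y = {C:.3f} + {D:.3f} x")

Section 6.5 explains why this computation amounts to solving the normal equation for this A and y.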

Example

Gauss invented the method of least squares to find a best-fit ellipse: he correctly predicted the (elliptical) orbit of the asteroid Ceres as it passed behind the sun in 1801.