Let $m = \dim W$, so $\dim W^\perp = n - m$ by this fact in Section 7.2. Let $v_1, v_2, \ldots, v_m$ be a basis for $W$ and let $v_{m+1}, v_{m+2}, \ldots, v_n$ be a basis for $W^\perp.$ We showed in the proof of this fact in Section 7.2 that $\{v_1, v_2, \ldots, v_n\}$ is linearly independent, so it forms a basis for $\mathbb{R}^n.$ Therefore, we can write
$$x = (c_1v_1 + \cdots + c_mv_m) + (c_{m+1}v_{m+1} + \cdots + c_nv_n) = x_W + x_{W^\perp},$$
where $x_W = c_1v_1 + \cdots + c_mv_m$ is in $W$ and $x_{W^\perp} = c_{m+1}v_{m+1} + \cdots + c_nv_n$ is in $W^\perp.$ Since $x_{W^\perp}$ is orthogonal to $W,$ the vector $x_W$ is the closest vector to $x$ on $W,$ so this proves that such a decomposition exists.
As for uniqueness, suppose that
$$x = x_W + x_{W^\perp} = y_W + y_{W^\perp}$$
for $x_W, y_W$ in $W$ and $x_{W^\perp}, y_{W^\perp}$ in $W^\perp.$ Rearranging gives
$$x_W - y_W = y_{W^\perp} - x_{W^\perp}.$$
Since $W$ and $W^\perp$ are subspaces, the left side of the equation is in $W$ and the right side is in $W^\perp.$ Therefore, $x_W - y_W$ is in $W$ and in $W^\perp,$ so it is orthogonal to itself, which implies $x_W - y_W = 0.$ Hence $x_W = y_W$ and $x_{W^\perp} = y_{W^\perp},$ which proves uniqueness.
Let $W$ be a subspace of $\mathbb{R}^n$ and let $x$ be a vector in $\mathbb{R}^n.$ The expression
$$x = x_W + x_{W^\perp}$$
for $x_W$ in $W$ and $x_{W^\perp}$ in $W^\perp$ is called the orthogonal decomposition of $x$ with respect to $W,$ and the closest vector $x_W$ is the orthogonal projection of $x$ onto $W.$
Since $x_W$ is the closest vector on $W$ to $x,$ the distance from $x$ to the subspace $W$ is the length of the vector from $x_W$ to $x,$ i.e., the length of $x_{W^\perp}.$ To restate:
Closest vector and distance
Let $W$ be a subspace of $\mathbb{R}^n$ and let $x$ be a vector in $\mathbb{R}^n.$
The orthogonal projection $x_W$ is the closest vector to $x$ in $W.$
The distance from $x$ to $W$ is $\|x_{W^\perp}\|.$
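The closest-vector property can be checked numerically. The sketch below uses NumPy's least-squares solver on a made-up subspace $W = \operatorname{Span}\{(1,0,1),\,(1,1,0)\}$ of $\mathbb{R}^3$ and a made-up vector $x$ (neither is from the text); least squares finds the coefficients of the closest vector in the column span.

```python
import numpy as np

# Illustrative (made-up) data: basis of W as the columns of A, and a vector x.
A = np.column_stack([[1.0, 0.0, 1.0], [1.0, 1.0, 0.0]])
x = np.array([1.0, 2.0, 3.0])

# Least squares finds the coefficients of the closest vector in Col(A) = W.
c, *_ = np.linalg.lstsq(A, x, rcond=None)
x_W = A @ c                     # orthogonal projection of x onto W
dist = np.linalg.norm(x - x_W)  # distance from x to W = length of x_{W^perp}

# No sampled point of W comes closer to x than x_W does.
rng = np.random.default_rng(0)
assert all(np.linalg.norm(x - A @ rng.standard_normal(2)) >= dist - 1e-9
           for _ in range(1000))
```

The random check is only a sanity test, of course; the theorem guarantees the inequality for every vector in $W.$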
When $W$ has dimension greater than one, computing the orthogonal projection $x_W$ of $x$ onto $W$ means solving the matrix equation $A^TAc = A^Tx,$ where $A$ is the matrix with columns $v_1, v_2, \ldots, v_m;$ then $x_W = Ac.$ In other words, we can compute the closest vector by solving a system of linear equations. To be explicit, we state the theorem as a recipe:
Recipe: Compute an orthogonal decomposition
Let $v_1, v_2, \ldots, v_m$ be a basis for a subspace $W$ of $\mathbb{R}^n,$ and let $A$ be the matrix with columns $v_1, v_2, \ldots, v_m.$ Here is a method to compute the orthogonal decomposition of a vector $x$ with respect to $W:$
1. Compute the matrix $A^TA$ and the vector $A^Tx.$
2. Form the augmented matrix for the matrix equation $A^TAc = A^Tx$ in the unknown vector $c,$ and row reduce.
3. This equation is always consistent; choose one solution $c.$ Then $x_W = Ac$ and $x_{W^\perp} = x - x_W.$
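The steps above can be sketched in NumPy. The basis vectors and $x$ below are made-up illustrations, not from the text; since $A^TA$ turns out to be invertible here, `np.linalg.solve` stands in for row reduction.

```python
import numpy as np

# Illustrative (made-up) data: W = Span{(1,0,1), (1,1,0)} in R^3, and x.
A = np.column_stack([[1.0, 0.0, 1.0], [1.0, 1.0, 0.0]])  # columns v1, v2
x = np.array([1.0, 2.0, 3.0])

# Step 1: compute A^T A and A^T x.
ATA = A.T @ A
ATx = A.T @ x

# Step 2: solve A^T A c = A^T x (solve replaces row reduction here).
c = np.linalg.solve(ATA, ATx)

# Step 3: the orthogonal decomposition x = x_W + x_{W^perp}.
x_W = A @ c
x_Wperp = x - x_W
```

As a check, $x_{W^\perp}$ is orthogonal to each column of $A,$ i.e., $A^Tx_{W^\perp} = 0.$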
In the context of the above recipe, if we start with a basis of $W,$ then it turns out that the square matrix $A^TA$ is automatically invertible! (It is always the case that $A^TA$ is square and the equation $A^TAc = A^Tx$ is consistent, but $A^TA$ need not be invertible in general.)
Let $W$ be a subspace of $\mathbb{R}^n,$ let $v_1, v_2, \ldots, v_m$ be a basis for $W,$ and let $A$ be the matrix with columns $v_1, v_2, \ldots, v_m.$
Then the matrix $A^TA$ is invertible, and for all vectors $x$ in $\mathbb{R}^n,$ we have
$$x_W = A(A^TA)^{-1}A^Tx.$$
We will show that $\operatorname{Nul}(A^TA) = \{0\},$ which implies invertibility by the invertible matrix theorem in Section 6.1. Suppose that $A^TAc = 0.$ Then $A^TAc = A^T0,$ so $0_W = Ac$ by the theorem. But $0_W = 0$ (the orthogonal decomposition of the zero vector is just $0 = 0 + 0$), so $Ac = 0,$ and therefore $c$ is in $\operatorname{Nul}(A).$ Since the columns of $A$ are linearly independent, we have $c = 0,$ so $\operatorname{Nul}(A^TA) = \{0\},$ as desired.
Let $x$ be a vector in $\mathbb{R}^n$ and let $c$ be a solution of $A^TAc = A^Tx.$ Then $c = (A^TA)^{-1}A^Tx,$ so $x_W = Ac = A(A^TA)^{-1}A^Tx.$
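The closed-form expression packages the whole projection into a single matrix $P = A(A^TA)^{-1}A^T.$ A short sketch, again on a made-up basis of $W:$

```python
import numpy as np

# Made-up basis of W as the columns of A (not from the text).
A = np.column_stack([[1.0, 0.0, 1.0], [1.0, 1.0, 0.0]])

# The matrix P sends each x to its orthogonal projection x_W = P x.
P = A @ np.linalg.inv(A.T @ A) @ A.T

x = np.array([1.0, 2.0, 3.0])
x_W = P @ x

# Projecting a second time changes nothing, since x_W already lies in W.
assert np.allclose(P @ P, P)
```

The idempotence check $P^2 = P$ reflects the fact that vectors already in $W$ are their own closest vectors.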
In this subsection, we change perspective and think of the orthogonal projection $x_W$ as a function of $x.$ This function turns out to be a linear transformation with many nice properties, and it is a good example of a linear transformation that is not originally defined as a matrix transformation.
We have to verify the defining properties of linearity in Section 4.3. Let $x$ and $y$ be vectors in $\mathbb{R}^n,$ and let $x = x_W + x_{W^\perp}$ and $y = y_W + y_{W^\perp}$ be their orthogonal decompositions. Since $W$ and $W^\perp$ are subspaces, the sums $x_W + y_W$ and $x_{W^\perp} + y_{W^\perp}$ are in $W$ and $W^\perp,$ respectively. Therefore, the orthogonal decomposition of $x + y$ is $(x_W + y_W) + (x_{W^\perp} + y_{W^\perp}),$ so
$$\operatorname{proj}_W(x + y) = (x + y)_W = x_W + y_W = \operatorname{proj}_W(x) + \operatorname{proj}_W(y).$$
Now let $c$ be a scalar. Then $cx_W$ is in $W$ and $cx_{W^\perp}$ is in $W^\perp,$ so the orthogonal decomposition of $cx$ is $cx_W + cx_{W^\perp},$ and therefore,
$$\operatorname{proj}_W(cx) = (cx)_W = cx_W = c\operatorname{proj}_W(x).$$
Since $\operatorname{proj}_W$ satisfies the two defining properties in Section 4.3, it is a linear transformation.
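Both defining properties can be spot-checked numerically. The sketch below uses the corollary's formula for the projection, with a made-up basis of $W$ and made-up test vectors:

```python
import numpy as np

# Made-up basis of W as the columns of A (illustrative only).
A = np.column_stack([[1.0, 0.0, 1.0], [1.0, 1.0, 0.0]])

def proj_W(v):
    # Orthogonal projection onto W = Col(A), via x_W = A (A^T A)^{-1} A^T x.
    return A @ np.linalg.solve(A.T @ A, A.T @ v)

x = np.array([1.0, 2.0, 3.0])
y = np.array([-1.0, 0.5, 2.0])

assert np.allclose(proj_W(x + y), proj_W(x) + proj_W(y))  # additivity
assert np.allclose(proj_W(3.5 * x), 3.5 * proj_W(x))      # homogeneity
```

A numerical check on two vectors is not a proof, but it illustrates exactly the two identities the proof establishes.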
Any vector $x$ in $W$ is in the range of $\operatorname{proj}_W,$ because $\operatorname{proj}_W(x) = x$ for such vectors. On the other hand, for any vector $x$ in $\mathbb{R}^n$ the output $\operatorname{proj}_W(x) = x_W$ is in $W,$ so $W$ is the range of $\operatorname{proj}_W.$
We compute the standard matrix of the orthogonal projection in the same way as for any other linear transformation: by evaluating $\operatorname{proj}_W$ on the standard coordinate vectors $e_1, e_2, \ldots, e_n.$ In this case, this means projecting the standard coordinate vectors onto the subspace.
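This column-by-column construction can be sketched directly, again with a made-up basis of $W$ in $\mathbb{R}^3:$

```python
import numpy as np

# Made-up basis of W as the columns of A (illustrative only).
A = np.column_stack([[1.0, 0.0, 1.0], [1.0, 1.0, 0.0]])

def proj_W(v):
    return A @ np.linalg.solve(A.T @ A, A.T @ v)

# Columns of the standard matrix B are proj_W(e1), proj_W(e2), proj_W(e3);
# the rows of np.eye(3) are the standard coordinate vectors.
B = np.column_stack([proj_W(e) for e in np.eye(3)])

# Sanity check against the closed-form expression from the corollary.
assert np.allclose(B, A @ np.linalg.inv(A.T @ A) @ A.T)
```

The assertion confirms that assembling the projected coordinate vectors column by column reproduces $A(A^TA)^{-1}A^T.$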
As we saw in this example, if you are willing to compute bases for $W$ and $W^\perp,$ then this provides a third way of finding the standard matrix $B$ for projection onto $W:$ indeed, if $\{v_1, v_2, \ldots, v_m\}$ is a basis for $W$ and $\{v_{m+1}, v_{m+2}, \ldots, v_n\}$ is a basis for $W^\perp,$ then
$$B = C\,\operatorname{diag}(\underbrace{1, \ldots, 1}_{m}, \underbrace{0, \ldots, 0}_{n-m})\,C^{-1}, \qquad C = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix},$$
where the middle matrix in the product is the diagonal matrix with $m$ ones and $n - m$ zeros on the diagonal. However, since you already have a basis for $W,$ it is faster to multiply out the expression $B = A(A^TA)^{-1}A^T$ as in the corollary.
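The diagonal form can be sketched on a made-up example: $W = \operatorname{Span}\{(1,0,1),\,(1,1,0)\}$ in $\mathbb{R}^3$ has $W^\perp = \operatorname{Span}\{(1,-1,-1)\}$ (each basis vector of $W$ dots to zero with $(1,-1,-1)$), so $m = 2$ and $n - m = 1:$

```python
import numpy as np

# Made-up example: columns of C are a basis of W followed by a basis of W^perp.
C = np.column_stack([[1.0, 0.0, 1.0], [1.0, 1.0, 0.0], [1.0, -1.0, -1.0]])
D = np.diag([1.0, 1.0, 0.0])  # m = 2 ones, then n - m = 1 zero

B = C @ D @ np.linalg.inv(C)  # standard matrix of projection onto W

# Agrees with B = A (A^T A)^{-1} A^T computed from the basis of W alone.
A = C[:, :2]
assert np.allclose(B, A @ np.linalg.inv(A.T @ A) @ A.T)
```

The middle matrix $D$ fixes the basis vectors of $W$ and kills the basis vector of $W^\perp,$ which is exactly what the orthogonal projection does.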