Lesson: Projections

Transcript — Introduction

In this lecture, we return to our purpose for looking at orthogonal complements: to define the projection of a vector v onto a finite-dimensional subspace W of an inner product space V.

Recall that we want to find the projection of v onto W and the perpendicular of v onto W such that v is equal to the projection of v onto W plus the perpendicular of v onto W, with the projection of v onto W being a vector in W and the perpendicular of v onto W being in W-perp. As we can expect, this will be easiest if we have an orthogonal basis.

Suppose that the dimension of V is n, {v1 to vk} is an orthogonal basis for W, and {v(k+1) to vn} is an orthogonal basis for the orthogonal complement of W. Then, by Theorem 9.4.2, we have that {v1 to vn} is an orthogonal basis for V. Hence, using our formula for coordinates with respect to an orthogonal basis, we get v = ((the inner product of v and v1) divided by (the length of v1)-squared)v1 + up to ((the inner product of v and vk) divided by (the length of vk)-squared)vk + ((the inner product of v and v(k+1)) divided by (the length of v(k+1))-squared)v(k+1) + up to ((the inner product of v and vn) divided by (the length of vn)-squared)vn.

Wait, we're done. The first k terms in this linear combination are in W, and the remaining n minus k terms are in W-perp. That is, the first k terms define the projection, and the rest define the perpendicular.

Definition: Suppose W is a k-dimensional subspace of an inner product space V, and {v1 to vk} is an orthogonal basis for W. Then for any vector v in V, we define the projection of v onto W to be ((the inner product of v and v1) divided by (the length of v1)-squared)v1 + up to ((the inner product of v and vk) divided by (the length of vk)-squared)vk, and we define the perpendicular of v onto W to be v minus the projection of v onto W.
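This definition translates directly into code. Here is a minimal Python sketch (not part of the lecture; the function names `proj` and `perp` are mine, and it assumes the standard dot product on R^n as the inner product — the same formula works for any inner product if you swap out the dot products):

```python
import numpy as np

def proj(v, basis):
    """Projection of v onto span(basis), assuming `basis` is an
    ORTHOGONAL set with respect to the standard dot product."""
    v = np.asarray(v, dtype=float)
    total = np.zeros_like(v)
    for w in basis:
        w = np.asarray(w, dtype=float)
        # coefficient <v, w> / ||w||^2, as in the definition
        total += (v @ w) / (w @ w) * w
    return total

def perp(v, basis):
    """Perpendicular of v onto span(basis): v minus the projection."""
    return np.asarray(v, dtype=float) - proj(v, basis)
```

Note that, just as in the definition, `perp` needs no basis for the orthogonal complement.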

Note that we have defined the perpendicular onto W differently than in our derivation. This is for two reasons—first, so that we do not need a basis for the orthogonal complement of W, and second, this way it works even if V is infinite-dimensional. However, we do need to verify that this definition is what we want. In particular, we need to verify that v is equal to the projection of v onto W plus the perpendicular of v onto W (which is trivial), and that the perpendicular is a vector in the orthogonal complement of W.

Theorem 9.4.3: Suppose W is a k-dimensional subspace of an inner product space V with orthogonal basis {v1 to vk}. Then for any vector v in V, the perpendicular of v onto W, which equals v minus the projection of v onto W, is in the orthogonal complement of W. Proof: Observe that if we substitute in the definition of the projection, we get that the perpendicular of v onto W is equal to v minus ((the inner product of v and v1) divided by (the length of v1)-squared)v1 minus up to minus ((the inner product of v and vk) divided by (the length of vk)-squared)vk. Hmm, do you recognize this? This is exactly the step of the Gram-Schmidt procedure that produces the next orthogonal vector from v1 to vk and v. That is, by the Gram-Schmidt Orthogonalization Theorem, the set {v1 to vk, the perpendicular of v onto W} is an orthogonal set. Therefore, the perpendicular of v onto W is orthogonal to each of v1 to vk, and so the perpendicular of v onto W is in the orthogonal complement of W by Theorem 9.4.1.
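We can also check Theorem 9.4.3 numerically. The sketch below is my own illustration (using the standard dot product on R^3 and an arbitrarily chosen vector v): it computes the perpendicular and confirms it is orthogonal to each basis vector of W, and hence is in W-perp by Theorem 9.4.1.

```python
import numpy as np

# An orthogonal basis for a subspace W of R^3 (illustrative choice)
v1 = np.array([1.0, 2.0, 1.0])
v2 = np.array([-1.0, 1.0, -1.0])
v = np.array([4.0, -2.0, 7.0])  # an arbitrary vector to decompose

# Projection of v onto W, per the definition
proj_w = (v @ v1) / (v1 @ v1) * v1 + (v @ v2) / (v2 @ v2) * v2
# Perpendicular of v onto W
perp_w = v - proj_w

# Theorem 9.4.3: perp_w is orthogonal to every basis vector of W
ortho_to_v1 = np.isclose(perp_w @ v1, 0.0)
ortho_to_v2 = np.isclose(perp_w @ v2, 0.0)
```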

An important note: The formula for the projection requires us to have an orthogonal or orthonormal basis for the subspace. For this reason, these are sometimes called orthogonal projections. When you are doing problems with projections, be careful to check that you have at least an orthogonal basis for the subspace you are projecting onto.

Before we look at some examples, I’ll state two more useful theorems. Theorem 9.4.4: If W is a k-dimensional subspace of an inner product space V, then the projection onto W is a linear operator on V with kernel the orthogonal complement of W.

Theorem 9.4.5: If W is a subspace of a finite-dimensional inner product space V, then for any vector v in V, we have the projection of v onto the orthogonal complement of W is equal to the perpendicular of v onto W.

Examples

Example: Let B = {[1; 2; 1], [-1; 1; -1]} be an orthogonal basis for a subspace W of R3, and let x = [2; 1; 3]. Determine the projection of x onto W and the perpendicular of x onto W. Solution: Let v1 = [1; 2; 1] and v2 = [-1; 1; -1]. Then, by definition, we have the projection of x onto W is equal to ((the inner product of x and v1) divided by (the length of v1)-squared)v1 + ((the inner product of x and v2) divided by (the length of v2)-squared)v2, which, using the standard inner product of R3, is equal to (7/6)[1; 2; 1] + (-4/3)[-1; 1; -1], which is equal to [5/2; 1; 5/2]. Then the perpendicular of x onto W is equal to x minus the projection of x onto W, which is [2; 1; 3] – [5/2; 1; 5/2], which is equal to [-1/2; 0; 1/2].
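As a sanity check, the arithmetic in this example can be verified with a few lines of Python (my own illustration; NumPy's `@` here is the standard inner product of R3):

```python
import numpy as np

v1 = np.array([1.0, 2.0, 1.0])
v2 = np.array([-1.0, 1.0, -1.0])
x = np.array([2.0, 1.0, 3.0])

c1 = (x @ v1) / (v1 @ v1)  # <x, v1> / ||v1||^2 = 7/6
c2 = (x @ v2) / (v2 @ v2)  # <x, v2> / ||v2||^2 = -4/3
proj_x = c1 * v1 + c2 * v2  # equals [5/2, 1, 5/2]
perp_x = x - proj_x         # equals [-1/2, 0, 1/2]
```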

Notice that if we have an orthonormal basis {v1 to vk} for W, then the formula for the projection simplifies to the projection of a vector x onto W is equal to (the inner product of x and v1)v1 + up to (the inner product of x and vk)vk, since the lengths of all of the basis vectors are 1.

Example: Let S be the subspace of M(2-by-2)(R) spanned by {[1/2, 1/2; 1/2, 1/2], [1/2, -1/2; 1/2, -1/2]}, and let A be the matrix [2, 5; -7, 3]. Determine the projection of A onto S and the perpendicular of A onto S. Solution: We can easily verify that an orthonormal basis for S is B = {B1, B2}, which is {[1/2, 1/2; 1/2, 1/2], [1/2, -1/2; 1/2, -1/2]}. Thus, we get the projection of A onto S is equal to (the inner product of A and B1)B1 + (the inner product of A and B2)B2, which we can evaluate to be [-5/2, 4; -5/2, 4], and the perpendicular of A onto S is equal to A minus the projection of A onto S, which is [9/2, 1; -9/2, -1].
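This computation can be verified the same way. The sketch below assumes the standard inner product on M(2-by-2)(R), namely the sum of the entrywise products (the lecture does not restate which inner product is in use, so that is my assumption). Since B is orthonormal, the simplified formula applies:

```python
import numpy as np

# Assumed standard inner product on 2x2 matrices: sum of entrywise products
def ip(X, Y):
    return float(np.sum(X * Y))

B1 = np.array([[0.5, 0.5], [0.5, 0.5]])
B2 = np.array([[0.5, -0.5], [0.5, -0.5]])
A = np.array([[2.0, 5.0], [-7.0, 3.0]])

# {B1, B2} is orthonormal, so no division by squared lengths is needed
proj_A = ip(A, B1) * B1 + ip(A, B2) * B2  # equals [-5/2, 4; -5/2, 4]
perp_A = A - proj_A                       # equals [9/2, 1; -9/2, -1]
```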

Let’s do one more example. Let W = (the span of {1 and x}) be a subspace of P2(R) under the inner product defined by: the inner product of p(x) and q(x) is equal to p(0)q(0) + p(1)q(1) + p(2)q(2). Determine the projection of x-squared onto W. Solution: Notice that we do not have an orthogonal basis for W, and so we first need to apply the Gram-Schmidt procedure to the set {1, x} to find an orthogonal basis for W. Let p1(x) = 1, and then p2(x) is equal to x minus ((the inner product of x and 1) divided by (the length of 1)-squared)(1), which is equal to x – 1. Therefore, our orthogonal basis for W is {1, x – 1}. Hence, we have the projection of x-squared onto W is equal to ((the inner product of x-squared and 1) divided by (the length of 1)-squared)(1) plus ((the inner product of x-squared and (x – 1)) divided by (the length of (x – 1))-squared)(x – 1), which, if we work it out, is equal to 2x – 1/3.
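Since this inner product only ever evaluates polynomials at x = 0, 1, and 2, we can check the computation by representing each polynomial by its vector of values at those three points; the inner product then becomes an ordinary dot product (my own illustration, not part of the lecture):

```python
import numpy as np

pts = np.array([0.0, 1.0, 2.0])  # the evaluation points in the inner product
one = np.ones(3)                 # values of p1(x) = 1 at 0, 1, 2
xvec = pts.copy()                # values of x
x2 = pts**2                      # values of x^2

# Gram-Schmidt: p2 = x - (<x, 1> / ||1||^2) * 1, which works out to x - 1
p2 = xvec - (xvec @ one) / (one @ one) * one  # values of x - 1: [-1, 0, 1]

# Projection coefficients from the definition
c1 = (x2 @ one) / (one @ one)  # <x^2, 1> / ||1||^2 = 5/3
c2 = (x2 @ p2) / (p2 @ p2)     # <x^2, x - 1> / ||x - 1||^2 = 2

# Values of the projection 5/3 + 2(x - 1) = 2x - 1/3 at 0, 1, 2
proj_vals = c1 * one + c2 * p2
```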

This ends this lecture. In the next lecture, we will use the fact that we can now write any vector in a finite-dimensional inner product space as a sum of a vector in W and a vector in the orthogonal complement of W to do something really cool.

© University of Waterloo and others