Lesson: Orthogonal Complements

Transcript — Introduction

In Linear Algebra I, we learned how to find the projection of a vector in Rn onto a line or a plane in Rn. Over the next two lectures, we will use the theory of orthogonal and orthonormal bases that we have developed to generalize the concept of projections to inner product spaces.

We first recall from Linear Algebra I that the idea of a projection of a vector x onto a plane P was to write x as the sum of a vector u in the plane and a vector v that is orthogonal to the plane. We called the vector u the projection of x onto P, and we called the vector v the perpendicular of x onto P. We now want to generalize this to the projection of a vector x onto a finite-dimensional subspace W of an inner product space V. That is, we want to write x as the sum of a vector u in W and a vector v that is orthogonal to W.

What do we mean by “a vector orthogonal to W”? As with the idea of the normal vector of a plane, we mean that the vector is orthogonal to every vector in W. Definition: Let W be a subspace of an inner product space V. The orthogonal complement of W in V, written W-perp, is the set of all vectors v in V such that the inner product of v and w equals 0 for all vectors w in W.

Before we look at an example, we state a theorem which will make this a little easier. Theorem 9.4.1: Let {v1 to vk} be a basis for a subspace W of an inner product space V. If x is orthogonal to all of the vectors v1 to vk, then x is in the orthogonal complement of W. This theorem shows us that to check if a vector x is in the orthogonal complement of W, we only need to check if x is orthogonal to the basis vectors.
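
To make this concrete, here is a minimal sketch in Python (hypothetical vectors in R4 with the dot product, not from the lecture) showing how Theorem 9.4.1 lets us test membership in the orthogonal complement against the basis vectors alone.

```python
import numpy as np

# Basis for a subspace W of R4 (hypothetical example, not from the lecture).
w1 = np.array([1.0, 0.0, 1.0, 0.0])
w2 = np.array([0.0, 1.0, 0.0, 1.0])

# A candidate vector to test for membership in W-perp.
x = np.array([1.0, 1.0, -1.0, -1.0])

# By Theorem 9.4.1, it is enough to check orthogonality against the basis vectors.
in_W_perp = all(np.isclose(np.dot(x, w), 0.0) for w in (w1, w2))
print(in_W_perp)  # True
```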

Example: Let W be the subspace of M(2-by-2)(R), with the standard inner product (the sum of the products of corresponding entries), spanned by {[1, 1; 1, 1], [1, 2; -1, 1]}. Find the orthogonal complement of W. Solution: We want to find all matrices A = [a1, a2; a3, a4] that are orthogonal to both of the vectors in the basis for W. That is, we want the inner product of A and [1, 1; 1, 1] to be 0, and the inner product of A and [1, 2; -1, 1] to be 0. Expanding both of these, we get the homogeneous system of equations a1 + a2 + a3 + a4 = 0 and a1 + 2a2 - a3 + a4 = 0. Solving (for example, subtracting the first equation from the second gives a2 = 2a3), and setting a3 = s and a4 = t, we get a1 = -3s - t, a2 = 2s, a3 = s, and a4 = t. Thus, every vector in W-perp has the form A = s[-3, 2; 1, 0] + t[-1, 0; 0, 1]. Hence, {[-3, 2; 1, 0], [-1, 0; 0, 1]} spans the orthogonal complement of W.
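
As a quick numerical check of this example, here is a short Python sketch (assuming the standard inner product on M(2-by-2)(R) described above) confirming that both spanning matrices found for W-perp are orthogonal to the basis matrices of W.

```python
import numpy as np

def ip(A, B):
    # Standard inner product on M(2-by-2)(R): sum of products of corresponding entries.
    return float(np.sum(A * B))

# Basis for W.
B1 = np.array([[1.0, 1.0], [1.0, 1.0]])
B2 = np.array([[1.0, 2.0], [-1.0, 1.0]])

# Spanning matrices found for W-perp in the example above.
C1 = np.array([[-3.0, 2.0], [1.0, 0.0]])
C2 = np.array([[-1.0, 0.0], [0.0, 1.0]])

# Every matrix in W-perp is orthogonal to both basis matrices of W.
for C in (C1, C2):
    for B in (B1, B2):
        assert np.isclose(ip(C, B), 0.0)
print("both spanning matrices are orthogonal to the basis of W")
```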

Example: Let S equal the span of {x} in P2(R), with the inner product of p and q defined to be p(0)q(0) + p(1)q(1) + p(2)q(2). Find the orthogonal complement of S. Solution: Let p(x) = a + bx + cx-squared be any vector in the orthogonal complement of S. Then we have 0 = the inner product of (a + bx + cx-squared) and x, which equals a(0) + (a + b + c)(1) + (a + 2b + 4c)(2), which equals 3a + 5b + 9c. Hence, we have that a = (-5/3)b - 3c, and so p(x) = (-5/3)b - 3c + bx + cx-squared. Factoring out the parameters b and c, we get that this is equal to b(-5/3 + x) + c(-3 + x-squared). Scaling the first polynomial by 3 to clear the fraction, we conclude that S-perp is spanned by {-5 + 3x, -3 + x-squared}.
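
The same kind of check works for the polynomial example. The following small sketch (not part of the lecture) encodes the evaluation inner product on P2(R) and verifies that both spanning polynomials of S-perp are orthogonal to x.

```python
def ip(p, q):
    # Evaluation inner product on P2(R): <p, q> = p(0)q(0) + p(1)q(1) + p(2)q(2).
    return sum(p(t) * q(t) for t in (0.0, 1.0, 2.0))

x_poly = lambda t: t               # the polynomial x, which spans S
p1 = lambda t: -5.0 + 3.0 * t      # -5 + 3x
p2 = lambda t: -3.0 + t ** 2       # -3 + x^2

# Both spanning polynomials of S-perp are orthogonal to x.
print(ip(p1, x_poly), ip(p2, x_poly))  # 0.0 0.0
```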

Properties of Orthogonal Complements

We now look at some useful properties of orthogonal complements. Theorem 9.4.2: If W is a finite-dimensional subspace of an inner product space V, then

  1. W-perp is a subspace of V.
  2. If the dimension of V equals n, then the dimension of W-perp is n minus the dimension of W.
  3. If V is finite-dimensional, then the orthogonal complement of the orthogonal complement of W is W.
  4. The only vector in both W and its orthogonal complement is the 0 vector.
  5. If the dimension of V is n, {v1 to vk} is an orthogonal basis for W, and {v(k+1) to vn} is an orthogonal basis for W-perp, then the set {v1 to vk, v(k+1) to vn} is an orthogonal basis for V.

You may have noticed the strange condition on property 3, that V must be finite-dimensional. This condition is actually necessary. That is, in an infinite-dimensional inner product space, the orthogonal complement of the orthogonal complement of W need not equal W.
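
Before the proof, property 2 is easy to illustrate numerically in Rn with the dot product, where W-perp is the null space of the matrix whose rows form a basis for W. The sketch below uses hypothetical vectors (not from the lecture) and the singular value decomposition to produce an orthonormal basis for W-perp.

```python
import numpy as np

# Rows form a basis for a subspace W of R5 (hypothetical example, not from the lecture).
A = np.array([[1.0, 2.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 1.0, 0.0, 1.0]])

n = A.shape[1]                    # dimension of V = R5
k = np.linalg.matrix_rank(A)      # dimension of W

# In Rn with the dot product, W-perp is the null space of A; the last n - k
# right singular vectors of A form an orthonormal basis for it.
_, _, Vt = np.linalg.svd(A)
W_perp_basis = Vt[k:]             # shape (n - k, n)

print(W_perp_basis.shape[0] == n - k)        # True: dim(W-perp) = n - dim(W)  (property 2)
print(np.allclose(A @ W_perp_basis.T, 0.0))  # True: each basis vector is orthogonal to W
```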

The proofs of most of the parts of this theorem are fairly easy, and so I will leave them as exercises. However, I think the proof of property 2 is particularly important, and so let’s do the proof here. Proof of (2): If we look closely at the statement of the theorem, it should remind us of the Dimension Theorem and the Rank-Nullity Theorem. Thus, we will use the same proof technique that we used to prove those.

Let {v1 to vk} be an orthonormal basis for W. Since W is a subspace of V, this means that the set {v1 to vk} is a linearly independent set in V. Thus, using the methods from Linear Algebra I, we can extend this to a basis for all of V. We can then apply the Gram-Schmidt procedure to this basis and normalize, so that we have an orthonormal basis {v1 to vk, v(k+1) to vn} for V. Notice that since v1 to vk were already orthonormal, applying the Gram-Schmidt procedure to these vectors will not change them in any way.
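
In Rn with the dot product, this extension-and-orthonormalization step can be sketched numerically as follows (hypothetical vectors, not from the lecture); note that the vectors that were already orthonormal come out of the procedure unchanged, just as claimed.

```python
import numpy as np

def gram_schmidt(vectors):
    # Gram-Schmidt with normalization, using the dot product on Rn.
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, u) * u for u in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

# Orthonormal basis for a subspace W of R3 (hypothetical example, not from the lecture).
v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)

# Extend to a basis for R3 with one extra vector, then orthonormalize.
extended = gram_schmidt([v1, v2, np.array([1.0, 1.0, 1.0])])

# The vectors that were already orthonormal are unchanged, and the result is orthonormal.
print(np.allclose(extended[0], v1), np.allclose(extended[1], v2))
print(np.allclose(np.array(extended) @ np.array(extended).T, np.eye(3)))
```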

Notice that we have now added n – k vectors, v(k+1) to vn, to the set. We want to prove that the set {v(k+1) to vn} is a basis for the orthogonal complement of W. We first observe that, since the vectors v(k+1) to vn are part of a basis, the set {v(k+1) to vn} must be linearly independent. So it just remains to prove that this set also spans the orthogonal complement of W.

Let x be any vector in the orthogonal complement of W. Since the orthogonal complement of W is a subspace of V by property 1, we get that x is in V. Therefore, we can write x as a linear combination of the orthonormal basis vectors {v1 to vn} for V. In particular, using our formula for coordinates with respect to an orthonormal basis, we get x = (the inner product of x and v1)v1 + up to (the inner product of x and vk)vk + (the inner product of x and v(k+1))(v(k+1)) + up to (the inner product of x and vn)vn. But if x is in the orthogonal complement of W, then it is orthogonal to every vector in W, which includes the basis vectors {v1 to vk}. Thus, we have x = (the inner product of x and v(k+1))(v(k+1)) + up to (the inner product of x and vn)vn. Hence, the orthogonal complement of W is spanned by {v(k+1) to vn}. Consequently, {v(k+1) to vn} is a basis for W-perp, and so the dimension of the orthogonal complement of W is n – k, which is n minus the dimension of W as required.
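
Here is a small numerical illustration of this last step (hypothetical vectors in R3, not from the lecture): for a vector x in W-perp, its coordinates with respect to the orthonormal basis vectors of W are zero, so x is rebuilt from the remaining basis vectors alone.

```python
import numpy as np

# Orthonormal basis {v1, v2} for W, extended by v3 to an orthonormal basis for R3
# (hypothetical example, not from the lecture).
v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
v3 = np.array([0.0, 0.0, 1.0])

# A vector x in W-perp (any multiple of v3 in this example).
x = np.array([0.0, 0.0, 4.0])

# Coordinates of x with respect to the orthonormal basis {v1, v2, v3}.
coords = [float(np.dot(x, v)) for v in (v1, v2, v3)]
print(coords)                          # [0.0, 0.0, 4.0]: the v1 and v2 coefficients vanish

# So x is recovered using only the basis vector(s) of W-perp.
print(np.allclose(x, coords[2] * v3))  # True
```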

In the next lecture, we will use what we did here to define projections onto a subspace of an inner product space.
