Lesson: The Gram-Schmidt Procedure

Transcript — Introduction

In the last lecture, we saw that orthonormal bases are very useful. We now want to prove that, in fact, every finite-dimensional inner product space has an orthogonal, and hence an orthonormal, basis. We will prove this by construction. That is, in this lecture, we will learn how to use the Gram-Schmidt procedure to turn a basis {w1, ..., wk} for a subspace W of an inner product space V into an orthogonal basis {v1, ..., vk} for W.

First, consider the case where W is a 1-dimensional subspace of an inner product space V. Then we have a basis {w1} for W. But w1 is a single non-zero vector, and hence the set {w1} is orthogonal, so we can just take v1 = w1.

Next, consider the case where W is a 2-dimensional subspace of V, with basis {w1, w2}. Starting as in the case above, we take v1 = w1. We now need to prove that there must exist a vector v2 in W that is orthogonal to v1. As usual, we will prove that it exists by construction. But how do we construct it? We use another good technique in mathematics—we work backwards. That is, we assume that it does exist, and use this to figure out how it must be defined.

Assume that there is a vector v2 in W such that the set {v1, v2} is an orthogonal basis for W. Note that we are trying to define v1 and v2 in terms of w1 and w2. We have w1 = v1, so how about w2? Since w2 is in W, we can write it as a linear combination of the orthogonal basis vectors {v1, v2}. Using our work from the last lecture, we get

w2 = (⟨w2, v1⟩ / ||v1||^2) v1 + (⟨w2, v2⟩ / ||v2||^2) v2.

Since w2 is not a scalar multiple of w1 = v1, we must have ⟨w2, v2⟩ ≠ 0. Solving for v2, we get

v2 = (||v2||^2 / ⟨w2, v2⟩) (w2 - (⟨w2, v1⟩ / ||v1||^2) v1).

Multiplying by a non-zero scalar does not change orthogonality, so we can take any non-zero scalar multiple of this vector. That is, we take

v2 = w2 - (⟨w2, v1⟩ / ||v1||^2) v1.

Of course, this work doesn't prove that v2 exists; we assumed that. But it shows us that if v2 does exist, then it must have this form. We now need to prove that a vector v2 defined this way does have the desired property. Indeed, defining v2 = w2 - (⟨w2, v1⟩ / ||v1||^2) v1, a short computation (shown below) gives ⟨v1, v2⟩ = 0. Moreover, v2 is non-zero, since w2 is not a scalar multiple of v1. Consequently, the set {v1, v2} is an orthogonal set of two non-zero vectors in W, and hence is an orthogonal basis for W.
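Here is a sketch of that simplification, assuming a real inner product (symmetric and bilinear), which covers all of the examples in this lecture:

    \begin{aligned}
    \langle v_1, v_2\rangle
      &= \Big\langle v_1,\; w_2 - \tfrac{\langle w_2, v_1\rangle}{\|v_1\|^2}\, v_1 \Big\rangle \\
      &= \langle v_1, w_2\rangle - \tfrac{\langle w_2, v_1\rangle}{\|v_1\|^2}\,\langle v_1, v_1\rangle
       = \langle w_2, v_1\rangle - \langle w_2, v_1\rangle = 0.
    \end{aligned}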

We can continue to repeat this procedure for 3-, 4-, et cetera, dimensional subspaces W of V, and doing so produces the following result.

Theorem 9.3.1 (Gram-Schmidt Orthogonalization Theorem): Let {w1, ..., wn} be a basis for an inner product space W. Define v1, ..., vn successively as follows:

v1 = w1,

v2 = w2 - (⟨w2, v1⟩ / ||v1||^2) v1,

and, for each i from 3 to n,

vi = wi - (⟨wi, v1⟩ / ||v1||^2) v1 - (⟨wi, v2⟩ / ||v2||^2) v2 - ... - (⟨wi, v(i-1)⟩ / ||v(i-1)||^2) v(i-1).

Then, for each i from 1 to n, the set {v1, ..., vi} is an orthogonal basis for the span of {w1, ..., wi}.

The algorithm defined in this theorem is called the Gram-Schmidt procedure. I will leave the proof as an exercise. To prove it, you should first notice that the algorithm is recursive; this should tell you which proof technique to use. This theorem shows that every finite-dimensional inner product space has an orthogonal basis, and hence, after normalizing each vector, an orthonormal basis. We will use this fact constantly throughout the rest of the course.
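As a rough illustration of the procedure in Theorem 9.3.1 (a sketch, not part of the lecture), here is a minimal Python/NumPy version for vectors in R^n, with the dot product playing the role of the inner product; the function name gram_schmidt and the tolerance tol are my own choices:

    import numpy as np

    def gram_schmidt(ws, tol=1e-12):
        """Orthogonal (not yet normalized) basis for span(ws), following Theorem 9.3.1."""
        vs = []
        for w in ws:
            w = np.asarray(w, dtype=float)
            v = w.copy()
            for u in vs:
                # subtract the component of w in the direction of each previously constructed v
                v = v - (np.dot(w, u) / np.dot(u, u)) * u
            if np.linalg.norm(v) > tol:
                # a (near-)zero result means w is a linear combination of the earlier vectors
                vs.append(v)
        return vs

For instance, gram_schmidt([[1, 0, 0, 1], [-1, 1, 1, 0], [1, 1, 1, 1]]) reproduces the orthogonal basis found in the first example below, up to the scalar rescalings done there by hand.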

Examples

Let's look at some examples.

Example: Use the Gram-Schmidt procedure to find an orthonormal basis for the subspace of R^4 defined by S = the span of {[1; 0; 0; 1], [-1; 1; 1; 0], [1; 1; 1; 1]}.

Solution: Call the vectors in the spanning set for S w1, w2, and w3, respectively. We first take v1 = w1. Then, according to the Gram-Schmidt Orthogonalization Theorem, the set {v1} is an orthogonal basis for the span of {w1}. Next, we compute

w2 - (⟨w2, v1⟩ / ||v1||^2) v1 = [-1/2; 1; 1; 1/2].

In our derivation of the Gram-Schmidt procedure, we noted that we can take any non-zero scalar multiple of this vector, so for simplicity in the next calculations, we multiply it by 2 and take v2 = [-1; 2; 2; 1]. Then the set {v1, v2} is an orthogonal basis for the span of {w1, w2}.

Notice that there are a lot of calculations involved, and each step builds on the previous results, so I strongly recommend that you check your calculations after each step. To check, we verify that v1 and v2 are indeed orthogonal: observe that v1 · v2 = 0. So we move on to the next step. We find that

w3 - (⟨w3, v1⟩ / ||v1||^2) v1 - (⟨w3, v2⟩ / ||v2||^2) v2 = [2/5; 1/5; 1/5; -2/5],

so we take v3 = [2; 1; 1; -2]. Again, it is wise to check that things are going well; it is easy to verify that v3 is orthogonal to both v1 and v2. We now have that the set {v1, v2, v3} is an orthogonal basis for S.
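As a quick numeric check of these steps (a sketch with NumPy, separate from the lecture; the variable names mirror those above):

    import numpy as np

    w1 = np.array([1., 0., 0., 1.])
    w2 = np.array([-1., 1., 1., 0.])
    w3 = np.array([1., 1., 1., 1.])

    v1 = w1
    print(w2 - (w2 @ v1) / (v1 @ v1) * v1)   # [-1/2, 1, 1, 1/2]; rescale by 2
    v2 = np.array([-1., 2., 2., 1.])
    print(w3 - (w3 @ v1) / (v1 @ v1) * v1
             - (w3 @ v2) / (v2 @ v2) * v2)   # [2/5, 1/5, 1/5, -2/5]; rescale by 5
    v3 = np.array([2., 1., 1., -2.])
    print(v1 @ v2, v1 @ v3, v2 @ v3)         # all three dot products are 0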

However, the question asks us to find an orthonormal basis for S. To turn the orthogonal basis into an orthonormal basis, we just need to divide each vector by its length to make them all unit vectors. To save ourselves a little work, notice that we already computed ||v1||^2 = 2 and ||v2||^2 = 10 in the previous calculations, and ||v3||^2 = 10 as well. We find that an orthonormal basis for S is

{[1/√2; 0; 0; 1/√2], [-1/√10; 2/√10; 2/√10; 1/√10], [2/√10; 1/√10; 1/√10; -2/√10]}.
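And a short sketch of the normalization step (again with NumPy, using the orthogonal basis found above):

    import numpy as np

    v1 = np.array([1., 0., 0., 1.])
    v2 = np.array([-1., 2., 2., 1.])
    v3 = np.array([2., 1., 1., -2.])

    # divide each orthogonal basis vector by its length (sqrt(2), sqrt(10), sqrt(10))
    for v in (v1, v2, v3):
        print(v / np.linalg.norm(v))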

Example: Use the Gram-Schmidt procedure to find an orthogonal basis for the subspace W of M(2-by-2)(R) spanned by {A1, A2, A3, A4} = {[1, 2; -1, 1], [-2, -3; 1, -1], [-3, -1; -2, 2], [7, 0; 0, 0]}.

Solution: We take B1 = A1. Then

A2 - (⟨A2, B1⟩ / ||B1||^2) B1 = [-4/7, -1/7; -3/7, 3/7],

and so, multiplying by -7 for convenience, we take B2 = [4, 1; 3, -3]. Next, we find that

A3 - (⟨A3, B1⟩ / ||B1||^2) B1 - (⟨A3, B2⟩ / ||B2||^2) B2 = [0, 0; 0, 0].

What happened? Notice that if we solve this equation for A3, we see that A3 is a linear combination of B1 and B2, and hence a linear combination of A1 and A2. The Gram-Schmidt procedure is so amazing that it automatically detects when one of the vectors in the set is a linear combination of the previous vectors. So, since A3 could be removed from the original spanning set for W without changing the span, we ignore it and move on to the next vector in the set. We then have

B3 = A4 - (⟨A4, B1⟩ / ||B1||^2) B1 - (⟨A4, B2⟩ / ||B2||^2) B2 = [14/5, -14/5; -7/5, 7/5].

Thus, {B1, B2, B3} is an orthogonal basis for W.
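The same kind of check works here (again just a sketch, not part of the lecture), assuming the standard inner product on M(2-by-2)(R), ⟨A, B⟩ = the sum of the entrywise products of A and B, which matches the numbers above; the helper name ip is my own:

    import numpy as np

    def ip(A, B):
        # standard inner product on M(2-by-2)(R): sum of entrywise products
        return np.sum(A * B)

    A1 = np.array([[1., 2.], [-1., 1.]])
    A2 = np.array([[-2., -3.], [1., -1.]])
    A3 = np.array([[-3., -1.], [-2., 2.]])
    A4 = np.array([[7., 0.], [0., 0.]])

    B1 = A1
    print(A2 - ip(A2, B1) / ip(B1, B1) * B1)   # [-4/7, -1/7; -3/7, 3/7]; rescale by -7
    B2 = np.array([[4., 1.], [3., -3.]])
    print(A3 - ip(A3, B1) / ip(B1, B1) * B1
             - ip(A3, B2) / ip(B2, B2) * B2)   # the zero matrix, since A3 = 7*A1 + 5*A2
    B3 = A4 - ip(A4, B1) / ip(B1, B1) * B1 - ip(A4, B2) / ip(B2, B2) * B2
    print(B3)                                  # [14/5, -14/5; -7/5, 7/5]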

This ends this lecture. Over the next couple of lectures, we will use orthogonal bases to extend the idea of projections to general inner product spaces.
