Lesson: A Review of Vectors in \( \mathbb{R}^n \)

Transcript

We began our studies of linear algebra by looking at collections of numbers. Let’s take a moment to review the definition and properties of Rn.

Rn is the set of all vectors of the form [x1 through xn] where each xi is an element of the real numbers. In set notation, we write that Rn is the set of all vectors [x1 through xn] such that x1 through xn are in R.
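In symbols, this set-notation description reads:

\[
\mathbb{R}^n = \left\{ \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} \;:\; x_1, \ldots, x_n \in \mathbb{R} \right\}.
\]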

If x is the vector whose components are x1 through xn, if y is the vector whose components are y1 through yn, and if t is a scalar (that is, t is an element of R), then we define addition of vectors componentwise, saying that the vector (x + y) equals the vector whose components are (x1 + y1) through (xn + yn). We also define scalar multiplication componentwise by saying that tx equals the vector whose components are tx1 through txn.
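Written out componentwise, these two operations are:

\[
\mathbf{x} + \mathbf{y} = \begin{bmatrix} x_1 + y_1 \\ \vdots \\ x_n + y_n \end{bmatrix},
\qquad
t\mathbf{x} = \begin{bmatrix} t x_1 \\ \vdots \\ t x_n \end{bmatrix}.
\]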

Once we defined Rn, we noted the following properties of vector addition and scalar multiplication. For all w, x, and y in Rn, and all scalars s and t from R, we have

  1. (x + y) is in Rn—that is, that Rn is closed under addition.
  2. x + y = y + x. That is, addition is commutative.
  3. (x + y) + w = x + (y + w). We say that addition is associative.
  4. There exists a vector 0 in Rn such that z + 0 = z for all z in Rn. We call that the 0 vector.
  5. For each x in Rn, there exists a vector -x in Rn such that x + (-x) = 0. We call this the additive inverse property.
  6. tx is in Rn, or we say that Rn is closed under scalar multiplication.
  7. s(tx) = (st)x, or we say that scalar multiplication is associative.
  8. (s + t)x = sx + tx. This is a distributive law.

We also have

  9. t(x + y) = tx + ty. This is another distributive law.

And last, we have

  10. 1x = x, which is to say that 1 is the scalar multiplicative identity.
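For reference, the zero vector from property 4 and the additive inverse from property 5 can be written componentwise as:

\[
\mathbf{0} = \begin{bmatrix} 0 \\ \vdots \\ 0 \end{bmatrix},
\qquad
-\mathbf{x} = \begin{bmatrix} -x_1 \\ \vdots \\ -x_n \end{bmatrix}.
\]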

After this, we defined two recurring concepts: spanning sets and linear independence. If S is the subspace of Rn consisting of all possible linear combinations of the vectors v1 through vk in Rn, then S is called the subspace spanned by the set of vectors B = {v1 through vk}, and we say that the set B spans S. The set B is called a spanning set for the subspace S. We denote this by writing S = Span{v1, ..., vk}, or simply S = Span B.
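In symbols, the span is the set of all linear combinations of the vectors:

\[
\operatorname{Span}\{v_1, \ldots, v_k\} = \{\, t_1 v_1 + t_2 v_2 + \cdots + t_k v_k \;:\; t_1, \ldots, t_k \in \mathbb{R} \,\}.
\]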

For example, determine whether or not the vector [1; 2; 4] is in the span of {[-3; 2; 6], [8; -4; -13]}. To solve this question, we need to determine whether or not there are any solutions to the vector equation t1[-3; 2; 6] + t2[8; -4; -13] = [1; 2; 4]. Breaking this vector equation into its three components, we see that the vector equation has a solution if and only if the following system of linear equations also has a solution: -3t1 + 8t2 = 1, 2t1 - 4t2 = 2, and 6t1 - 13t2 = 4. Now, to determine whether or not the system has a solution, we will write it as an augmented matrix and row reduce, as seen here. The final matrix is in row echelon form, and as it has no bad rows (rows whose only nonzero entry is in the augmented column), we know that the system is consistent. And this means that the vector [1; 2; 4] is in the span of {[-3; 2; 6], [8; -4; -13]}.
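The on-screen reduction referred to above is not reproduced in the transcript; one possible sequence of row operations (chosen to avoid fractions) is:

\[
\left[\begin{array}{rr|r} -3 & 8 & 1 \\ 2 & -4 & 2 \\ 6 & -13 & 4 \end{array}\right]
\sim
\left[\begin{array}{rr|r} -3 & 8 & 1 \\ 0 & 4 & 8 \\ 0 & 3 & 6 \end{array}\right]
\sim
\left[\begin{array}{rr|r} -3 & 8 & 1 \\ 0 & 4 & 8 \\ 0 & 0 & 0 \end{array}\right]
\]

using R2 → 3R2 + 2R1, R3 → R3 + 2R1, and then R3 → 4R3 - 3R2. Back-substitution gives t2 = 2 and t1 = 5, and indeed 5[-3; 2; 6] + 2[8; -4; -13] = [1; 2; 4].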

Now let’s look at linear dependence. A set of vectors {v1 through vk} is said to be linearly dependent if there exist coefficients t1 through tk, not all 0, such that the 0 vector equals the sum t1v1 + ... + tkvk. A set of vectors {v1 through vk} is said to be linearly independent if the only solution to 0 = t1v1 + ... + tkvk is that t1 through tk all equal 0. This is known as the trivial solution.
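Both definitions refer to the same vector equation:

\[
t_1 v_1 + t_2 v_2 + \cdots + t_k v_k = \mathbf{0}.
\]

The set is linearly dependent if this equation has a solution other than the trivial one, and linearly independent if the trivial solution \( t_1 = \cdots = t_k = 0 \) is the only one.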

For example, we could determine whether the set {[1; 2; -1], [1; 4; 7], [-3; -4; 0]} is linearly independent. To do this, we need to see if there are any parameters in the solution of the homogeneous system t1 + t2 – 3t3 = 0, 2t1 + 4t2 – 4t3 = 0, -t1 + 7t2 = 0. To determine this, we row reduce the coefficient matrix, as seen here. The last matrix is in row echelon form, and thus, we see that the rank of the coefficient matrix is 3. Since this is the same as the number of variables, which is the same as the number of vectors, there are no parameters in the general solution, and this means that our set is linearly independent.
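Again, the reduction shown on screen is not part of the transcript; one possible row echelon form is:

\[
\begin{bmatrix} 1 & 1 & -3 \\ 2 & 4 & -4 \\ -1 & 7 & 0 \end{bmatrix}
\sim
\begin{bmatrix} 1 & 1 & -3 \\ 0 & 2 & 2 \\ 0 & 8 & -3 \end{bmatrix}
\sim
\begin{bmatrix} 1 & 1 & -3 \\ 0 & 2 & 2 \\ 0 & 0 & -11 \end{bmatrix}
\]

using R2 → R2 - 2R1, R3 → R3 + R1, and then R3 → R3 - 4R2. The three nonzero rows confirm that the rank is 3.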

Let’s look at another example. Determine whether the set {[0; 3; -2], [1; -7; 6], [3; 0; 4]} is linearly independent. Again, this will come down to determining whether or not there are parameters in the solution of the homogeneous system t2 + 3t3 = 0, 3t1 - 7t2 = 0, -2t1 + 6t2 + 4t3 = 0. To determine this, we again row reduce the coefficient matrix, as seen here. The last matrix is in row echelon form, and thus we see that the rank of the coefficient matrix is 2. This means that there is one parameter in the general solution to the homogeneous system, and so our set is not linearly independent; that is, our set is linearly dependent.
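As before, the on-screen reduction is not reproduced here; one possible sequence of row operations is:

\[
\begin{bmatrix} 0 & 1 & 3 \\ 3 & -7 & 0 \\ -2 & 6 & 4 \end{bmatrix}
\sim
\begin{bmatrix} 3 & -7 & 0 \\ 0 & 1 & 3 \\ -2 & 6 & 4 \end{bmatrix}
\sim
\begin{bmatrix} 3 & -7 & 0 \\ 0 & 1 & 3 \\ 0 & 4 & 12 \end{bmatrix}
\sim
\begin{bmatrix} 3 & -7 & 0 \\ 0 & 1 & 3 \\ 0 & 0 & 0 \end{bmatrix}
\]

using R1 ↔ R2, then R3 → 3R3 + 2R1, and then R3 → R3 - 4R2. Only two rows are nonzero, so the rank is 2. Setting the parameter t3 = 1 gives t2 = -3 and t1 = -7, and indeed -7[0; 3; -2] - 3[1; -7; 6] + 1[3; 0; 4] = [0; 0; 0], a nontrivial linear combination equal to the zero vector.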
