In the previous section, questions about the existence of solutions of a linear system led to the concept of the span of a set of vectors. In particular, the span of a set of vectors is the set of all vectors $\mathbf{b}$ for which the associated linear system is consistent; that is, for which a solution exists.
In this section, we turn to the uniqueness of solutions of a linear system, the second of our two fundamental questions. This will lead us to the concept of linear independence.
Let’s begin by looking at some sets of vectors in $\mathbb{R}^3$. As we saw in the previous section, the span of a set of vectors in $\mathbb{R}^3$ will be either a line, a plane, or $\mathbb{R}^3$ itself.
We have seen examples where the span of a set of three vectors in $\mathbb{R}^3$ is $\mathbb{R}^3$ and other examples where the span of three vectors is a plane. We would like to understand the difference between these two situations.
In other words, any linear combination of $\mathbf{v}_1$, $\mathbf{v}_2$, and $\mathbf{v}_3$ may be written as a linear combination using only the vectors $\mathbf{v}_1$ and $\mathbf{v}_2$. Since the span of a set of vectors is simply the set of their linear combinations, this shows that $\text{Span}\{\mathbf{v}_1,\mathbf{v}_2,\mathbf{v}_3\} = \text{Span}\{\mathbf{v}_1,\mathbf{v}_2\}$.
Before exploring this type of behavior more generally, let’s think about it from a geometric point of view. Suppose that we begin with the two vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ in Example 2.4.1. The span of these two vectors is a plane in $\mathbb{R}^3$, as seen on the left of Figure 2.4.3.
Because the vector $\mathbf{v}_3$ is not a linear combination of $\mathbf{v}_1$ and $\mathbf{v}_2$, it provides a direction to move that is independent of $\mathbf{v}_1$ and $\mathbf{v}_2$. Adding this third vector therefore forms a set whose span is $\mathbb{R}^3$, as seen on the right of Figure 2.4.3.
Similarly, the span of the vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ in Example 2.4.2 is also a plane. However, the third vector $\mathbf{v}_3$ is a linear combination of $\mathbf{v}_1$ and $\mathbf{v}_2$, which means that it already lies in the plane formed by $\mathbf{v}_1$ and $\mathbf{v}_2$, as seen in Figure 2.4.4. Since we can already move in this direction using just $\mathbf{v}_1$ and $\mathbf{v}_2$, adding $\mathbf{v}_3$ to the set does not change the span. As a result, it remains a plane.
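A quick numerical sketch may make this concrete. The vectors below are made up for illustration (they are not the ones from the examples): $\mathbf{v}_3$ is constructed as a linear combination of $\mathbf{v}_1$ and $\mathbf{v}_2$, and then any combination of all three vectors can be rewritten using only the first two.

```python
# Hypothetical vectors for illustration: v3 is constructed as v1 + 2*v2,
# so it already lies in the plane spanned by v1 and v2.
v1 = [1, 0, 2]
v2 = [0, 1, 1]
v3 = [a + 2 * b for a, b in zip(v1, v2)]  # v3 = v1 + 2*v2 = [1, 2, 4]

# Any combination a*v1 + b*v2 + c*v3 equals (a + c)*v1 + (b + 2*c)*v2,
# so adding v3 to the set does not enlarge the span.
def combo(a, b, c):
    return [a * x + b * y + c * z for x, y, z in zip(v1, v2, v3)]

assert combo(4, -1, 3) == combo(4 + 3, -1 + 2 * 3, 0)  # same vector either way
```

Substituting $\mathbf{v}_3 = \mathbf{v}_1 + 2\mathbf{v}_2$ into any combination is exactly the algebraic step that collapses the three-vector span down to the two-vector span.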
What distinguishes these two examples is whether one of the vectors is a linear combination of the others, an observation that leads to the following definition.
A set of vectors is called linearly dependent if one of the vectors is a linear combination of the others. Otherwise, the set of vectors is called linearly independent.
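This definition can be tested algorithmically. The sketch below (pure Python with exact rational arithmetic; the helper names are our own, not from the text) row-reduces the matrix whose rows are the given vectors: the set is linearly independent exactly when the rank equals the number of vectors.

```python
from fractions import Fraction

def rank(vectors):
    """Rank of the matrix whose rows are the given vectors,
    computed by Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in v] for v in vectors]
    r = 0  # number of pivots found so far
    for c in range(len(m[0])):
        # find a row at position r or below with a nonzero entry in column c
        pivot = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]
        # clear column c from every other row
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def linearly_independent(vectors):
    # Independent exactly when no vector is a combination of the
    # others, i.e. when the rank equals the number of vectors.
    return rank(vectors) == len(vectors)

# v3 = v1 + 2*v2, so this set is linearly dependent:
assert not linearly_independent([[1, 0, 2], [0, 1, 1], [1, 2, 4]])
# The standard basis vectors of R^3 are linearly independent:
assert linearly_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
```

Using `Fraction` avoids the floating-point roundoff that would otherwise make "is this entry zero?" unreliable during elimination.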
Suppose we have five vectors in $\mathbb{R}^n$ that form the columns of a matrix whose reduced row echelon form is known.
Is it possible to write one of the vectors as a linear combination of the others? If so, show explicitly how one vector appears as a linear combination of some of the other vectors. Is this set of vectors linearly dependent or independent?
Suppose we have another set of three vectors in $\mathbb{R}^n$ that form the columns of a matrix whose reduced row echelon form is known.
Is it possible to write one of these three vectors as a linear combination of the others? If so, show explicitly how one vector appears as a linear combination of some of the other vectors. Is this set of vectors linearly dependent or independent?
By looking at the pivot positions, how can you determine whether the columns of a matrix are linearly dependent or independent?
If one vector in a set is the zero vector $\mathbf{0}$, can the set of vectors be linearly independent?
Suppose a set of vectors in $\mathbb{R}^n$ has twelve vectors. Is it possible for this set to be linearly independent?
By now, we should expect that the pivot positions play an important role in determining whether the columns of a matrix are linearly dependent. For instance, suppose we have four vectors $\mathbf{v}_1$, $\mathbf{v}_2$, $\mathbf{v}_3$, and $\mathbf{v}_4$ and their associated matrix $A$.
More generally, the same reasoning implies that a set of vectors is linearly dependent if the associated matrix has a column without a pivot position. Indeed, as illustrated here, a vector corresponding to a column without a pivot position can be expressed as a linear combination of the vectors whose columns do contain pivot positions.
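This reading of the reduced row echelon form can be sketched numerically. The vectors below are our own small example (not one from the text): the third column of the reduced row echelon form has no pivot, and its entries are precisely the weights expressing the third vector in terms of the pivot-column vectors.

```python
# Hypothetical vectors: v3 = 2*v1 - 3*v2 by construction.
v1 = [1, 1, 0]
v2 = [0, 1, 1]
v3 = [2 * a - 3 * b for a, b in zip(v1, v2)]  # [2, -1, -3]

# Row-reducing the matrix with columns v1, v2, v3 (done by hand) gives
#   [ 1  0  2 ]
#   [ 0  1 -3 ]
#   [ 0  0  0 ]
# Column 3 has no pivot, and its entries 2 and -3 are exactly the weights
# on the pivot columns: v3 = 2*v1 + (-3)*v2.
weights = [2, -3]
combo = [weights[0] * a + weights[1] * b for a, b in zip(v1, v2)]
assert combo == v3
```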
Viewing this as an augmented matrix again, we see that the linear system is inconsistent since there is a pivot in the rightmost column, which means that the last vector cannot be expressed as a linear combination of the other vectors. The same reasoning applies to each of the other vectors: none of them can be written as a linear combination of the rest, so this set of vectors is linearly independent.
This condition imposes a constraint on how many vectors we can have in a linearly independent set. Consider, for example, the reduced row echelon form of a matrix whose columns form a set of three linearly independent vectors: it must have a pivot position in each of its three columns.
More generally, if $\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_k\}$ is a linearly independent set of vectors in $\mathbb{R}^m$, the associated matrix must have a pivot position in every column. Since every row contains at most one pivot position, the number of columns can be no greater than the number of rows. This means that the number of vectors in a linearly independent set can be no greater than the number of dimensions.
This says, for instance, that any linearly independent set of vectors in $\mathbb{R}^3$ can contain no more than three vectors. We usually imagine three independent directions, such as up/down, front/back, left/right, in our three-dimensional world. This proposition tells us that there can be no more independent directions.
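As a numerical illustration (with made-up vectors), any four vectors in $\mathbb{R}^3$ must be linearly dependent, so a nonzero set of weights producing the zero vector can always be found:

```python
# Four vectors in R^3; by the proposition they cannot be independent.
v1 = [1, 0, 0]
v2 = [0, 1, 0]
v3 = [0, 0, 1]
v4 = [1, 1, 1]

# Here v4 = v1 + v2 + v3, so the weights (1, 1, 1, -1) combine the four
# vectors into the zero vector, witnessing the dependence.
weights = [1, 1, 1, -1]
combo = [sum(w * v[i] for w, v in zip(weights, [v1, v2, v3, v4]))
         for i in range(3)]
assert combo == [0, 0, 0]
```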
The proposition above says that a set of vectors in $\mathbb{R}^m$ that is linearly independent has at most $m$ vectors. By comparison, Proposition 2.3.15 says that a set of vectors whose span is $\mathbb{R}^m$ has at least $m$ vectors.
If $A$ is a matrix, we call the equation $A\mathbf{x} = \mathbf{0}$ a homogeneous equation. As we’ll see, the uniqueness of solutions to this equation reflects the linear independence of the columns of $A$.
This activity shows how the solution space of the homogeneous equation $A\mathbf{x} = \mathbf{0}$ indicates whether the columns of $A$ are linearly dependent or independent. First, we know that the equation always has at least one solution, the vector $\mathbf{x} = \mathbf{0}$. Any other solution is a nonzero solution.
Therefore, $A$ has a column without a pivot position, which tells us that the vectors $\mathbf{v}_1$, $\mathbf{v}_2$, and $\mathbf{v}_3$ are linearly dependent. However, we can also see this fact in another way.
The reduced row echelon matrix tells us that the homogeneous equation $A\mathbf{x} = \mathbf{0}$ has a free variable, so there must be infinitely many solutions.
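This can be sketched numerically as well. The matrix below is a made-up example whose third column is a combination of the first two: the dependence supplies a nonzero solution of $A\mathbf{x} = \mathbf{0}$, and scaling that solution produces infinitely many more.

```python
# Columns of A: a1, a2, and a3 = 2*a1 - 3*a2, so the columns are dependent.
A = [[1, 0, 2],
     [1, 1, -1],
     [0, 1, -3]]

def matvec(M, x):
    """Matrix-vector product, one dot product per row."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

# The dependence 2*a1 - 3*a2 - a3 = 0 gives a nonzero solution of A x = 0:
x = [2, -3, -1]
assert matvec(A, x) == [0, 0, 0]

# Any scalar multiple of x is another solution, so there are infinitely many:
assert matvec(A, [5 * xi for xi in x]) == [0, 0, 0]
```

Conversely, any nonzero solution of the homogeneous equation reads off a dependence among the columns, which is why the two points of view agree.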
As this example demonstrates, there are many ways we can view the question of linear independence, some of which are recorded in the following proposition.
At the beginning of the section, we said that this concept addressed the second of our two fundamental questions concerning the uniqueness of solutions to a linear system. It is worth comparing the results of this section with those of the previous one so that the parallels between them become clear.
If the vectors $\mathbf{v}$ and $\mathbf{w}$ form a linearly dependent set, must one vector be a scalar multiple of the other?
Suppose that $\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_n\}$ is a linearly independent set of vectors. What can you say about the linear independence or dependence of a subset of these vectors?
Suppose $\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_n\}$ is a linearly independent set of vectors that form the columns of a matrix $A$. If the equation $A\mathbf{x} = \mathbf{b}$ is inconsistent, what can you say about the linear independence or dependence of the set of vectors $\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_n,\mathbf{b}\}$?
Given below are some descriptions of sets of vectors that form the columns of a matrix $A$. For each description, give a possible reduced row echelon form for $A$, or explain why no set of vectors satisfies the description by showing that the required reduced row echelon matrix cannot exist.
When we explored matrix multiplication in Section 2.2, we saw that some properties that are true for real numbers are not true for matrices. This exercise will investigate that in some more depth.
Suppose that $A$ and $B$ are two matrices and that $AB = 0$. If $B \neq 0$, what can you say about the linear independence of the columns of $A$?
Suppose that we have matrices $A$, $B$, and $C$ such that $CA = CB$. We have seen that we cannot generally conclude that $A = B$. If we assume additionally that $C$ is a matrix whose columns are linearly independent, explain why $A = B$. You may wish to begin by rewriting the equation as $C(A - B) = 0$.
Suppose that $\mathbf{v}_3$ is a linear combination of the vectors $\mathbf{v}_1$ and $\mathbf{v}_2$. Explain why $\text{Span}\{\mathbf{v}_1,\mathbf{v}_2,\mathbf{v}_3\} = \text{Span}\{\mathbf{v}_1,\mathbf{v}_2\}$.
Consider the vectors $\mathbf{v}_1$, $\mathbf{v}_2$, $\mathbf{v}_3$, and $\mathbf{v}_4$.
Write one of the vectors as a linear combination of the others. Find a set of three vectors whose span is the same as the span of the original set.
Are the three vectors you are left with linearly independent? If not, express one of the vectors as a linear combination of the others and find a set of two vectors whose span is the same as the span of the original set.
Give a geometric description of the span of the original set of vectors, as we did in Section 2.3.