Sometimes the span of a set of vectors is “smaller” than you expect from the number of vectors, as in the picture below. This means that (at least) one of the vectors is redundant: it can be removed without affecting the span. In the present section, we formalize this idea in the notion of linear independence.
Subsection 2.5.1 The Definition of Linear Independence
A set of vectors $\{v_1, v_2, \ldots, v_k\}$ is linearly independent if the vector equation
$$x_1v_1 + x_2v_2 + \cdots + x_kv_k = 0$$
has only the trivial solution $x_1 = x_2 = \cdots = x_k = 0$. The set $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent otherwise.
In other words, $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent if there exist numbers $x_1, x_2, \ldots, x_k$, not all equal to zero, such that
$$x_1v_1 + x_2v_2 + \cdots + x_kv_k = 0.$$
This is called a linear dependence relation or equation of linear dependence.
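The definition can be checked numerically: the equation above has a nontrivial solution exactly when the matrix whose columns are the vectors has rank less than the number of vectors. A minimal sketch, using hypothetical vectors chosen for illustration (they are not from the text):

```python
import numpy as np

# Hypothetical example vectors: v3 = v1 + v2, so {v1, v2, v3}
# is linearly dependent by construction.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

# x1*v1 + x2*v2 + x3*v3 = 0 has a nontrivial solution exactly
# when the matrix with these columns has rank < 3.
A = np.column_stack([v1, v2, v3])
rank = np.linalg.matrix_rank(A)
print(rank)  # 2, so the set is linearly dependent

# One linear dependence relation: 1*v1 + 1*v2 + (-1)*v3 = 0.
print(np.allclose(v1 + v2 - v3, 0))  # True
```

The coefficients $(1, 1, -1)$ printed at the end are exactly the "numbers not all equal to zero" in the definition of a linear dependence relation.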
Note that linear dependence and linear independence are notions that apply to a collection of vectors. It does not make sense to say things like “this vector is linearly dependent on these other vectors,” or “this matrix is linearly independent.”
If $v_1 = cv_2$, then $v_1 - cv_2 = 0$, so $\{v_1, v_2\}$ is linearly dependent. In the other direction, if $x_1v_1 + x_2v_2 = 0$ with $x_1 \neq 0$ (say), then $v_1 = -\frac{x_2}{x_1}\,v_2$.
It is easy to produce a linear dependence relation if one vector is the zero vector: for instance, if $v_1 = 0$, then
$$1\cdot v_1 + 0\cdot v_2 + \cdots + 0\cdot v_k = 0.$$
After reordering, we may suppose that $\{v_1, v_2, \ldots, v_r\}$ is linearly dependent, with $r < k$. This means that there is an equation of linear dependence
$$x_1v_1 + x_2v_2 + \cdots + x_rv_r = 0,$$
with at least one of $x_1, x_2, \ldots, x_r$ nonzero. This is also an equation of linear dependence among $\{v_1, v_2, \ldots, v_k\}$, since we can take the coefficients of $v_{r+1}, \ldots, v_k$ to all be zero.
With regard to the first fact, note that the zero vector is a multiple of any vector, so it is collinear with any other vector. Hence facts 1 and 2 are consistent with each other.
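Both facts can be seen numerically via the rank criterion. A short sketch with hypothetical vectors (chosen for illustration):

```python
import numpy as np

# Hypothetical vectors: v2 = 3*v1, so 3*v1 - v2 = 0 is a linear
# dependence relation and {v1, v2} is linearly dependent.
v1 = np.array([2.0, 4.0])
v2 = 3 * v1
rank_pair = np.linalg.matrix_rank(np.column_stack([v1, v2]))
print(rank_pair)  # 1 < 2: dependent (the vectors are collinear)

# Any set containing the zero vector is linearly dependent:
# 1*zero + 0*v1 = 0 is a nontrivial relation.
zero = np.zeros(2)
rank_with_zero = np.linalg.matrix_rank(np.column_stack([zero, v1]))
print(rank_with_zero)  # 1 < 2: dependent
```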
Subsection 2.5.2 Criteria for Linear Independence
In this subsection we give two criteria for a set of vectors to be linearly independent. Keep in mind, however, that the actual definition is above.
A set of vectors is linearly dependent if and only if one of the vectors is in the span of the other ones.
Any such vector may be removed without affecting the span.
It is equivalent to show that $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent if and only if $v_j$ is in $\operatorname{Span}\{v_1, v_2, \ldots, v_{j-1}\}$ for some $j$. The “if” implication is an immediate consequence of the previous theorem. Suppose then that $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent. This means that some $v_j$ is in the span of the others. Choose the largest such $j$. We claim that this $v_j$ is in $\operatorname{Span}\{v_1, v_2, \ldots, v_{j-1}\}$. If not, then
$$v_j = x_1v_1 + \cdots + x_{j-1}v_{j-1} + x_{j+1}v_{j+1} + \cdots + x_kv_k$$
with not all of $x_{j+1}, \ldots, x_k$ equal to zero. Suppose for simplicity that $x_k \neq 0$. Then we can rearrange:
$$v_k = \frac{1}{x_k}\bigl(-x_1v_1 - \cdots - x_{j-1}v_{j-1} + v_j - x_{j+1}v_{j+1} - \cdots - x_{k-1}v_{k-1}\bigr).$$
This says that $v_k$ is in the span of $\{v_1, \ldots, v_{k-1}\}$, which contradicts our assumption that $v_j$ is the last vector in the span of the others.
We can rephrase this as follows:
If you build a set of vectors by adding one vector at a time, and if the span gets bigger every time you add a vector, then your set is linearly independent.
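This "increasing span" criterion can be checked with ranks: the span grows at every step exactly when the rank goes up by one each time a vector is added. A sketch with hypothetical vectors in $\mathbb{R}^3$:

```python
import numpy as np

# Hypothetical vectors, added to the set one at a time.
vectors = [np.array([1.0, 0.0, 0.0]),
           np.array([1.0, 1.0, 0.0]),
           np.array([1.0, 1.0, 1.0])]

# Rank of the matrix formed by the first i+1 vectors = dimension
# of their span; the span grew at step i iff the rank increased.
ranks = [np.linalg.matrix_rank(np.column_stack(vectors[:i + 1]))
         for i in range(len(vectors))]
print(ranks)  # [1, 2, 3]: the span grew at every step

# Strict growth at every step <=> the whole set is independent.
independent = all(r == i + 1 for i, r in enumerate(ranks))
print(independent)  # True
```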
Subsection 2.5.3 Pictures of Linear Independence
A set containing one vector $\{v_1\}$ is linearly independent when $v_1 \neq 0$, since $xv_1 = 0$ implies $x = 0$.
A set of two noncollinear vectors is linearly independent:
Neither is in the span of the other, so we can apply the first criterion.
Note that three vectors are linearly dependent if and only if they are coplanar. Indeed, $\{v_1, v_2, v_3\}$ is linearly dependent if and only if one vector is in the span of the other two, which is a plane (or a line) (or $\{0\}$).
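For three vectors in $\mathbb{R}^3$, coplanarity (and hence linear dependence) can be detected with a determinant: the vectors are coplanar exactly when the $3\times 3$ matrix with those columns has determinant zero. A sketch with hypothetical vectors:

```python
import numpy as np

# Hypothetical vectors: v3 lies in the plane spanned by v1 and v2
# by construction, so the three vectors are coplanar.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2 * v1 - v2

# Coplanar (linearly dependent) <=> determinant is zero.
det = np.linalg.det(np.column_stack([v1, v2, v3]))
print(np.isclose(det, 0.0))  # True: dependent
```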
The four vectors below are linearly dependent: they are the columns of a wide matrix. Note however that $v_4$ is not contained in $\operatorname{Span}\{v_1, v_2, v_3\}$. See this warning.
If the matrix is in reduced row echelon form, then the column without a pivot is visibly in the span of the pivot columns, and the pivot columns are linearly independent.
If the matrix is not in reduced row echelon form, then we row reduce:
The following two vector equations have the same solution set, as they come from row-equivalent matrices:
We conclude that
has only the trivial solution.
Note that it is necessary to row reduce the matrix to find which of its columns are the pivot columns. However, the span of the columns of the row reduced matrix is generally not equal to the span of the columns of the original matrix: one must use the pivot columns of the original matrix. See the theorem in Section 2.7 for a restatement of the above theorem.
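The pivot-column procedure can be sketched with SymPy, whose `rref()` returns both the reduced row echelon form and the pivot column indices. The wide matrix below is hypothetical (it is not the example pictured in the text); its third row is the sum of the first two, so its columns are dependent:

```python
from sympy import Matrix

# Hypothetical wide matrix: row 3 = row 1 + row 2 by construction.
A = Matrix([[1, 2, 0, 1],
            [0, 0, 1, 1],
            [1, 2, 1, 2]])

# rref() returns the reduced row echelon form together with the
# indices of the pivot columns.
R, pivots = A.rref()
print(pivots)  # (0, 2): the first and third columns are pivot columns

# Take the pivot columns of the ORIGINAL matrix A, not of R:
# they are linearly independent and span the column space of A,
# whereas the columns of R generally span a different subspace.
P = A[:, list(pivots)]
print(P.rank() == len(pivots))  # True: pivot columns are independent
```

Note the design point emphasized in the text: row reduction is used only to *locate* the pivots; the independent spanning set is then read off from the original columns.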