Sometimes the span of a set of vectors is “smaller” than you expect from the number of vectors, as in the picture below. This means that (at least) one of the vectors is redundant: it can be removed without affecting the span. In the present section, we formalize this idea in the notion of linear (in)dependence.

Subsection 3.5.1 The Definition of Linear Independence

Definition

A set of vectors $\{v_1, v_2, \ldots, v_k\}$ is linearly independent if the vector equation

$$x_1v_1 + x_2v_2 + \cdots + x_kv_k = 0$$

has only the trivial solution $x_1 = x_2 = \cdots = x_k = 0$. The set is linearly dependent otherwise.

In other words, $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent if there exist numbers $x_1, x_2, \ldots, x_k$, not all equal to zero, such that

$$x_1v_1 + x_2v_2 + \cdots + x_kv_k = 0.$$

This is called a linear dependence relation or equation of linear dependence.

Note that linear (in)dependence is a notion that applies to a collection of vectors, not to a single vector, or to one vector in the presence of some others.

The vectors $v_1, v_2, \ldots, v_k$ are linearly independent if and only if the matrix $A$ with columns $v_1, v_2, \ldots, v_k$ has a pivot in every column, if and only if $Ax = 0$ has only the trivial solution.

Solving the matrix equation $Ax = 0$ will either verify that the columns $v_1, v_2, \ldots, v_k$ are linearly independent, or will produce a linear dependence relation by substituting any nonzero values for the free variables.
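To make this concrete, here is a small sketch in Python using the sympy library (an assumption; any exact linear-algebra tool would do): the null space of $A$ is trivial exactly when the columns are linearly independent, and any nonzero null space vector supplies a linear dependence relation. The vectors below are hypothetical, chosen so that $v_3 = 2v_1 + v_2$.

```python
from sympy import Matrix, zeros

# Hypothetical example vectors: v3 = 2*v1 + v2, so a dependence exists.
v1 = Matrix([1, 0, 1])
v2 = Matrix([0, 1, 1])
v3 = Matrix([2, 1, 3])
A = Matrix.hstack(v1, v2, v3)

# The columns are linearly independent iff A x = 0 has only the
# trivial solution, i.e. iff the null space of A is {0}.
null_basis = A.nullspace()
independent = (len(null_basis) == 0)
print(independent)                      # False

# Each null space basis vector gives a linear dependence relation:
# its entries are coefficients x1, x2, x3 with x1 v1 + x2 v2 + x3 v3 = 0.
x = null_basis[0]
relation = x[0]*v1 + x[1]*v2 + x[2]*v3
print(relation == zeros(3, 1))          # True: the combination is zero
```

Setting the free variable to any other nonzero value simply rescales the same dependence relation.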

Suppose that $A$ has more columns than rows. Then $A$ cannot have a pivot in every column (it has at most one pivot per row), so its columns are automatically linearly dependent.

A wide matrix (a matrix with more columns than rows) has linearly dependent columns.

For example, four vectors in $\mathbb{R}^3$ are automatically linearly dependent. Note that a tall matrix may or may not have linearly independent columns.
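A quick check of this fact, sketched in Python with sympy (assumed available) on a hypothetical $3\times 4$ matrix: row reduction produces at most one pivot per row, hence at most three pivots for four columns, forcing a free variable and a nontrivial null space.

```python
from sympy import Matrix

# Hypothetical wide matrix: four column vectors in R^3.
A = Matrix([[1, 0, 2, -1],
            [0, 1, 1,  3],
            [1, 1, 0,  2]])

rref, pivots = A.rref()
print(len(pivots))               # 3: at most one pivot per row
print(len(A.nullspace()) > 0)    # True: A x = 0 has a nontrivial solution
```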

Facts about linear independence

1. Two vectors are linearly dependent if and only if they are collinear, i.e., one is a scalar multiple of the other.

2. Any set containing the zero vector is linearly dependent.

3. If a subset of $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent, then $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent as well.

If $v_1 = cv_2$, then $v_1 - cv_2 = 0$, so $\{v_1, v_2\}$ is linearly dependent. In the other direction, if $x_1v_1 + x_2v_2 = 0$ with $x_1 \neq 0$ (say), then $v_1 = -\frac{x_2}{x_1}v_2$.

It is easy to produce a linear dependence relation if one vector is the zero vector: for instance, if $v_1 = 0$, then

$$1\cdot v_1 + 0\cdot v_2 + \cdots + 0\cdot v_k = 0.$$

After reordering, we may suppose that $\{v_1, v_2, \ldots, v_r\}$ is linearly dependent, with $r < k$. This means that there is an equation of linear dependence

$$x_1v_1 + x_2v_2 + \cdots + x_rv_r = 0,$$

with at least one of $x_1, x_2, \ldots, x_r$ nonzero. This is also an equation of linear dependence among $\{v_1, v_2, \ldots, v_k\}$, since we can take the coefficients of $v_{r+1}, \ldots, v_k$ to all be zero.

With regard to the first fact, note that the zero vector is a multiple of any vector, so it is collinear with any other vector. Hence facts 1 and 2 are consistent with each other.
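The first two facts are easy to verify computationally; here is a sketch with sympy (assumed available) and hypothetical vectors. A two-column matrix has linearly dependent columns exactly when its rank is less than 2.

```python
from sympy import Matrix

# Fact 1: two vectors are dependent iff they are collinear.
v = Matrix([2, -4, 6])
w = Matrix([-1, 2, -3])             # w = -v/2, so collinear
print(Matrix.hstack(v, w).rank())   # 1, so {v, w} is linearly dependent

# Fact 2: any set containing the zero vector is dependent.
zero = Matrix([0, 0, 0])
print(len(Matrix.hstack(v, zero).nullspace()) > 0)   # True
```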

Subsection 3.5.2 Criteria for Linear Independence

In this subsection we give several criteria for a set of vectors to be linearly (in)dependent. Keep in mind, however, that the actual definition is above.

Theorem

A set of vectors is linearly dependent if and only if one of the vectors is in the span of the other ones.

Suppose, for instance, that $v_3$ is in $\operatorname{Span}\{v_1, v_2, v_4\}$, so we have an equation like

$$v_3 = 2v_1 - \tfrac{1}{2}v_2 + 6v_4.$$

We can subtract $v_3$ from both sides of the equation to get

$$0 = 2v_1 - \tfrac{1}{2}v_2 - v_3 + 6v_4.$$

This is a linear dependence relation.

In the other direction, if we have a linear dependence relation like

$$0 = 2v_1 - \tfrac{1}{2}v_2 - v_3 + 6v_4,$$

then we can move any nonzero term to the left side of the equation and divide by its coefficient:

$$v_3 = 2v_1 - \tfrac{1}{2}v_2 + 6v_4.$$

This shows that $v_3$ is in $\operatorname{Span}\{v_1, v_2, v_4\}$.

We leave it to the reader to generalize this proof for any set of vectors.

Warning

In a linearly dependent set it is not generally true that any vector is in the span of the others, only that at least one of them is. See this figure below.

Theorem

A set of vectors is linearly dependent if and only if we can remove one of the vectors without shrinking the span.

If $\{v_1, v_2, v_3\}$ is linearly dependent, then we know from the above theorem that one vector is in the span of the others; for instance, suppose that

$$v_3 = av_1 + bv_2$$

for some scalars $a$ and $b$.

In this case, any linear combination of $v_1, v_2, v_3$ is already a linear combination of $v_1, v_2$:

$$x_1v_1 + x_2v_2 + x_3v_3 = x_1v_1 + x_2v_2 + x_3(av_1 + bv_2) = (x_1 + ax_3)v_1 + (x_2 + bx_3)v_2.$$

Therefore, $\operatorname{Span}\{v_1, v_2, v_3\}$ is contained in $\operatorname{Span}\{v_1, v_2\}$. Any linear combination of $v_1, v_2$ is also a linear combination of $v_1, v_2, v_3$ (with the $v_3$-coefficient equal to zero), so $\operatorname{Span}\{v_1, v_2\}$ is also contained in $\operatorname{Span}\{v_1, v_2, v_3\}$, and thus they are equal.

In the other direction, suppose that we can remove $v_3$ without shrinking the span of $\{v_1, v_2, v_3\}$; in other words, $\operatorname{Span}\{v_1, v_2\} = \operatorname{Span}\{v_1, v_2, v_3\}$. Since $v_3$ is in $\operatorname{Span}\{v_1, v_2, v_3\}$ (the $v_3$-coefficient is 1 and the rest are 0), this means that $v_3$ is in $\operatorname{Span}\{v_1, v_2\}$, so the vectors are linearly dependent by the previous theorem.

We leave it to the reader to generalize this proof for any set of vectors.

The previous theorem makes precise in what sense a set of linearly dependent vectors is redundant.

Increasing Span Criterion

A set of vectors $\{v_1, v_2, \ldots, v_k\}$ is linearly independent if and only if, for every $j$, the vector $v_j$ is not in $\operatorname{Span}\{v_1, v_2, \ldots, v_{j-1}\}$.

It is equivalent to show that $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent if and only if $v_j$ is in $\operatorname{Span}\{v_1, v_2, \ldots, v_{j-1}\}$ for some $j$. The “if” implication is an immediate consequence of the previous theorem. Suppose then that $\{v_1, v_2, \ldots, v_k\}$ is linearly dependent. This means that some $v_j$ is in the span of the others. Choose the largest such $j$. We claim that this $v_j$ is in $\operatorname{Span}\{v_1, v_2, \ldots, v_{j-1}\}$. If not, then

$$v_j = x_1v_1 + \cdots + x_{j-1}v_{j-1} + x_{j+1}v_{j+1} + \cdots + x_kv_k,$$

with not all of $x_{j+1}, \ldots, x_k$ equal to zero. Suppose for simplicity that $x_k \neq 0$. Then we can rearrange:

$$v_k = \frac{1}{x_k}\bigl(-x_1v_1 - \cdots - x_{j-1}v_{j-1} + v_j - x_{j+1}v_{j+1} - \cdots - x_{k-1}v_{k-1}\bigr).$$

This says that $v_k$ is in the span of $v_1, \ldots, v_{k-1}$, which contradicts our assumption that $v_j$ is the last vector in the span of the others.

If you build a set of vectors by adding one vector at a time, and if the span gets bigger every time you add a vector, then your set is linearly independent.
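This criterion translates directly into a rank computation; here is a sketch in Python with sympy (assumed available), where `independent_by_increasing_span` is a hypothetical helper name: a new vector enlarges the span exactly when appending it as a column increases the rank.

```python
from sympy import Matrix

def independent_by_increasing_span(vectors):
    """Add one vector at a time; the set is independent iff the
    span (measured by the rank) grows at every step."""
    A, rank = None, 0
    for v in vectors:
        A = v if A is None else Matrix.hstack(A, v)
        new_rank = A.rank()
        if new_rank == rank:          # span did not get bigger
            return False
        rank = new_rank
    return True

print(independent_by_increasing_span(
    [Matrix([1, 0, 0]), Matrix([1, 1, 0]), Matrix([1, 1, 1])]))   # True
print(independent_by_increasing_span(
    [Matrix([1, 0, 0]), Matrix([0, 1, 0]), Matrix([1, 1, 0])]))   # False
```

In the second call the third vector is the sum of the first two, so the rank stays at 2 and the function reports dependence.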

Subsection 3.5.3 Pictures of Linear Independence

A set containing one vector $\{v\}$ is linearly independent when $v \neq 0$, since $xv = 0$ implies $x = 0$.

A set of two noncollinear vectors is linearly independent:

Neither is in the span of the other, so we can apply the first criterion.

The two vectors below are linearly independent because they are not collinear.

The three vectors below are linearly independent: the span got bigger when we added $v_2$, then again when we added $v_3$, so we can apply the increasing span criterion.

The three coplanar vectors below are linearly dependent:

Note that three vectors are linearly dependent if and only if they are coplanar. Indeed, $\{v_1, v_2, v_3\}$ is linearly dependent if and only if one vector is in the span of the other two, which is a plane (or a line, or $\{0\}$).

The four vectors below are linearly dependent: they are the columns of a wide matrix. Note however that $v_4$ is not contained in $\operatorname{Span}\{v_1, v_2, v_3\}$. See the warning above.

Subsection 3.5.4 Linear Dependence and Free Variables

In light of this theorem and this criterion, it is natural to ask which columns of a matrix are “redundant”, i.e., which we can remove without affecting the column span.

Theorem

Let $v_1, v_2, \ldots, v_k$ be vectors in $\mathbb{R}^n$, and consider the matrix

$$A = \begin{pmatrix} | & | & & | \\ v_1 & v_2 & \cdots & v_k \\ | & | & & | \end{pmatrix}.$$

Then we can delete the columns of $A$ without pivots (the columns corresponding to the free variables), without changing $\operatorname{Span}\{v_1, v_2, \ldots, v_k\}$.

The pivot columns are linearly independent, so we cannot delete any more columns.

If the matrix is in reduced row echelon form, for instance

$$A = \begin{pmatrix} 1 & 0 & 2 & 0 \\ 0 & 1 & 3 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix},$$

then the column without a pivot is visibly in the span of the pivot columns:

$$\begin{pmatrix} 2 \\ 3 \\ 0 \end{pmatrix} = 2\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} + 3\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} + 0\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix},$$

and the pivot columns are linearly independent:

$$0 = x_1\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} + x_2\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} + x_4\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} x_1 \\ x_2 \\ x_4 \end{pmatrix} \implies x_1 = x_2 = x_4 = 0.$$

If the matrix is not in reduced row echelon form, then we row reduce to obtain a matrix $A'$ in reduced row echelon form, with columns $v_1', v_2', \ldots, v_k'$.

The following two vector equations have the same solution set, as they come from row-equivalent matrices:

$$x_1v_1 + x_2v_2 + \cdots + x_kv_k = 0 \qquad\text{and}\qquad x_1v_1' + x_2v_2' + \cdots + x_kv_k' = 0.$$

If, say, $v_3' = 2v_1' + 3v_2'$ as in the example above, we conclude that

$$v_3 = 2v_1 + 3v_2,$$

and that

$$x_1v_1 + x_2v_2 + x_4v_4 = 0$$

has only the trivial solution.

Note that it is necessary to row reduce $A$ to find which of its columns are the pivot columns. However, the span of the columns of the row reduced matrix is generally not equal to the span of the columns of $A$: one must delete the non-pivot columns of the original matrix. See the theorem in Section 3.7 for a restatement of the above theorem.
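The theorem can be checked mechanically; a sketch in Python with sympy (assumed available) on a hypothetical matrix: `Matrix.rref` returns the pivot column indices, and keeping only those columns of the original matrix preserves the column span (the pivot columns span the column space and have the same rank).

```python
from sympy import Matrix

# Hypothetical matrix whose third column turns out to have no pivot.
A = Matrix([[1, 2, 4, 0],
            [0, 1, 1, 1],
            [1, 3, 5, 2]])

rref, pivots = A.rref()
print(pivots)                 # indices of the pivot columns of A

# Keep only the pivot columns of the ORIGINAL matrix A.
P = A[:, list(pivots)]

# Same rank, and Span(pivot columns) is contained in Span(all columns),
# so deleting the non-pivot columns did not change the span.
print(P.rank() == A.rank())   # True
# The pivot columns are linearly independent: trivial null space.
print(P.nullspace() == [])    # True
```

As the text warns, `P` is built from the columns of `A` itself, not from the columns of `rref`.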