Understand the relationship between matrix products and compositions of matrix transformations.

Become comfortable doing basic algebra involving matrices.

Recipe: matrix multiplication (two ways).

Picture: composition of transformations.

Vocabulary word: composition.

In this section, we study compositions of transformations: that is, chaining transformations together. The composition of matrix transformations corresponds to a notion of multiplying two matrices together. We also discuss addition and scalar multiplication of transformations and of matrices.

Subsection 4.4.1 Transformation algebra

In this subsection we describe three operations that one can perform on transformations: addition, scalar multiplication, and composition. In the next subsection, we will translate these operations into the language of matrices, for matrix transformations.

Definition

Let T, U: R^n → R^m be two transformations. Their sum is the transformation T + U: R^n → R^m defined by

(T + U)(x) = T(x) + U(x).

Note that addition of transformations is only defined when both transformations have the same domain and codomain.

Let T: R^n → R^m be a transformation, and let c be a scalar. The scalar product of c with T is the transformation cT: R^n → R^m defined by

(cT)(x) = c T(x).

To be clear: the sum of two transformations T, U is another transformation, called T + U; its value on an input vector x is the sum of the outputs of T and U. Similarly, the product of T with a scalar c is another transformation, called cT; its value on an input vector x is the vector c T(x).
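As a concrete sketch (our own illustration, not from the text), the sum and scalar product of transformations can be modeled in Python as plain functions on tuples; T and U below are made-up example transformations on R^2.

```python
# Two illustrative transformations R^2 -> R^2 (the names T, U are ours).
def T(x):
    return (2 * x[0], x[1])     # stretch the first coordinate

def U(x):
    return (x[1], x[0])         # swap the coordinates

def add(T, U):
    # (T + U)(x) = T(x) + U(x): sum the outputs entrywise
    return lambda x: tuple(t + u for t, u in zip(T(x), U(x)))

def scale(c, T):
    # (cT)(x) = c * T(x): scale the output
    return lambda x: tuple(c * t for t in T(x))

print(add(T, U)((1, 2)))    # T((1,2)) = (2,2), U((1,2)) = (2,1), so (4, 3)
print(scale(3, T)((1, 2)))  # 3 * (2,2) = (6, 6)
```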

Let T, U, V: R^n → R^m be transformations and let c, d be scalars. The following properties are easily verified:

T + U = U + T
T + (U + V) = (T + U) + V
c(T + U) = cT + cU
(c + d)T = cT + dT
c(dT) = (cd)T
T + 0 = T

In one of the above properties, we used 0 to denote the transformation R^n → R^m that is zero on every input vector: 0(x) = 0 for all x. This is called the zero transformation.

Definition

Let T: R^n → R^m and U: R^p → R^n be transformations. Their composition is the transformation T ∘ U: R^p → R^m defined by

(T ∘ U)(x) = T(U(x)).

Composing two transformations means chaining them together: T ∘ U is the transformation that first applies U, then applies T (note the order of operations). More precisely, to evaluate T ∘ U on an input vector x, first you evaluate U(x), then you take this output vector of U and use it as an input vector of T: that is, (T ∘ U)(x) = T(U(x)). Of course, this only makes sense when the outputs of U are valid inputs of T.

Here is a picture of the composition T ∘ U as a “machine” that first runs U, then takes its output and feeds it into T; there is a similar picture in this subsection in Section 4.1.
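The “machine” picture can be sketched in Python (an illustration of ours, with made-up transformations): compose(T, U) runs U first and feeds its output into T, and swapping the order generally changes the result.

```python
def compose(T, U):
    # (T o U)(x) = T(U(x)): U runs first, then T
    return lambda x: T(U(x))

# Illustrative transformations on R^2 (our own names, not from the text)
def T(x):
    return (x[0], 0)        # project onto the first axis

def U(x):
    return (x[1], x[0])     # swap the coordinates

print(compose(T, U)((1, 2)))  # T(U((1,2))) = T((2,1)) = (2, 0)
print(compose(U, T)((1, 2)))  # U(T((1,2))) = U((1,0)) = (0, 1): order matters
```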

Domain and codomain of a composition

In order for T ∘ U to be defined, the codomain of U must equal the domain of T.

Recall from this definition in Section 4.1 that the identity transformation is the transformation Id: R^n → R^n defined by Id(x) = x for every vector x.

Properties of composition

Let be transformations and let be a scalar. Suppose that and that in each of the following identities, the domains and the codomains are compatible when necessary for the composition to be defined. The following properties are easily verified:

The final property is called associativity; it simply says that

(S ∘ (T ∘ U))(x) = S(T(U(x))) = ((S ∘ T) ∘ U)(x).

In other words, both S ∘ (T ∘ U) and (S ∘ T) ∘ U are the transformation defined by first applying U, then T, then S.

Composition of transformations is not commutative in general. That is, in general, T ∘ U ≠ U ∘ T, even when both compositions are defined.

Subsection 4.4.2 Matrix algebra

In this subsection, we translate the algebra of linear transformations from the previous subsection into the language of matrices. First we need some terminology.

Notation

Let A be an m × n matrix. We will generally write a_ij for the entry in the ith row and the jth column. It is called the i,j entry of the matrix.

Definition

The sum of two m × n matrices A and B is the m × n matrix A + B obtained by summing the entries of A and B individually:

(A + B)_ij = a_ij + b_ij.

In other words, the i,j entry of A + B is the sum of the i,j entries of A and B. Note that addition of matrices is only defined when both matrices have the same dimensions.

The scalar product of a scalar c with a matrix A is the matrix cA obtained by scaling all entries of A by c:

(cA)_ij = c a_ij.

In other words, the i,j entry of cA is c times the i,j entry of A.
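These entrywise definitions are easy to spot-check numerically; here is a quick sketch in NumPy (the matrices are arbitrary examples of ours).

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[6, 5, 4],
              [3, 2, 1]])

print(A + B)   # entrywise sum: every entry here happens to be 7
print(2 * A)   # scalar product: every entry of A doubled
```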

Fact

Let T, U: R^n → R^m be linear transformations with standard matrices A, B, respectively, and let c be a scalar.

The standard matrix for T + U is A + B.

The standard matrix for cT is cA.

In view of the above fact, the following properties are consequences of the corresponding properties of transformations. They are easily verified directly from the definitions as well.

Properties of addition and scalar multiplication

Let A, B, C be m × n matrices and let c, d be scalars. Then:

A + B = B + A
(A + B) + C = A + (B + C)
c(A + B) = cA + cB
(c + d)A = cA + dA
c(dA) = (cd)A
A + 0 = A

In one of the above properties, we used 0 to denote the m × n matrix whose entries are all zero. This is the standard matrix of the zero transformation, and is called the zero matrix.

Definition (Matrix multiplication)

Let A be an m × n matrix and let B be an n × p matrix. Denote the columns of B by v_1, v_2, ..., v_p, so that B = ( v_1 v_2 ··· v_p ).

The product AB is the m × p matrix with columns Av_1, Av_2, ..., Av_p:

AB = ( Av_1 Av_2 ··· Av_p ).

In other words, matrix multiplication is defined column-by-column, or we say that “A distributes over the columns of B.”

In order for the vectors Av_1, Av_2, ..., Av_p to be defined, the number of rows of B has to equal the number of columns of A.

Dimensions of the matrix product

In order for AB to be defined, the number of rows of B has to equal the number of columns of A.

The product of an m × n matrix and an n × p matrix is an m × p matrix.

If B has only one column, then AB also has only one column. A matrix with one column is the same as a vector, so the definition of the matrix product AB generalizes the definition of the matrix-vector product Ax.
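The column-by-column definition can be checked directly in NumPy (the example matrices are ours): stacking the products A v_j reproduces the built-in product A @ B.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])          # 3 x 2
B = np.array([[1, 0, 2],
              [0, 1, 3]])       # 2 x 3, columns v_1, v_2, v_3

# Compute A v_j for each column v_j of B, then stack the results
cols = [A @ B[:, j] for j in range(B.shape[1])]
AB = np.column_stack(cols)      # a 3 x 3 matrix

print(np.array_equal(AB, A @ B))   # True
```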

If A is a square matrix, then we can multiply it by itself; we define its powers to be

A^2 = AA,  A^3 = AAA,  and in general  A^n = AA···A (n factors).
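Powers of a square matrix are just repeated products; NumPy's np.linalg.matrix_power agrees with multiplying out by hand (the example matrix is ours).

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])

print(A @ A)   # A^2: squaring this matrix gives [[1, 2], [0, 1]]
print(np.array_equal(np.linalg.matrix_power(A, 3), A @ A @ A))  # True
```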

The row-column rule for matrix multiplication

Recall from this definition in Section 3.3 that the product of the row vector ( a_1 a_2 ··· a_n ) with the column vector with entries x_1, x_2, ..., x_n is the scalar

a_1 x_1 + a_2 x_2 + ··· + a_n x_n.

The following procedure for finding the matrix product is much better adapted to computations by hand; the previous definition is more suitable for proving the theorem below.

Recipe: The row-column rule for matrix multiplication

Let A be an m × n matrix, let B be an n × p matrix, and let C = AB. Then the i,j entry of C is the ith row of A times the jth column of B:

c_ij = a_i1 b_1j + a_i2 b_2j + ··· + a_in b_nj.
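The recipe above can be written out directly; the sketch below (our own code) computes each entry as row i of A times column j of B and agrees with NumPy's built-in product.

```python
import numpy as np

def matmul_rc(A, B):
    # Row-column rule: c_ij = a_i1 b_1j + a_i2 b_2j + ... + a_in b_nj
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "the number of rows of B must equal the number of columns of A"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))
    return C

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print(np.array_equal(matmul_rc(A, B), A @ B))   # True
```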

Subsection 4.4.3 Composition and Matrix Multiplication

The point of this subsection is to show that matrix multiplication corresponds to composition of transformations.

Theorem

Let T: R^n → R^m and U: R^p → R^n be linear transformations, and let A and B be their standard matrices, respectively, so A is an m × n matrix and B is an n × p matrix. Then T ∘ U: R^p → R^m is a linear transformation, and its standard matrix is the product AB.

Now that we know that T ∘ U is linear, it makes sense to compute its standard matrix. Let C be the standard matrix of T ∘ U, so T(x) = Ax, U(x) = Bx, and (T ∘ U)(x) = Cx. By this theorem in Section 4.3, the first column of C is (T ∘ U)(e_1), and the first column of B is Be_1 = U(e_1). We have

(T ∘ U)(e_1) = T(U(e_1)) = T(Be_1) = A(Be_1).

By definition, the first column of the product AB is the product of A with the first column of B, which is Be_1, so

Ce_1 = (T ∘ U)(e_1) = A(Be_1) = (AB)e_1.

It follows that C has the same first column as AB. The same argument applied to the ith standard coordinate vector e_i shows that C and AB have the same ith column; since they have the same columns, they are the same matrix.

The theorem justifies our choice of definition of the matrix product. This is the one and only reason that matrix products are defined in this way. To rephrase:

Products and compositions

The matrix of the composition of two linear transformations is the product of the matrices of the transformations.
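This correspondence is easy to spot-check numerically (the matrices below are illustrative choices of ours): applying U and then T to a vector gives the same result as multiplying by the single matrix AB.

```python
import numpy as np

A = np.array([[0, 1],
              [1, 0]])    # standard matrix of T: swap the coordinates
B = np.array([[2, 0],
              [0, 3]])    # standard matrix of U: scale the axes

x = np.array([1, 1])
# T(U(x)) = A(Bx) versus the one-step product (AB)x
print(np.array_equal(A @ (B @ x), (A @ B) @ x))   # True
```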

Recall from this definition in Section 4.3 that the identity matrix I_n is the n × n matrix whose columns are the standard coordinate vectors e_1, e_2, ..., e_n in R^n. The identity matrix is the standard matrix of the identity transformation: that is, x = Id(x) = I_n x for all vectors x in R^n.

In view of the above theorem, the following properties are consequences of the corresponding properties of transformations.

Properties of matrix multiplication

Let A, B, C be matrices and let c be a scalar. Suppose that A has dimensions m × n, and that in each of the following identities, the dimensions of B and C are compatible when necessary for the product to be defined. Then:

C(A + B) = CA + CB
(A + B)C = AC + BC
c(AB) = (cA)B = A(cB)
A I_n = A
I_m A = A
(AB)C = A(BC)

Most of the above properties are easily verified directly from the definitions. The associativity property (AB)C = A(BC), however, is not (try it!). It is much easier to prove by relating matrix multiplication to composition of transformations, and using the obvious fact that composition of transformations is associative.

Although matrix multiplication satisfies many of the properties one would expect, one must be careful when doing matrix arithmetic, as there are several properties that are not satisfied in general.

Matrix multiplication caveats

Matrix multiplication is not commutative: AB is not usually equal to BA, even when both products are defined and have the same size. See this example.

Matrix multiplication does not satisfy the cancellation law: AB = AC does not imply B = C, even when A ≠ 0. For example, if A = ( 1 0 ; 0 0 ), B = ( 1 2 ; 3 4 ), and C = ( 1 2 ; 5 6 ), then AB = AC = ( 1 2 ; 0 0 ), even though B ≠ C.
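Both caveats are easy to verify numerically; the sketch below (matrices chosen by us) shows a pair of products that disagree, and a nonzero A with AB = AC but B ≠ C.

```python
import numpy as np

A = np.array([[1, 0], [0, 0]])
B = np.array([[1, 2], [3, 4]])
C = np.array([[1, 2], [5, 6]])

# Not commutative: AB = [[1,2],[0,0]] but BA = [[1,0],[3,0]]
print(np.array_equal(A @ B, B @ A))   # False

# No cancellation: AB = AC even though B != C and A != 0
print(np.array_equal(A @ B, A @ C))   # True
```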