Learn to find complex eigenvalues and eigenvectors of a matrix.

Learn to recognize a rotation-scaling matrix, and compute by how much the matrix rotates and scales.

Understand the geometry of $2\times 2$ and $3\times 3$ matrices with a complex eigenvalue.

Recipes: a $2\times 2$ matrix with a complex eigenvalue is similar to a rotation-scaling matrix, the eigenvector trick for $2\times 2$ matrices.

Pictures: the geometry of $2\times 2$ matrices with a complex eigenvalue.

Theorems: the rotation-scaling theorem, the block diagonalization theorem.

Vocabulary word: rotation-scaling matrix.

In Section 5.4, we saw that an $n\times n$ matrix whose characteristic polynomial has $n$ distinct real roots is diagonalizable: it is similar to a diagonal matrix, which is much simpler to analyze. The other possibility is that the characteristic polynomial has complex roots, and that is the focus of this section. It turns out that such a matrix is similar (in the $2\times 2$ case) to a rotation-scaling matrix, which is also relatively easy to understand.

In a certain sense, this entire section is analogous to Section 5.4, with rotation-scaling matrices playing the role of diagonal matrices.

See Appendix A for a review of the complex numbers.

Every $n\times n$ matrix has exactly $n$ complex eigenvalues, counted with multiplicity.

We can compute a corresponding (complex) eigenvector in exactly the same way as before: by row reducing the matrix $A - \lambda I_n$. Now, however, we have to do arithmetic with complex numbers.
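The same computation can be carried out numerically. The following sketch uses numpy's `eig`, which works over the complex numbers automatically; the sample matrix is an illustrative assumption, not one from the text.

```python
import numpy as np

# Illustrative 2x2 matrix: its characteristic polynomial is
# x^2 - 2x + 2, whose roots are the complex conjugates 1 + i and 1 - i.
A = np.array([[1.0, -1.0],
              [1.0,  1.0]])

# np.linalg.eig returns complex eigenvalues and eigenvectors when needed.
eigvals, eigvecs = np.linalg.eig(A)

# Each column of eigvecs is an eigenvector: check A v = lambda v.
for k in range(2):
    lam, v = eigvals[k], eigvecs[:, k]
    assert np.allclose(A @ v, lam * v)
```

Note that `eig` hides the row reduction of $A - \lambda I_n$ behind a library call; the hand computation described above produces an eigenvector that differs from numpy's at most by a complex scalar multiple.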

If $A$ is a matrix with real entries, then its characteristic polynomial has real coefficients, so this note implies that its complex eigenvalues come in conjugate pairs: if $\lambda = a + bi$ is an eigenvalue of $A$, then so is $\bar\lambda = a - bi$.

Moreover, an eigenvector for the conjugate eigenvalue is simply the conjugate eigenvector (the eigenvector obtained by conjugating each entry of the first eigenvector). This is always true. Indeed, if $Av = \lambda v$, then

\[ A\bar v = \overline{Av} = \overline{\lambda v} = \bar\lambda\,\bar v, \]

which exactly says that $\bar v$ is an eigenvector of $A$ with eigenvalue $\bar\lambda$.

Let $A$ be a matrix with real entries. If $\lambda$ is a (complex) eigenvalue of $A$ with eigenvector $v$, then $\bar\lambda$ is an eigenvalue of $A$ with eigenvector $\bar v$.

In other words, both eigenvalues and eigenvectors come in conjugate pairs.
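A quick numerical check of the conjugate-pair fact, using an illustrative rotation matrix (an assumption for the example, not a matrix from the text):

```python
import numpy as np

# A real matrix with non-real eigenvalues: rotation by 90 degrees,
# whose eigenvalues are +i and -i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

lam_all, V = np.linalg.eig(A)
lam, v = lam_all[0], V[:, 0]

# v is an eigenvector for lam ...
assert np.allclose(A @ v, lam * v)
# ... and conjugating every entry of v gives an eigenvector for the
# conjugate eigenvalue, because A has real entries:
# A conj(v) = conj(A v) = conj(lam v) = conj(lam) conj(v).
assert np.allclose(A @ v.conj(), lam.conjugate() * v.conj())
```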

Since it can be tedious to divide by complex numbers while row reducing, it is useful to learn the following trick, which works equally well for matrices with real entries.

Eigenvector Trick for $2\times 2$ Matrices

Let $A$ be a $2\times 2$ matrix, and let $\lambda$ be a (real or complex) eigenvalue. Then

\[ A - \lambda I_2 = \begin{pmatrix} z & w \\ \star & \star \end{pmatrix} \quad\Longrightarrow\quad \begin{pmatrix} -w \\ z \end{pmatrix} \text{ is an eigenvector with eigenvalue } \lambda, \]

assuming the first row of $A - \lambda I_2$ is nonzero.

Indeed, since $\lambda$ is an eigenvalue, we know that $A - \lambda I_2$ is not an invertible matrix. It follows that the rows are collinear (otherwise the determinant is nonzero), so that the second row is automatically a (complex) multiple of the first:

\[ \begin{pmatrix} z & w \\ cz & cw \end{pmatrix}. \]

It is obvious that $\begin{pmatrix} -w \\ z \end{pmatrix}$ is in the null space of this matrix, as is $\begin{pmatrix} w \\ -z \end{pmatrix}$, for that matter. Note that we never had to compute the second row of $A - \lambda I_2$, let alone row reduce!
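The trick is short enough to write as a helper function. The function name and the sample matrix below are illustrative assumptions; the logic is exactly the trick above: read off the first row $(z, w)$ of $A - \lambda I_2$ and return $(-w, z)$.

```python
import numpy as np

def eigenvector_trick(A, lam):
    """Eigenvector of a 2x2 matrix A for eigenvalue lam, read off from
    the first row (z, w) of A - lam*I: the vector (-w, z) is in the
    null space, assuming that row is nonzero."""
    z, w = (A - lam * np.eye(2))[0]
    return np.array([-w, z])

# Illustrative matrix with eigenvalues 1 + i and 1 - i.
A = np.array([[1.0, -1.0],
              [1.0,  1.0]])
lam = 1 + 1j

v = eigenvector_trick(A, lam)
assert np.allclose(A @ v, lam * v)   # an eigenvector, with no row reduction
```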

Different computations can produce different-looking eigenvectors for the same eigenvalue of the same matrix. Such vectors need not look like multiples of each other at first, but since we now have complex numbers at our disposal, we can check that one is a complex scalar multiple of the other.

Subsection 5.5.2 Rotation-Scaling Matrices

The most important examples of $2\times 2$ matrices with complex eigenvalues are rotation-scaling matrices, i.e., scalar multiples of rotation matrices.

Definition

A rotation-scaling matrix is a $2\times 2$ matrix of the form

\[ \begin{pmatrix} a & -b \\ b & a \end{pmatrix}, \]

where $a$ and $b$ are real numbers, not both equal to zero.

The following proposition justifies the name.

Proposition

Let

\[ A = \begin{pmatrix} a & -b \\ b & a \end{pmatrix} \]

be a rotation-scaling matrix. Then:

$A$ is a product of a rotation matrix with a scaling matrix:

\[ A = \underbrace{\begin{pmatrix} r & 0 \\ 0 & r \end{pmatrix}}_{\text{scaling}} \underbrace{\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}}_{\text{rotation}}. \]

The scaling factor is $r = \sqrt{\det A} = \sqrt{a^2 + b^2}$.

The rotation angle $\theta$ is the counterclockwise angle from the positive $x$-axis to the vector $\begin{pmatrix} a \\ b \end{pmatrix}$. The eigenvalues of $A$ are $\lambda = a \pm bi$.

Set $r = \sqrt{a^2 + b^2}$, so that $(a/r)^2 + (b/r)^2 = 1$. In other words, $\begin{pmatrix} a/r \\ b/r \end{pmatrix}$ lies on the unit circle. Therefore, it has the form $\begin{pmatrix} \cos\theta \\ \sin\theta \end{pmatrix}$, where $\theta$ is the counterclockwise angle from the positive $x$-axis to the vector $\begin{pmatrix} a/r \\ b/r \end{pmatrix}$, or since it is on the same line, to $\begin{pmatrix} a \\ b \end{pmatrix}$.

It follows that

\[ A = r\begin{pmatrix} a/r & -b/r \\ b/r & a/r \end{pmatrix} = \begin{pmatrix} r & 0 \\ 0 & r \end{pmatrix}\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \]

as desired.

For the last statement, we compute the eigenvalues of $A$ as the roots of the characteristic polynomial $\lambda^2 - 2a\lambda + (a^2 + b^2)$:

\[ \lambda = \frac{2a \pm \sqrt{4a^2 - 4(a^2 + b^2)}}{2} = a \pm \sqrt{-b^2} = a \pm bi. \]
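The scaling factor and rotation angle are easy to extract numerically. In this sketch the values of $a$ and $b$ are illustrative assumptions; the checks mirror the proposition: $r = \sqrt{a^2+b^2}$, the factorization into scaling times rotation, and $|a + bi| = r$.

```python
import math

# A rotation-scaling matrix [[a, -b], [b, a]] is determined by (a, b);
# these sample values are chosen for illustration.
a, b = 1.0, math.sqrt(3.0)

r = math.hypot(a, b)       # scaling factor sqrt(a^2 + b^2) = 2
theta = math.atan2(b, a)   # counterclockwise angle to the vector (a, b)

# Factorization check: a = r*cos(theta), b = r*sin(theta).
assert math.isclose(r * math.cos(theta), a)
assert math.isclose(r * math.sin(theta), b)

# The eigenvalues are a +- bi, so |eigenvalue| equals the scaling factor.
assert math.isclose(abs(complex(a, b)), r)
```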

Geometrically, a rotation-scaling matrix does exactly what the name says: it rotates and scales (in either order).

When the vector $\begin{pmatrix} a \\ b \end{pmatrix}$ lies in the second or third quadrant, the rotation angle is not equal to $\arctan(b/a)$. The problem is that arctan always outputs values between $-\pi/2$ and $\pi/2$: it does not account for points in the second or third quadrants. This is why we draw a triangle and use its (positive) edge lengths to compute the angle $\theta$.

Alternatively, when $\begin{pmatrix} a \\ b \end{pmatrix}$ lies in the second quadrant, we can observe that the angle in question is $\theta = \pi + \arctan(b/a)$.

When finding the rotation angle of a vector $\begin{pmatrix} a \\ b \end{pmatrix}$, do not blindly compute $\arctan(b/a)$, since this will give the wrong answer when the vector is in the second or third quadrant. Instead, draw a picture.
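In code, the standard remedy is the two-argument arctangent, which keeps track of the signs of both coordinates. The sample vector below is an illustrative assumption, chosen to sit in the second quadrant.

```python
import math

# The vector (a, b) = (-1, 1) lies in the second quadrant.
a, b = -1.0, 1.0

# Blindly computing arctan(b/a) lands in (-pi/2, pi/2): wrong quadrant.
wrong = math.atan(b / a)     # -pi/4, a fourth-quadrant angle
# atan2 uses the signs of both coordinates and returns the true
# counterclockwise angle from the positive x-axis.
right = math.atan2(b, a)     # 3*pi/4

assert math.isclose(right, 3 * math.pi / 4)
assert math.isclose(right, wrong + math.pi)   # off by exactly pi here
```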

Subsection 5.5.3 Geometry of $2\times 2$ Matrices with a Complex Eigenvalue

Let $A$ be a $2\times 2$ matrix with a complex, non-real eigenvalue $\lambda$. Then $A$ also has the eigenvalue $\bar\lambda \neq \lambda$. In particular, $A$ has distinct eigenvalues, so it is diagonalizable using the complex numbers. We often like to think of our matrices as describing transformations of $\mathbb{R}^n$ (as opposed to $\mathbb{C}^n$). Because of this, the following construction is useful. It gives something like a diagonalization, except that all matrices involved have real entries.

Rotation-Scaling Theorem

Let $A$ be a $2\times 2$ real matrix with a complex (non-real) eigenvalue $\lambda$, and let $v$ be an eigenvector. Then $A = CBC^{-1}$ for

\[ C = \begin{pmatrix} \operatorname{Re}(v) & \operatorname{Im}(v) \end{pmatrix} \quad\text{and}\quad B = \begin{pmatrix} \operatorname{Re}(\lambda) & \operatorname{Im}(\lambda) \\ -\operatorname{Im}(\lambda) & \operatorname{Re}(\lambda) \end{pmatrix}. \]

In particular, $A$ is similar to a rotation-scaling matrix, and $A$ scales by a factor of $|\lambda|$.

First we need to show that $\operatorname{Re}(v)$ and $\operatorname{Im}(v)$ are linearly independent, since otherwise $C$ is not invertible. If not, then there exist real numbers $x, y$, not both equal to zero, such that $x\operatorname{Re}(v) + y\operatorname{Im}(v) = 0$. Then

\[ (y + ix)v = (y + ix)\bigl(\operatorname{Re}(v) + i\operatorname{Im}(v)\bigr) = \bigl(y\operatorname{Re}(v) - x\operatorname{Im}(v)\bigr) + i\bigl(x\operatorname{Re}(v) + y\operatorname{Im}(v)\bigr) = y\operatorname{Re}(v) - x\operatorname{Im}(v), \]

so $(y + ix)v$ is a vector with real entries.

Now, $(y + ix)v$ is also an eigenvector of $A$ with eigenvalue $\lambda$, as it is a nonzero scalar multiple of $v$. But we just showed that $(y + ix)v$ is a vector with real entries, and any real eigenvector of a real matrix has a real eigenvalue. Since $\lambda$ is not real, this is a contradiction; therefore, $\operatorname{Re}(v)$ and $\operatorname{Im}(v)$ must be linearly independent after all.

Let $a = \operatorname{Re}(\lambda)$ and $b = \operatorname{Im}(\lambda)$, so $\lambda = a + bi$. We observe that

\[ Av = \lambda v = (a + bi)\bigl(\operatorname{Re}(v) + i\operatorname{Im}(v)\bigr) = \bigl(a\operatorname{Re}(v) - b\operatorname{Im}(v)\bigr) + i\bigl(b\operatorname{Re}(v) + a\operatorname{Im}(v)\bigr). \]

On the other hand, since $A$ has real entries, we have

\[ A\bigl(\operatorname{Re}(v) + i\operatorname{Im}(v)\bigr) = A\operatorname{Re}(v) + iA\operatorname{Im}(v). \]

Matching real and imaginary parts gives

\[ A\operatorname{Re}(v) = a\operatorname{Re}(v) - b\operatorname{Im}(v), \qquad A\operatorname{Im}(v) = b\operatorname{Re}(v) + a\operatorname{Im}(v). \]

Now we compute $CBC^{-1}\operatorname{Re}(v)$ and $CBC^{-1}\operatorname{Im}(v)$. Since $Ce_1 = \operatorname{Re}(v)$ and $Ce_2 = \operatorname{Im}(v)$, we have $C^{-1}\operatorname{Re}(v) = e_1$ and $C^{-1}\operatorname{Im}(v) = e_2$, so

\[ CBC^{-1}\operatorname{Re}(v) = CBe_1 = C\begin{pmatrix} a \\ -b \end{pmatrix} = a\operatorname{Re}(v) - b\operatorname{Im}(v), \]
\[ CBC^{-1}\operatorname{Im}(v) = CBe_2 = C\begin{pmatrix} b \\ a \end{pmatrix} = b\operatorname{Re}(v) + a\operatorname{Im}(v). \]

Therefore, $A\operatorname{Re}(v) = CBC^{-1}\operatorname{Re}(v)$ and $A\operatorname{Im}(v) = CBC^{-1}\operatorname{Im}(v)$.

Since $\operatorname{Re}(v)$ and $\operatorname{Im}(v)$ are linearly independent, they form a basis for $\mathbb{R}^2$. Let $w$ be any vector in $\mathbb{R}^2$, and write $w = c_1\operatorname{Re}(v) + c_2\operatorname{Im}(v)$. Then

\[ Aw = c_1 A\operatorname{Re}(v) + c_2 A\operatorname{Im}(v) = c_1 CBC^{-1}\operatorname{Re}(v) + c_2 CBC^{-1}\operatorname{Im}(v) = CBC^{-1}w. \]

This proves that $A = CBC^{-1}$.

Here $\operatorname{Re}$ and $\operatorname{Im}$ denote the real and imaginary parts, respectively:

\[ \operatorname{Re}(a + bi) = a, \quad \operatorname{Im}(a + bi) = b, \quad \operatorname{Re}\begin{pmatrix} a + bi \\ c + di \end{pmatrix} = \begin{pmatrix} a \\ c \end{pmatrix}, \quad \operatorname{Im}\begin{pmatrix} a + bi \\ c + di \end{pmatrix} = \begin{pmatrix} b \\ d \end{pmatrix}. \]

The rotation-scaling matrix in question is the matrix

\[ B = \begin{pmatrix} a & b \\ -b & a \end{pmatrix} \quad\text{with}\quad a = \operatorname{Re}(\lambda),\ b = \operatorname{Im}(\lambda). \]

By the proposition, the scaling factor of $B$ is $\sqrt{a^2 + b^2} = |\lambda|$.
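The theorem can be verified numerically: build $C$ and $B$ from an eigenpair and check that $CBC^{-1}$ recovers $A$. The matrix below is an illustrative assumption with eigenvalues $1 \pm i$.

```python
import numpy as np

# Illustrative real matrix with a non-real eigenvalue.
A = np.array([[2.0, -1.0],
              [2.0,  0.0]])   # characteristic polynomial x^2 - 2x + 2

lam_all, V = np.linalg.eig(A)
idx = int(np.argmax(lam_all.imag))   # pick the eigenvalue with Im > 0
lam, v = lam_all[idx], V[:, idx]

# C has columns Re(v), Im(v); B is built from Re(lam), Im(lam).
C = np.column_stack([v.real, v.imag])
B = np.array([[lam.real,  lam.imag],
              [-lam.imag, lam.real]])

# The rotation-scaling theorem: A = C B C^{-1}.
assert np.allclose(C @ B @ np.linalg.inv(C), A)
# B scales by |lam|: e.g. the unit vector e1 maps to a vector of length |lam|.
assert np.isclose(np.linalg.norm(B @ np.array([1.0, 0.0])), abs(lam))
```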

Geometrically, the rotation-scaling theorem says that a $2\times 2$ matrix with a complex eigenvalue behaves similarly to a rotation-scaling matrix. See this important note in Section 5.3.

We saw in the above examples that the rotation-scaling theorem can be applied in two different ways to any given matrix: one has to choose one of the two conjugate eigenvalues to work with. Replacing $\lambda$ by $\bar\lambda$ has the effect of replacing $v$ by $\bar v$, which just negates all imaginary parts, so we also have $A = C'B'(C')^{-1}$ for

\[ C' = \begin{pmatrix} \operatorname{Re}(v) & -\operatorname{Im}(v) \end{pmatrix} \quad\text{and}\quad B' = \begin{pmatrix} \operatorname{Re}(\lambda) & -\operatorname{Im}(\lambda) \\ \operatorname{Im}(\lambda) & \operatorname{Re}(\lambda) \end{pmatrix}. \]

The matrices $B$ and $B'$ are similar to each other. The only difference between them is the direction of rotation, since $\begin{pmatrix} \operatorname{Re}(\lambda) \\ -\operatorname{Im}(\lambda) \end{pmatrix}$ and $\begin{pmatrix} \operatorname{Re}(\lambda) \\ \operatorname{Im}(\lambda) \end{pmatrix}$ are mirror images of each other over the $x$-axis.

The discussion that follows is closely analogous to the exposition in this subsection in Section 5.4, in which we studied the dynamics of diagonalizable matrices.

Dynamics of a $2\times 2$ Matrix with a Complex Eigenvalue

Let $A$ be a $2\times 2$ matrix with a complex (non-real) eigenvalue $\lambda$. By the rotation-scaling theorem, the matrix $A$ is similar to a matrix that rotates by some amount and scales by $|\lambda|$. Hence, $A$ rotates around an ellipse and scales by $|\lambda|$. There are three different cases.

When the scaling factor $|\lambda|$ is greater than $1$, vectors tend to get longer, i.e., farther from the origin. In this case, repeatedly multiplying a vector by $A$ makes the vector "spiral out".

When the scaling factor $|\lambda|$ is equal to $1$, vectors do not tend to get longer or shorter. In this case, repeatedly multiplying a vector by $A$ simply "rotates around an ellipse".

When the scaling factor $|\lambda|$ is less than $1$, vectors tend to get shorter, i.e., closer to the origin. In this case, repeatedly multiplying a vector by $A$ makes the vector "spiral in".
At this point, we can write down the "simplest" possible matrix which is similar to any given $2\times 2$ matrix $A$. There are four cases:

$A$ has two real eigenvalues $\lambda_1 \neq \lambda_2$. In this case, $A$ is diagonalizable, so $A$ is similar to the matrix

\[ \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}. \]

This representation is unique up to reordering the eigenvalues.

$A$ has one real eigenvalue $\lambda$ of geometric multiplicity $2$. In this case, we saw in this example in Section 5.4 that $A$ is equal to the matrix

\[ \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix}. \]

$A$ has one real eigenvalue $\lambda$ of geometric multiplicity $1$. In this case, $A$ is not diagonalizable, and we saw in this remark in Section 5.4 that $A$ is similar to the matrix

\[ \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}. \]

$A$ has no real eigenvalues. In this case, $A$ has a complex (non-real) eigenvalue $\lambda$, and $A$ is similar to the rotation-scaling matrix

\[ \begin{pmatrix} \operatorname{Re}(\lambda) & \operatorname{Im}(\lambda) \\ -\operatorname{Im}(\lambda) & \operatorname{Re}(\lambda) \end{pmatrix} \]

by the rotation-scaling theorem. By this proposition, the eigenvalues of a rotation-scaling matrix $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ are $a \pm bi$, so that two rotation-scaling matrices $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ and $\begin{pmatrix} a' & -b' \\ b' & a' \end{pmatrix}$ are similar if and only if $a = a'$ and $b = \pm b'$.

Subsection 5.5.4 Block Diagonalization

For matrices larger than $2\times 2$, there is a theorem that combines the diagonalization theorem in Section 5.4 and the rotation-scaling theorem. It says essentially that a matrix is similar to a matrix with parts that look like a diagonal matrix, and parts that look like a rotation-scaling matrix.

Block Diagonalization Theorem

Let $A$ be a real $n\times n$ matrix. Suppose that for each (real or complex) eigenvalue, the algebraic multiplicity equals the geometric multiplicity. Then $A = CBC^{-1}$, where $B$ and $C$ are as follows:

The matrix $B$ is block diagonal, where the blocks are $1\times 1$ blocks containing the real eigenvalues (with their multiplicities), or $2\times 2$ blocks containing the matrices

\[ \begin{pmatrix} \operatorname{Re}(\lambda) & \operatorname{Im}(\lambda) \\ -\operatorname{Im}(\lambda) & \operatorname{Re}(\lambda) \end{pmatrix} \]

for each non-real eigenvalue $\lambda$ (with multiplicity).

The columns of $C$ form bases for the eigenspaces for the real eigenvalues, or come in pairs $\begin{pmatrix} \operatorname{Re}(v) & \operatorname{Im}(v) \end{pmatrix}$ for the non-real eigenvalues.

Block Diagonalization of a $3\times 3$ Matrix with a Complex Eigenvalue

Let $A$ be a $3\times 3$ matrix with a complex (non-real) eigenvalue $\lambda_1$. Then $\bar\lambda_1$ is another eigenvalue, and there is one real eigenvalue $\lambda_2$. Since there are three distinct eigenvalues, each has algebraic and geometric multiplicity $1$, so the block diagonalization theorem applies to $A$.

Let $v_1$ be a (complex) eigenvector with eigenvalue $\lambda_1$, and let $v_2$ be a (real) eigenvector with eigenvalue $\lambda_2$. Then the block diagonalization theorem says that $A = CBC^{-1}$ for

\[ C = \begin{pmatrix} \operatorname{Re}(v_1) & \operatorname{Im}(v_1) & v_2 \end{pmatrix} \quad\text{and}\quad B = \begin{pmatrix} \operatorname{Re}(\lambda_1) & \operatorname{Im}(\lambda_1) & 0 \\ -\operatorname{Im}(\lambda_1) & \operatorname{Re}(\lambda_1) & 0 \\ 0 & 0 & \lambda_2 \end{pmatrix}. \]
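The $3\times 3$ construction can be checked numerically as well. In this sketch the matrix is an illustrative assumption, built from a known block form and then scrambled by a change of basis so the block structure is no longer visible by eye.

```python
import numpy as np

# Build a 3x3 matrix with eigenvalues 1 + i, 1 - i, and 2, then hide
# the block structure with an invertible change of basis P.
B_true = np.array([[ 1.0, 1.0, 0.0],
                   [-1.0, 1.0, 0.0],
                   [ 0.0, 0.0, 2.0]])
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
A = P @ B_true @ np.linalg.inv(P)

lam_all, V = np.linalg.eig(A)
i = int(np.argmax(lam_all.imag))          # the eigenvalue with Im > 0
j = int(np.argmin(np.abs(lam_all.imag)))  # the real eigenvalue
lam1, v1 = lam_all[i], V[:, i]
lam2, v2 = lam_all[j].real, V[:, j].real

# Assemble C and B exactly as the block diagonalization theorem says.
C = np.column_stack([v1.real, v1.imag, v2])
B = np.array([[ lam1.real, lam1.imag, 0.0],
              [-lam1.imag, lam1.real, 0.0],
              [ 0.0,       0.0,       lam2]])

assert np.allclose(C @ B @ np.linalg.inv(C), A)
```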