Learn to interpret similar matrices geometrically.
Understand the relationship between the eigenvalues, eigenvectors, and characteristic polynomials of similar matrices.
Recipe: compute $Ax$ in terms of $B$ and $C$ for $A = CBC^{-1}$.
Picture: the geometry of similar matrices.
Some matrices are easy to understand. For instance, a diagonal matrix $D = \begin{pmatrix}d_1&0\\0&d_2\end{pmatrix}$ just scales the coordinates of a vector: $D\begin{pmatrix}x\\y\end{pmatrix} = \begin{pmatrix}d_1x\\d_2y\end{pmatrix}$. The purpose of most of the rest of this chapter is to understand complicated-looking matrices by analyzing to what extent they “behave like” simple matrices. For instance, suppose a matrix $A$ has eigenvalues $\lambda_1$ and $\lambda_2$ with corresponding eigenvectors $v_1$ and $v_2$. Notice that then $A = CDC^{-1}$, where $C$ is the matrix with columns $v_1, v_2$ and $D$ is the diagonal matrix with diagonal entries $\lambda_1, \lambda_2$.
Using $\mathcal{B}$-coordinates, where $\mathcal{B} = \{v_1, v_2\}$, instead of the usual coordinates makes $A$ “behave” like a diagonal matrix.
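This diagonalization pattern can be sanity-checked numerically. The sketch below uses made-up matrices (not the example from the text): we build $A = CDC^{-1}$ from a hypothetical eigenvector matrix $C$ and diagonal $D$, then confirm that the columns of $C$ really are eigenvectors of $A$ with the diagonal entries of $D$ as eigenvalues.

```python
# Hypothetical 2x2 example, chosen only for illustration.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(X, v):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [sum(X[i][k] * v[k] for k in range(2)) for i in range(2)]

def inv2(X):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = X
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

C = [[1, 1], [1, -1]]   # columns are the eigenvectors v1 = (1,1), v2 = (1,-1)
D = [[3, 0], [0, 1]]    # diagonal matrix of the eigenvalues 3 and 1
A = matmul(matmul(C, D), inv2(C))   # A = C D C^(-1)

# A has v1 and v2 as eigenvectors, with eigenvalues 3 and 1:
assert matvec(A, [1, 1]) == [3.0, 3.0]    # A v1 = 3 v1
assert matvec(A, [1, -1]) == [1.0, -1.0]  # A v2 = 1 v2
```

Here $A$ itself ($\begin{pmatrix}2&1\\1&2\end{pmatrix}$ in this sketch) looks less transparent than $D$, but in the coordinates given by $v_1, v_2$ it scales by $3$ and $1$, exactly like $D$.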
The other case of particular importance will be matrices that “behave” like a rotation matrix: indeed, this will be crucial for understanding Section 6.5 geometrically.
In this section, we study in detail the situation when two matrices behave similarly with respect to different coordinate systems. In Section 6.4 and Section 6.5, we will show how to use eigenvalues and eigenvectors to find a simpler matrix that behaves like a given matrix.
We begin with the algebraic definition of similarity.
Two $n\times n$ matrices $A$ and $B$ are similar if there exists an invertible $n\times n$ matrix $C$ such that $A = CBC^{-1}$.
Similarity is a very interesting construction when viewed geometrically. We will see that, roughly, similar matrices do the same thing in different coordinate systems. The reader might want to review $\mathcal{B}$-coordinates and nonstandard coordinate grids in Section 3.8 before reading this subsection.
By the invertible matrix theorem in Section 6.1, an $n\times n$ matrix $C$ is invertible if and only if its columns $v_1, v_2, \ldots, v_n$ form a basis for $\mathbb{R}^n$. This means we can speak of the $\mathcal{B}$-coordinates of a vector in $\mathbb{R}^n$, where $\mathcal{B}$ is the basis of columns of $C$. Recall that $[x]_{\mathcal{B}} = \begin{pmatrix}c_1\\ \vdots\\ c_n\end{pmatrix}$ means $x = c_1v_1 + c_2v_2 + \cdots + c_nv_n$.
Since $C$ is the matrix with columns $v_1, v_2, \ldots, v_n$, this says that $x = C[x]_{\mathcal{B}}$. Multiplying both sides by $C^{-1}$ gives $C^{-1}x = [x]_{\mathcal{B}}$. To summarize:
Let $C$ be an invertible $n\times n$ matrix with columns $v_1, v_2, \ldots, v_n$, and let $\mathcal{B} = \{v_1, v_2, \ldots, v_n\}$, a basis for $\mathbb{R}^n$. Then for any $x$ in $\mathbb{R}^n$, we have $C[x]_{\mathcal{B}} = x$ and $C^{-1}x = [x]_{\mathcal{B}}$.
This says that $C$ changes from the $\mathcal{B}$-coordinates to the usual coordinates, and $C^{-1}$ changes from the usual coordinates to the $\mathcal{B}$-coordinates.
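As a quick numeric illustration (the basis here is hypothetical, chosen only for this sketch), one can check that multiplying by $C$ turns $\mathcal{B}$-coordinates into usual coordinates, and multiplying by $C^{-1}$ undoes it:

```python
# Hypothetical basis of R^2, for illustration only.

def matvec(X, v):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [sum(X[i][k] * v[k] for k in range(2)) for i in range(2)]

def inv2(X):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = X
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

C = [[2, 1], [0, 1]]   # columns v1 = (2,0), v2 = (1,1) form a basis of R^2
x_B = [3, -1]          # B-coordinates of some vector x

x = matvec(C, x_B)     # x = C [x]_B = 3*v1 - 1*v2
assert x == [5, -1]    # 3*(2,0) - (1,1) = (5,-1)

# Multiplying by C^(-1) recovers the B-coordinates:
assert matvec(inv2(C), x) == [3.0, -1.0]
```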
Suppose that $A = CBC^{-1}$. The above observation gives us another way of computing $Ax$ for a vector $x$ in $\mathbb{R}^n$. Recall that $Ax = CBC^{-1}x$, so that multiplying by $A$ means first multiplying by $C^{-1}$, then by $B$, then by $C$. See this example in Section 4.4.
Recipe: Computing $Ax$ in terms of $B$
Suppose that $A = CBC^{-1}$, where $C$ is an invertible matrix with columns $v_1, v_2, \ldots, v_n$. Let $\mathcal{B} = \{v_1, v_2, \ldots, v_n\}$, a basis for $\mathbb{R}^n$. Let $x$ be a vector in $\mathbb{R}^n$. To compute $Ax$, one does the following:
1. Multiply $x$ by $C^{-1}$, which changes to the $\mathcal{B}$-coordinates: $[x]_{\mathcal{B}} = C^{-1}x$.
2. Multiply this by $B$: $B[x]_{\mathcal{B}} = BC^{-1}x$.
3. Interpreting this vector as a $\mathcal{B}$-coordinate vector, we multiply it by $C$ to change back to the usual coordinates: $Ax = CB[x]_{\mathcal{B}} = CBC^{-1}x$.
To summarize: if $A = CBC^{-1}$, then $A$ and $B$ do the same thing, only in different coordinate systems.
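The three-step recipe can be sketched in code. All matrices below are invented for illustration; the point is only that applying $C^{-1}$, then $B$, then $C$ reproduces multiplication by $A$:

```python
# Hypothetical A = C B C^(-1), for illustration only.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(X, v):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [sum(X[i][k] * v[k] for k in range(2)) for i in range(2)]

def inv2(X):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = X
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

C = [[1, 1], [1, -1]]
B = [[3, 0], [0, 1]]
A = matmul(matmul(C, B), inv2(C))   # A = C B C^(-1)

x = [2, 0]
step1 = matvec(inv2(C), x)  # [x]_B = C^(-1) x : change to B-coordinates
step2 = matvec(B, step1)    # B [x]_B          : apply the simple matrix B
step3 = matvec(C, step2)    # C B [x]_B        : change back to usual coords

assert step3 == matvec(A, x)  # same answer as multiplying by A directly
```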
The following example is the heart of this section.
The converse of the fact that similar matrices have the same characteristic polynomial is false. Indeed, the matrices $\begin{pmatrix}1&0\\0&1\end{pmatrix}$ and $\begin{pmatrix}1&1\\0&1\end{pmatrix}$
both have characteristic polynomial $f(\lambda) = (\lambda-1)^2$, but they are not similar, because the only matrix that is similar to the identity matrix $I_2$ is $I_2$ itself: $CI_2C^{-1} = CC^{-1} = I_2$ for every invertible $C$.
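One can verify the shared characteristic polynomial numerically: for a $2\times 2$ matrix $M$ the characteristic polynomial is $\lambda^2 - \operatorname{tr}(M)\,\lambda + \det(M)$, so matching trace and determinant means matching characteristic polynomials.

```python
def char_poly_coeffs(M):
    """Coefficients (1, -trace, det) of lambda^2 - tr(M)*lambda + det(M)."""
    (a, b), (c, d) = M
    return (1, -(a + d), a * d - b * c)

I2 = [[1, 0], [0, 1]]   # identity matrix
S = [[1, 1], [0, 1]]    # shear matrix

# Both have characteristic polynomial (lambda - 1)^2 = lambda^2 - 2*lambda + 1:
assert char_poly_coeffs(I2) == char_poly_coeffs(S) == (1, -2, 1)

# Yet they are not similar: C @ I2 @ C^(-1) = I2 for every invertible C,
# so the only matrix similar to the identity is the identity itself.
```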
Given that similar matrices have the same eigenvalues, one might guess that they have the same eigenvectors as well. Upon reflection, this is not what one should expect: indeed, the eigenvectors should only match up after changing from one coordinate system to another. This is the content of the next fact, remembering that $C$ and $C^{-1}$ change between the usual coordinates and the $\mathcal{B}$-coordinates.
Suppose that $A = CBC^{-1}$. Then:
The eigenvalues of $A$ and of $B$ are the same.
Suppose that $v$ is an eigenvector of $B$ with eigenvalue $\lambda$, so that $Bv = \lambda v$. Then
$$A(Cv) = (CBC^{-1})(Cv) = CB(C^{-1}C)v = CBv = C(\lambda v) = \lambda(Cv),$$
so that $Cv$ is an eigenvector of $A$ with eigenvalue $\lambda$. Likewise, if $w$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $Aw = \lambda w$, and we have
$$B(C^{-1}w) = (C^{-1}AC)(C^{-1}w) = C^{-1}A(CC^{-1})w = C^{-1}Aw = C^{-1}(\lambda w) = \lambda(C^{-1}w),$$
so that $C^{-1}w$ is an eigenvector of $B$ with eigenvalue $\lambda$.
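The eigenvector computation above can be checked numerically with hypothetical matrices: if $Bv = \lambda v$ and $A = CBC^{-1}$, then $A(Cv) = \lambda(Cv)$.

```python
# Hypothetical matrices, for illustration only.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(X, v):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [sum(X[i][k] * v[k] for k in range(2)) for i in range(2)]

def inv2(X):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = X
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

C = [[1, 2], [0, 1]]
B = [[3, 0], [0, 1]]   # v = e1 = (1,0) is an eigenvector of B with eigenvalue 3
A = matmul(matmul(C, B), inv2(C))   # A = C B C^(-1)

v = [1, 0]
Cv = matvec(C, v)                          # C v is the first column of C
assert matvec(A, Cv) == [3 * t for t in Cv]  # A (C v) = 3 (C v)
```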
If $A = CBC^{-1}$, then $C$ takes the $\lambda$-eigenspace of $B$ to the $\lambda$-eigenspace of $A$, and $C^{-1}$ takes the $\lambda$-eigenspace of $A$ to the $\lambda$-eigenspace of $B$.
This means that the $x$-axis is the $\lambda_1$-eigenspace of $B$, and the $y$-axis is the $\lambda_2$-eigenspace of $B$; likewise, the “$v_1$-axis” is the $\lambda_1$-eigenspace of $A$, and the “$v_2$-axis” is the $\lambda_2$-eigenspace of $A$. This is consistent with the fact, as multiplication by $C$ changes the $x$-axis into the “$v_1$-axis” and the $y$-axis into the “$v_2$-axis”.