You might reasonably wonder: where does this definition come from? And why should I care? We are assuming that you saw at least a basic introduction to eigenvalues in your first course on linear algebra, but that course probably focused on mechanics. Possibly you learned that diagonalizing a matrix lets you compute powers of that matrix.
But why should we be interested in computing powers (in particular, large powers) of a matrix? An important context comes from the study of discrete linear dynamical systems, as well as Markov chains, where the evolution of a state is modelled by repeated multiplication of a vector by a matrix.
When we’re able to diagonalize our matrix using eigenvalues and eigenvectors, not only does it become easy to compute powers of a matrix, it also enables us to see that the entire process is just a linear combination of geometric sequences! If you have completed Section 2.5, you probably will not be surprised to learn that the polynomial roots you found are, in fact, eigenvalues of a suitable matrix.
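To see the idea in action, here is a small computational sketch using a made-up $2\times 2$ matrix (not one from the text): if $A = PDP^{-1}$ with $D$ diagonal, then $A^k = PD^kP^{-1}$, and the entries of $D^k$ are just the geometric sequences $\lambda_i^k$.

from sympy import Matrix

# A hypothetical 2x2 matrix, chosen to be diagonalizable
A = Matrix([[4, -2],
            [1,  1]])

# diagonalize() returns P and D with A = P*D*P**-1
P, D = A.diagonalize()

k = 10
# Powers of a diagonal matrix are easy: raise each diagonal entry to the power k
A_power = P * D**k * P.inv()

# Check against direct computation
assert A_power == A**k
print(D)        # the diagonal entries are the eigenvalues
print(A_power)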
Eigenvalues and eigenvectors can just as easily be defined for a general linear operator $T:V\to V$. In this context, an eigenvector $\mathbf{x}$ is sometimes referred to as a characteristic vector (or characteristic direction) for $T$, since the property $T(\mathbf{x})=\lambda\mathbf{x}$ simply states that the transformed vector $T(\mathbf{x})$ is parallel to the original vector $\mathbf{x}$. Some linear algebra textbooks that focus more on general linear transformations frame this topic in the context of invariant subspaces for a linear operator.
A subspace $U\subseteq V$ is invariant with respect to $T$ if $T(\mathbf{u})\in U$ for all $\mathbf{u}\in U$. Note that if $\mathbf{x}$ is an eigenvector of $T$, then $\operatorname{span}\{\mathbf{x}\}$ is an invariant subspace. To see this, note that if $T(\mathbf{x})=\lambda\mathbf{x}$ and $\mathbf{w}=c\mathbf{x}$, then
$$T(\mathbf{w}) = T(c\mathbf{x}) = cT(\mathbf{x}) = c(\lambda\mathbf{x}) = (c\lambda)\mathbf{x}\in\operatorname{span}\{\mathbf{x}\}.$$
For the matrix , match each vector on the left with the corresponding eigenvalue on the right. (For typographical reasons, column vectors have been transposed.)
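Since the matrix and vectors from this exercise are not reproduced in this excerpt, the sketch below uses stand-in values; the point is simply that checking a proposed pairing amounts to comparing $A\mathbf{x}$ with $\lambda\mathbf{x}$.

from sympy import Matrix

# Stand-in matrix and vector (hypothetical, not the ones from the exercise)
A = Matrix([[2, 1],
            [0, 3]])
x = Matrix([1, 1])
lam = 3

# x is an eigenvector for the eigenvalue lam exactly when A*x equals lam*x
print(A * x == lam * x)   # True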
Note that the null space $\operatorname{null}(A-\lambda I)$ can be defined for any real number $\lambda$, whether or not $\lambda$ is an eigenvalue. However, the eigenvalues of $A$ are distinguished by the property that there is a nonzero solution to (4.1.1). Furthermore, we know that (4.1.1) can only have nontrivial solutions if the matrix $A-\lambda I$ is not invertible. We also know that $A-\lambda I$ is non-invertible if and only if $\det(A-\lambda I)=0$. This gives us the following theorem.
To prove a theorem involving a “the following are equivalent” statement, a good strategy is to show that the first implies the second, the second implies the third, and the third implies the first. The ideas needed for the proof are given in the paragraph preceding the theorem. See if you can turn them into a formal proof.
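To connect the theorem with computation, we can form $A-\lambda I$ symbolically in SymPy and check that its determinant vanishes precisely at the eigenvalues. The matrix below is a made-up example for illustration.

from sympy import Matrix, symbols, eye, solve

lam = symbols('lamda')

# A hypothetical matrix for illustration
A = Matrix([[1, 2],
            [4, 3]])

# The characteristic polynomial det(A - lam*I)
char_poly = (A - lam * eye(2)).det()

# Its roots are the eigenvalues of A
print(solve(char_poly, lam))   # [-1, 5]
print(A.eigenvals())           # the same values, with multiplicities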
A careful study of eigenvalues and eigenvectors relies heavily on polynomials. An interesting fact is that we can plug any square matrix into a polynomial! Given the polynomial $p(x)=a_0+a_1x+a_2x^2+\cdots+a_kx^k$ and an $n\times n$ matrix $A$, we define
$$p(A) = a_0I_n + a_1A + a_2A^2 + \cdots + a_kA^k.$$
Note the use of the identity matrix $I_n$ in the first term, since it doesn't make sense to add a scalar to a matrix.
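As a quick sanity check of the definition (using a hypothetical polynomial and matrix), SymPy lets us carry out the substitution directly.

from sympy import Matrix, eye

# Hypothetical example: p(x) = x**2 - 5*x + 6 and a 2x2 matrix A
A = Matrix([[1, 2],
            [4, 3]])

# p(A) = A**2 - 5*A + 6*I  -- note the identity matrix in the constant term
p_of_A = A**2 - 5*A + 6*eye(2)
print(p_of_A)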
One interesting aspect of this is the relationship between the eigenvalues of $A$ and the eigenvalues of $p(A)$. For example, if $A$ has the eigenvalue $\lambda$, see if you can prove that $p(A)$ has the eigenvalue $p(\lambda)$.
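If you would like a hint, the heart of the argument is the following computation (a sketch, using the fact that $A^j\mathbf{x}=\lambda^j\mathbf{x}$ for an eigenvector $\mathbf{x}$):
$$p(A)\mathbf{x} = (a_0I_n + a_1A + \cdots + a_kA^k)\mathbf{x} = a_0\mathbf{x} + a_1\lambda\mathbf{x} + \cdots + a_k\lambda^k\mathbf{x} = p(\lambda)\mathbf{x}.$$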
In order for certain properties of a matrix $A$ to be satisfied, the eigenvalues of $A$ need to have particular values. Match each property of the matrix $A$ on the left with the corresponding information about the eigenvalues of $A$ on the right. Be sure that you can justify your answers with a suitable proof.
Recall that a matrix $A$ is said to be similar to a matrix $B$ if there exists an invertible matrix $P$ such that $B = P^{-1}AP$. Much of what follows concerns the question of whether or not a given matrix $A$ is diagonalizable.
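Though not needed for the definition, it is worth noting (and easy to check computationally, with made-up matrices) that similar matrices have the same characteristic polynomial, and hence the same eigenvalues; this fact is used in the proof below.

from sympy import Matrix

# Made-up A and invertible P for illustration
A = Matrix([[1, 2],
            [4, 3]])
P = Matrix([[1, 1],
            [0, 1]])

B = P.inv() * A * P
# Similar matrices share the same characteristic polynomial, hence the same eigenvalues
print(A.charpoly())
print(B.charpoly())
print(A.eigenvals() == B.eigenvals())   # True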
The roots of the characteristic polynomial are our eigenvalues; note that the first of these eigenvalues comes from a repeated root. This is typically where things get interesting. If an eigenvalue does not come from a repeated root, then there will only be one (independent) eigenvector that corresponds to it; that is, the corresponding eigenspace is one-dimensional. If an eigenvalue is repeated, it could have more than one independent eigenvector, but this is not guaranteed.
The eigenvects method in SymPy takes a square matrix as input, and outputs a list of tuples, one for each eigenvalue. For a given eigenvalue, the corresponding tuple has the form (eigenvalue, multiplicity, [eigenvectors]), where the multiplicity is the algebraic multiplicity of the eigenvalue, and the final entry is a list of basis vectors for the corresponding eigenspace. Using SymPy to solve Example 4.1.11 looks as follows:
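(The matrix from Example 4.1.11 is not reproduced in this excerpt, so the sketch below uses a stand-in matrix with a repeated eigenvalue; the format of the output is the same.)

from sympy import Matrix

# Stand-in matrix (not the one from Example 4.1.11): the eigenvalue 2 is repeated
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])

# Each entry of the output is (eigenvalue, algebraic multiplicity, [basis of eigenspace])
for eigenvalue, multiplicity, vectors in A.eigenvects():
    print(eigenvalue, multiplicity, vectors)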
Let $\{\mathbf{x}_1,\ldots,\mathbf{x}_k\}$ be a set of linearly independent eigenvectors of an $n\times n$ matrix $A$, with corresponding eigenvalues $\lambda_1,\ldots,\lambda_k$ (not necessarily distinct). Extend this set to a basis $\{\mathbf{x}_1,\ldots,\mathbf{x}_k,\mathbf{x}_{k+1},\ldots,\mathbf{x}_n\}$, and let $P=\begin{bmatrix}\mathbf{x}_1&\cdots&\mathbf{x}_n\end{bmatrix}$ be the matrix whose columns are the basis vectors. (Note that $P$ is necessarily invertible.) Then
$$P^{-1}AP = \begin{bmatrix} D & B \\ 0 & C \end{bmatrix},$$
where $D=\operatorname{diag}(\lambda_1,\ldots,\lambda_k)$ and $B$, $C$ are matrices of the appropriate sizes.
We can use Lemma 4.1.13 to prove that $\dim E_\lambda(A)\le m$, where $E_\lambda(A)$ denotes the eigenspace of $\lambda$ and $m$ is the multiplicity of $\lambda$, as follows. Suppose $\{\mathbf{x}_1,\ldots,\mathbf{x}_k\}$ is a basis for $E_\lambda(A)$. Then this is a linearly independent set of eigenvectors, all with eigenvalue $\lambda$, so our lemma guarantees the existence of an invertible matrix $P$ such that
$$P^{-1}AP = \begin{bmatrix} \lambda I_k & B \\ 0 & C \end{bmatrix}.$$
Let $M = P^{-1}AP$. On the one hand, since $M$ is similar to $A$, the two matrices have the same characteristic polynomial, so $\det(M-xI)=\det(A-xI)$. On the other hand,
$$\det(M-xI) = \det\begin{bmatrix} (\lambda-x)I_k & B \\ 0 & C-xI_{n-k} \end{bmatrix} = (\lambda-x)^k\det(C-xI_{n-k}).$$
This shows that $\det(A-xI)$ is divisible by $(\lambda-x)^k$. Since $m$ is the largest integer such that $\det(A-xI)$ is divisible by $(\lambda-x)^m$, we must have $k=\dim E_\lambda(A)\le m$.
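As a computational illustration of this inequality (with a hypothetical matrix), a repeated eigenvalue can have an eigenspace of smaller dimension than its multiplicity.

from sympy import Matrix

# Hypothetical matrix: the eigenvalue 2 has algebraic multiplicity 2
A = Matrix([[2, 1],
            [0, 2]])

for eigenvalue, multiplicity, vectors in A.eigenvects():
    # len(vectors) is the dimension of the eigenspace (the geometric multiplicity)
    print(eigenvalue, multiplicity, len(vectors))   # 2 2 1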
The proof is by induction on the number of distinct eigenvalues. Since eigenvectors are nonzero, any set consisting of a single eigenvector is independent. Suppose, then, that any set of $m$ eigenvectors corresponding to $m$ distinct eigenvalues is independent, and let $\mathbf{x}_1,\ldots,\mathbf{x}_{m+1}$ be eigenvectors corresponding to distinct eigenvalues $\lambda_1,\ldots,\lambda_{m+1}$.
Suppose that
$$c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_{m+1}\mathbf{x}_{m+1} = \mathbf{0}.$$
Multiplying both sides by $A$ (using $A\mathbf{x}_i = \lambda_i\mathbf{x}_i$) and then subtracting $\lambda_{m+1}$ times the original equation, we obtain
$$c_1(\lambda_1-\lambda_{m+1})\mathbf{x}_1 + \cdots + c_m(\lambda_m-\lambda_{m+1})\mathbf{x}_m = \mathbf{0}.$$
By hypothesis, the set $\{\mathbf{x}_1,\ldots,\mathbf{x}_m\}$ of eigenvectors is linearly independent. We know that $\lambda_i-\lambda_{m+1}\neq 0$ for $i=1,\ldots,m$, since the eigenvalues are all distinct. Therefore, the only way this linear combination can equal zero is if $c_1=c_2=\cdots=c_m=0$. This leaves us with $c_{m+1}\mathbf{x}_{m+1}=\mathbf{0}$, but $\mathbf{x}_{m+1}\neq\mathbf{0}$, so $c_{m+1}=0$ as well.
Theorem 4.1.14 tells us that vectors from different eigenspaces are independent. In particular, a union of bases from each eigenspace will be an independent set. Therefore, Theorem 4.1.12 provides an initial criterion for diagonalization: if, for each eigenvalue $\lambda$ of $A$, the dimension of the eigenspace $E_\lambda(A)$ is equal to the multiplicity of $\lambda$, then $A$ is diagonalizable.
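SymPy provides the is_diagonalizable method, which we can compare against this criterion by hand. Here is a sketch with two made-up matrices, one diagonalizable and one not.

from sympy import Matrix

# Made-up examples: A1 has a full set of independent eigenvectors, A2 does not
A1 = Matrix([[1, 2],
             [4, 3]])
A2 = Matrix([[2, 1],
             [0, 2]])

print(A1.is_diagonalizable())   # True
print(A2.is_diagonalizable())   # False: the eigenspace for 2 has dimension 1 < 2

# For a diagonalizable matrix, diagonalize() returns P and D with A1 = P*D*P**-1
P, D = A1.diagonalize()
print(D)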
Our focus in the next section will be on diagonalization of symmetric matrices, and soon we will see that for such matrices, eigenvectors corresponding to different eigenvalues are not just independent, but orthogonal.
Suppose $A$ is an invertible matrix and $\mathbf{x}$ is an eigenvector of $A$ with associated eigenvalue $\lambda$. Convince yourself that $\mathbf{x}$ is an eigenvector of the following matrices, and find the associated eigenvalues.
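The list of matrices for this exercise is not reproduced in this excerpt, but as one illustration of the kind of check involved, here is a numerical verification (with a made-up $A$) that an eigenvector of an invertible matrix $A$ is also an eigenvector of $A^{-1}$, with eigenvalue $1/\lambda$.

from sympy import Matrix, Rational

# Made-up invertible matrix with eigenvector x and eigenvalue 5
A = Matrix([[1, 2],
            [4, 3]])
x = Matrix([1, 2])
lam = 5

assert A * x == lam * x                       # x is an eigenvector of A
assert A.inv() * x == Rational(1, lam) * x    # ...and of A**-1, with eigenvalue 1/lam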