MTH 215 — Intro to Linear Algebra
Section 5.1: Eigenvalues and Eigenvectors
The transformation $\vec{x} \mapsto A \vec{x}$ can move vectors in a variety of directions. However, there are special vectors on which the action of $A$ is simple.
For example, consider $A = \mat{rr} 0 & -2 \\ -4 & 2 \rix$. Graph $A \vec{x}$ for each $\vec{x}$ shown.
\emph{[Axes for graphing $A\vec{x}$.]}
An eigenvector of an $n\times n$ matrix $A$ is a nonzero vector $\vec{x}$ such that \[ A \vec{x} \eq \lambda \vec{x} \] for some scalar $\lambda$ (which may be zero).
An eigenvalue $\lambda $ of $A$ is a scalar for which there is a nontrivial solution to $A \vec{x} = \lambda \vec{x}$.
Show that $A = \mat{cc} 1 & 6 \\ 5 & 2 \rix$ has an eigenvalue of $\lambda =7$, and find corresponding eigenvectors.
$\lambda =7$ is an eigenvalue of $A$ if and only if the equation
has a nontrivial solution. That is, if and only if the homogeneous equation
has a nontrivial solution. The augmented matrix of this homogeneous system is
and is row equivalent to
So, eigenvectors corresponding to $\lambda =7$ are
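This computation can be confirmed numerically. The following is a quick sketch using NumPy (not part of the course materials); the vector $(1,1)$ spans the eigenspace, since $(A - 7I)$ row reduces to $\mat{rr} 1 & -1 \\ 0 & 0 \rix$:

```python
import numpy as np

# Matrix from the example; (1, 1) spans the eigenspace for lambda = 7.
A = np.array([[1, 6],
              [5, 2]])
x = np.array([1, 1])

print(A @ x)  # -> [7 7], i.e. 7 * x
```

Any nonzero multiple of $\vec{x}$ passes the same check, since eigenvectors are determined only up to scaling.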
Note: The eigenspace of $A$ corresponding to $\lambda $ is a subspace of $\R^n $.
In the last example, another eigenvalue for $A = \mat{cc} 1 & 6 \\ 5 & 2 \rix$ is $\lambda =-4$. Eigenvectors corresponding to $\lambda =-4$ are \[ \vec{x} \eq \mat{r} x_1 \\ x_2 \rix \eq \]
Suppose $\lambda $ is an eigenvalue of $A$ with corresponding eigenvector $\vec{x}$. Determine an eigenvalue-eigenvector pair of $A^2$ and of $A^3$.
Since $\lambda $ and $\vec{x}$ are an eigenvalue-eigenvector pair, then $A \vec{x} = \lambda \vec{x}$. Multiplying on the left by $A$ yields \[ A^2 \vec{x} \eq \lambda A \vec{x} .\]
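A quick numerical illustration of this fact (a NumPy sketch, using the eigenpair $\lambda = 7$, $\vec{x} = (1,1)$ from the earlier example):

```python
import numpy as np

A = np.array([[1, 6],
              [5, 2]])
lam = 7
x = np.array([1, 1])  # eigenvector of A for lambda = 7

# A^2 x = lambda * (A x) = lambda^2 x, and likewise A^3 x = lambda^3 x.
assert np.allclose(A @ A @ x, lam**2 * x)
assert np.allclose(np.linalg.matrix_power(A, 3) @ x, lam**3 * x)
print("eigenpairs of A^2 and A^3 confirmed")
```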
Many applications in differential equations, difference equations, and signal processing build on this idea! See problems 41 and 42 in the textbook if you are interested.
What happens when $A$ has a zero eigenvalue?
\begin{align*}
A \text{ has a zero eigenvalue} & \quad \text{if and only if} \quad A \vec{x} = 0 \vec{x} \text{ has a nontrivial solution} \\
& \quad \text{if and only if} \quad A \vec{x} = \vec{0} \text{ has a nontrivial solution} \\
& \quad \text{if and only if} \quad A \vec{x} = \vec{0} \text{ has a free variable} \\
& \quad \text{if and only if} \quad A \text{ does not have a pivot in every column} \\
& \quad \text{if and only if} \quad A \text{ is singular.}
\end{align*}
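The chain above can be checked on a concrete singular matrix (an illustrative NumPy sketch; the matrix is made up for this example):

```python
import numpy as np

# A singular matrix: the second row is twice the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

det = np.linalg.det(A)
eigs = np.sort(np.linalg.eigvals(A))
print(det, eigs)  # det is ~0 and one eigenvalue is 0 (the other is 5)
```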
- Suppose $A$ has an eigenvalue $\lambda$ corresponding to eigenvector $\vec{x}$. Is $-\vec{x}$ also an eigenvector of $A$?
- Now suppose $A$ also has an eigenvalue $-\lambda$. What is a corresponding eigenvector?
Section 5.2: The Characteristic Equation
Recall that $\vec{x}$ is an eigenvector associated with eigenvalue $\lambda$ if $A \vec{x} = \lambda \vec{x}$. Given an eigenvalue $\lambda $, we can find eigenvectors by solving \[ (A - \lambda I) \vec{x} \eq \vec{0} \] for $\vec{x}$. But, how do we find the eigenvalues?
\begin{align*} (A-\lambda I) \vec{x} = \vec{0} \text{ with } \vec{x} \neq \vec{0} & \quad \text{implies} \quad (A-\lambda I) \vec{x} = \vec{0} \text{ has a nontrivial solution} \\ & \quad \text{implies} \quad (A-\lambda I) \text{ is not invertible} \\ & \quad \text{implies} \quad \Det (A-\lambda I) = 0 . \end{align*}
The equation $\Det (A-\lambda I) = 0$ is the characteristic equation of $A$, and $p(\lambda ) = \Det (A-\lambda I)$ is the characteristic polynomial.
The algebraic multiplicity of an eigenvalue $\lambda$ is the multiplicity of $\lambda $ as a root of the characteristic polynomial: \[ \algmult(\lambda ) \eq \text{ multiplicity of } \lambda \text{ as a root of } p(\lambda ) .\]
In the previous example, \begin{align*} \algmult(5) & \eq \\ \algmult(3) & \eq \\ \algmult(1) & \eq \end{align*}
- The characteristic polynomial of $A$, $\Det (A-\lambda I)$, is of degree $n$.
- $A$ has $n$ eigenvalues when counted with algebraic multiplicity. Note: complex eigenvalues are possible, even for real matrices.
- $A$ is invertible if and only if $0$ is not an eigenvalue of $A$.
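These facts can be illustrated numerically. A sketch using NumPy's `poly`, which returns the coefficients of $\Det(\lambda I - A)$, on the matrix from Section 5.1:

```python
import numpy as np

A = np.array([[1, 6],
              [5, 2]])

coeffs = np.poly(A)  # coefficients of det(lambda*I - A): lambda^2 - 3*lambda - 28
roots = np.sort(np.roots(coeffs))
print(coeffs)        # ~ [1, -3, -28]: a degree-2 polynomial, as expected
print(roots)         # ~ [-4, 7]: the eigenvalues of A
```

The constant coefficient $-28$ is $\Det(-A) = (-1)^2 \cdot (-28)$; since it is nonzero, $0$ is not a root, matching the invertibility fact above.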
Computing eigenvalues from the characteristic polynomial is impractical for all but small matrices. Instead, we find the eigenvalues of a different matrix whose eigenvalues agree with those of the original but are easier to read off (what matrices come to mind?).
Are the eigenvalues preserved with a similarity transformation? Let $B = P^{-1}AP$. Then \begin{align*} \Det (B-\lambda I) & \eq \Det \pbr{P^{-1}AP - \lambda I} \\ & \eq \Det \pbr{P^{-1} AP - P^{-1} \lambda I P} \\ & \eq \Det \pbr{P^{-1} (A-\lambda I) P} \\ & \eq \Det (P^{-1}) \cdot \Det (A-\lambda I) \cdot \Det (P) \\ & \eq \frac{1}{\Det (P)} \cdot \Det (A-\lambda I) \cdot \Det (P) \\ & \eq \Det (A-\lambda I) \end{align*}
Let $A = \mat{rrr} 2 & 0 & 0 \\ 1 & 2 & 1 \\ -1 & 0 & 1 \rix$, where $A = PBP^{-1}$ for $P = \mat{rrr} 0 & 0 & -1 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \rix, B = \mat{rrr} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \rix$.
Find eigenvalues and eigenvectors of $A$.
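This example can be verified numerically (a NumPy sketch reconstructing $A$ from the given $P$ and $B$; the columns of $P$ turn out to be eigenvectors of $A$):

```python
import numpy as np

P = np.array([[0, 0, -1],
              [-1, 1, 0],
              [1, 0, 1]], dtype=float)
B = np.diag([1.0, 2.0, 2.0])
A = P @ B @ np.linalg.inv(P)  # A = P B P^{-1}

print(np.round(A))  # matches the matrix in the example (up to rounding)

# A and B share eigenvalues, and column i of P is an eigenvector for B[i, i].
for lam, p in zip([1, 2, 2], P.T):
    assert np.allclose(A @ p, lam * p)
```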
Note: similarity is not the same as row equivalence. Recall that $A$ and $B$ are row equivalent if there exists an invertible $E$ such that $B = E A$. This is not a similarity transformation, and row operations generally change the eigenvalues!
Section 5.3: Diagonalization
Recall that $n\times n$ matrices $A$ and $B$ are similar if there exists an invertible $n\times n$ matrix $P$ for which \[ A = PBP^{-1} .\] An example in Chapter 2 revealed that \[ A^2 = PB^2P^{-1}, \qquad A^3 = P B^3 P^{-1}, \quad \dots, \quad A^k = P B^k P^{-1} .\]
Perhaps computing $A^k$ is easier with the factorization $A = PBP^{-1}$, provided $B^k$ is easy to compute. One option is to take $B = D$, a diagonal matrix.
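Why a diagonal $D$ helps: powers of a diagonal matrix are computed entrywise. For example, \[ D \eq \mat{cc} d_1 & 0 \\ 0 & d_2 \rix \quad \Longrightarrow \quad D^k \eq \mat{cc} d_1^k & 0 \\ 0 & d_2^k \rix ,\] so $A^k = P D^k P^{-1}$ costs only two matrix products beyond the entrywise powers.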
The important question is — When is $A$ diagonalizable? And, how do we find $P$ and $D$?
An $n\times n$ matrix $A$ is diagonalizable if and only if $A$ has $n$ linearly independent eigenvectors.
In such a case, let $\lambda_1, \dots , \lambda_n$ be eigenvalues associated to eigenvectors $\vec{v}_1, \dots , \vec{v}_n$. Then $A = PDP^{-1}$ is given by \[ A \eq \mat{cccc} \vec{v}_1 & \vec{v}_2 & \dots & \vec{v}_n \rix \mat{cccc} \lambda_1 & \\ & \lambda_2 \\ & & \ddots \\ & & & \lambda_n \rix \mat{cccc} \vec{v}_1 & \vec{v}_2 & \dots & \vec{v}_n \rix^{-1} .\]
That is, $A$ is diagonalizable if and only if there is a basis of $\R^n$ consisting of eigenvectors of $A$.
Diagonalize $A = \mat{rrr} 2 & 0 & 0 \\ 1 & 2 & 1 \\ -1 & 0 & 1 \rix$, if possible.
Step 1: Find the eigenvalues of $A$:
Step 2: Find linearly independent eigenvectors of $A$ (if possible): \begin{align*} (A-1I) \vec{x} = \vec{0} & \quad \mat{rrr|r} 1 & 0 & 0 & 0 \\ 1 & 1 & 1 & 0 \\ -1 & 0 & 0 & 0 \rix & \longrightarrow \quad \underbrace{\mat{rrr|r} 1 & 0 & 0 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \rix}_{x_3 \text{ free}} & \qquad \vec{x} = \\ (A-2I) \vec{x} = \vec{0} & \quad \mat{rrr|r} 0 & 0 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ -1 & 0 & -1 & 0 \rix & \longrightarrow \quad \underbrace{\mat{rrr|r} 1 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \rix}_{x_2 , x_3 \text{ free} } & \qquad \vec{x} = \end{align*}
Corresponding eigenvectors are then \[ \vec{v}_1 = \qquad \vec{v}_2 = \qquad \vec{v}_3 = \]
Step 3: If you found $n$ linearly independent eigenvectors, form $P$ and $D$: \[ P \eq \qquad D \eq \]
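As a check on Steps 1–3, the factorization can be confirmed numerically (a NumPy sketch; `np.linalg.eig` returns unit-length eigenvectors, so its $P$ differs from a hand-computed one by column scaling):

```python
import numpy as np

A = np.array([[2, 0, 0],
              [1, 2, 1],
              [-1, 0, 1]], dtype=float)

# eig returns eigenvalues and eigenvectors (as the columns of P).
lam, P = np.linalg.eig(A)
D = np.diag(lam)

assert np.allclose(A, P @ D @ np.linalg.inv(P))            # A = P D P^{-1}
k = 5
assert np.allclose(np.linalg.matrix_power(A, k),
                   P @ np.diag(lam**k) @ np.linalg.inv(P))  # A^k = P D^k P^{-1}
print("diagonalization confirmed")
```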
If possible, diagonalize $A = \mat{rrr} 2 & 4 & 6 \\ 0 & 2 & 2 \\ 0 & 0 & 4 \rix $.
Step 1: Find the eigenvalues of $A$:
Step 2: Find linearly independent eigenvectors of $A$ (if possible):
\begin{align*} (A-2I) \vec{x} = \vec{0} & \quad \mat{rrr|r} 0 & 4 & 6 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 2 & 0 \rix & \longrightarrow \quad & \underbrace{\mat{rrr|r} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \rix}_{x_1 \text{ free}} & \qquad \vec{x} = \\ (A-4I) \vec{x} = \vec{0} & \quad \mat{rrr|r} -2 & 4 & 6 & 0 \\ 0 & -2 & 2 & 0 \\ 0 & 0 & 0 & 0 \rix & \longrightarrow \quad & \underbrace{\mat{rrr|r} 1 & 0 & -5 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \rix}_{x_3 \text{ free} } & \qquad \vec{x} = \end{align*}
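Here only two linearly independent eigenvectors exist for a $3\times 3$ matrix, so $A$ is not diagonalizable. A numerical way to see this (a NumPy sketch, using the fact that the number of free variables in $(A-\lambda I)\vec{x} = \vec{0}$ is $n - \operatorname{rank}(A - \lambda I)$):

```python
import numpy as np

A = np.array([[2, 4, 6],
              [0, 2, 2],
              [0, 0, 4]], dtype=float)
I = np.eye(3)

# Dimension of each eigenspace = 3 - rank(A - lambda*I).
geo2 = 3 - np.linalg.matrix_rank(A - 2 * I)  # eigenvalue 2 (algebraic mult. 2)
geo4 = 3 - np.linalg.matrix_rank(A - 4 * I)  # eigenvalue 4 (algebraic mult. 1)
print(geo2, geo4)  # 1 + 1 = 2 < 3 eigenvectors: A is not diagonalizable
```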
The following theorem provides a sufficient condition for a matrix to be diagonalizable: an $n\times n$ matrix with $n$ distinct eigenvalues is diagonalizable.
Why? Because eigenvectors corresponding to distinct eigenvalues are linearly independent.
- Is $\mat{rrr} 5 & -8 & 1 \\ 0 & 0 & 7 \\ 0 & 0 & -2\rix$ diagonalizable?
- Is $\mat{rrr} 2 & 4 & 3 \\ -4 & -6 & -3 \\ 3 & 3 & 1 \rix$ diagonalizable? Hint: $p(\lambda ) = -(\lambda -1)(\lambda +2)^2$
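Answers to exercises like these can be checked numerically (a NumPy sketch; it confirms but does not replace the hand computation):

```python
import numpy as np

A1 = np.array([[5, -8, 1], [0, 0, 7], [0, 0, -2]], dtype=float)
A2 = np.array([[2, 4, 3], [-4, -6, -3], [3, 3, 1]], dtype=float)

# A1 is triangular, so its eigenvalues sit on the diagonal: 5, 0, -2.
n_distinct = len(np.unique(np.round(np.linalg.eigvals(A1).real, 8)))
print(n_distinct)  # 3 distinct eigenvalues => A1 is diagonalizable

# A2 has eigenvalues 1 and -2 (from the hint). Check the eigenspace for -2:
geo = 3 - np.linalg.matrix_rank(A2 + 2 * np.eye(3))
print(geo)         # 1 < 2 = algebraic multiplicity => A2 is not diagonalizable
```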
If $A$ has some repeated eigenvalues, it may or may not be diagonalizable. The following result determines this. First, we need a definition.
Recall that the algebraic multiplicity, $\algmult(\lambda)$, is the multiplicity of $\lambda $ as a root of the characteristic polynomial. The geometric multiplicity, $\geomult(\lambda)$, is the dimension of the eigenspace corresponding to $\lambda $.
- For every eigenvalue $\lambda_k$ of $A$, $\geomult(\lambda_k) \le \algmult(\lambda_k)$.
- $A$ is diagonalizable if and only if $\geomult(\lambda_k) = \algmult(\lambda_k)$ for every eigenvalue $\lambda_k$ of $A$.
- Suppose $A$ is diagonalizable and has $p$ distinct eigenvalues $\lambda_1, \dots , \lambda_p$. Let $\cB_1, \dots , \cB_p$ be bases for each corresponding eigenspace. Then the total collection of vectors in $\cB_1, \dots , \cB_p$ forms a basis for $\R^n $. That is, these vectors form the columns of $P$.
Its characteristic polynomial is $p(\lambda ) = (\lambda -5)^2 (\lambda +3)^2$, so the eigenvalues are $5$ and $-3$, each with algebraic multiplicity $2$.
One can show that a basis for the eigenspace associated to $\lambda =5$ is \[ \cB_1 = \cbr{\vec{v}_1, \vec{v}_2}, \qquad \text{where} \quad \vec{v}_1 = \mat{r} -8 \\ 4 \\ 1 \\ 0 \rix, \quad \vec{v}_2 = \mat{r} -16 \\ 4 \\ 0 \\ 1 \rix \] and that a basis for the eigenspace of $\lambda =-3$ is \[ \cB_2 = \cbr{\vec{v}_3, \vec{v}_4}, \qquad \text{where} \quad \vec{v}_3 = \mat{r} 0 \\ 0 \\ 1 \\ 0 \rix, \quad \vec{v}_4 = \mat{r} 0 \\ 0\\ 0 \\ 1 \rix .\]
Therefore, $A$ is diagonalizable because $\geomult(5) = 2 = \algmult(5)$ and $\geomult(-3) = 2 = \algmult(-3)$. Namely, \[ A \eq PDP^{-1} \eq \mat{rrrr} -8 & -16 & 0 & 0 \\ 4 & 4 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \rix \mat{rrrr} 5 & \\ & 5 \\ & & -3 \\ & & & -3 \rix \mat{rrrr} -8 & -16 & 0 & 0 \\ 4 & 4 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \rix^{-1} .\]
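A numerical confirmation of this factorization (a NumPy sketch; $A$ is reconstructed from the stated $P$ and $D$, and each column of $P$ is checked as an eigenvector):

```python
import numpy as np

P = np.array([[-8, -16, 0, 0],
              [4, 4, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1]], dtype=float)
D = np.diag([5.0, 5.0, -3.0, -3.0])
A = P @ D @ np.linalg.inv(P)  # reconstruct A from the stated factorization

# Each column of P should be an eigenvector for the matching diagonal entry.
for lam, v in zip(np.diag(D), P.T):
    assert np.allclose(A @ v, lam * v)
print(np.sort(np.linalg.eigvals(A).real))  # ~ [-3, -3, 5, 5]
```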