
Let $A$ be a diagonalizable $n \times n$ matrix, and suppose that $\lambda$ is an eigenvalue of $A$ with multiplicity $n$. I want to show that $A= \lambda I_n$.

I started by noting that the characteristic polynomial $\det (A- \lambda I_n)$ has $\lambda$ as a root, so the eigenvalues of $A$ are just $\lambda$ with multiplicity $n$. To find the eigenvectors, we need to find the null space of $A- \lambda I_n$.

This is where I get confused. For $A = \lambda I_n$ to hold, $A - \lambda I_n = 0$ must also hold. Moreover, if $\vec v$ is an eigenvector of $A$ belonging to the eigenvalue $\lambda$, then $\vec v \in N(A - \lambda I_n)$. But I'm not sure how to tie these concepts together or how to solve for the eigenvectors of $A$.

Any help is greatly appreciated!


2 Answers


You have to use the diagonalizability somewhere, otherwise a matrix like $\begin{pmatrix} \lambda & 1 \\ 0 & \lambda\end{pmatrix}$ provides a counterexample, since it has $\lambda$ as all its eigenvalues but is not equal to $\lambda I_n$.
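As a quick numerical sketch (using NumPy; not part of the original answer), one can check directly that this Jordan-block counterexample has $\lambda$ as its only eigenvalue yet is not $\lambda I$:

```python
import numpy as np

# The 2x2 Jordan block with lambda = 1: both eigenvalues equal lambda,
# yet the matrix is not lambda * I (illustrative check only).
lam = 1.0
A = np.array([[lam, 1.0],
              [0.0, lam]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)                       # both entries equal lam
print(np.allclose(A, lam * np.eye(2)))   # False: A != lam * I
```

The failure of diagonalizability here shows up as the eigenspace for $\lambda$ being only one-dimensional, even though the algebraic multiplicity is $2$.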

Suppose $A$ is diagonalizable, say $A = P^{-1}DP$. Then we know that $D$ has all the eigenvalues of $A$ on its diagonal, but every eigenvalue is $\lambda$ so $D = \lambda I_n$, and therefore $A = P^{-1}(\lambda I_n) P = \lambda I_n (P^{-1}P) = \lambda I_n$ since the identity matrix commutes with every matrix.
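A minimal numerical sketch of this conjugation step (NumPy assumed; the particular $P$ below is an arbitrary example, not anything from the original answer):

```python
import numpy as np

# If D = lam * I, conjugating by any invertible P gives back lam * I,
# since scalar multiples of the identity commute with every matrix.
lam = 3.0
n = 4
D = lam * np.eye(n)

rng = np.random.default_rng(0)
P = rng.standard_normal((n, n))          # a generic (hence invertible) matrix
A = np.linalg.inv(P) @ D @ P             # A = P^{-1} D P

print(np.allclose(A, lam * np.eye(n)))   # True
```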

  • Why is every eigenvalue 1? — Commented Dec 17, 2020 at 3:59
  • @NicholasRoberts For an upper triangular matrix, the eigenvalues lie along the diagonal, with the correct multiplicity. Since that matrix has only $1$s on the diagonal the result follows. — Commented Dec 17, 2020 at 4:00
  • I don't see in the OP's question where it is assumed all the eigenvalues are 1. Can you clarify? — Commented Dec 17, 2020 at 4:02
  • @NicholasRoberts I wanted to provide a counterexample to: "If $A$ is a matrix with all eigenvalues $\lambda$ then $A = \lambda I_n$" without the diagonalizability assumption, which the OP has not used in his attempt. For this, I took $\lambda = 1$ and made the $2 \times 2$ matrix above, which is not diagonalizable, not the identity, and has $1$ as all its eigenvalues. — Commented Dec 17, 2020 at 4:04
  • In your second paragraph you say "but every eigenvalue is 1". I think you meant to say every eigenvalue is $\lambda$. Then your argument is fine. — Commented Dec 17, 2020 at 4:06

Once you understand what the hypotheses mean, there is almost nothing left to prove here. A linear operator (and by extension a square matrix) is diagonalisable if the sum of the associated eigenspaces (which is always a direct sum) fills the whole space. The multiplicity of an eigenvalue can have different meanings, but in the diagonalisable case they are all the same, and the multiplicity of $\lambda$ is the dimension$~d(\lambda)$ of the eigenspace $E_\lambda$ for $\lambda$.

(In general this is called the geometric multiplicity. Since you mention the characteristic polynomial $\chi$ you seem to be thinking of the algebraic multiplicity, which is the multiplicity of$~\lambda$ as a root of$~\chi$; in the diagonalisable case each eigenspace$~E_\lambda$ contributes a factor $(X-\lambda)^{d(\lambda)}$ to$~\chi$ and these contributions give all of$~\chi$, so the algebraic multiplicity of$~\lambda$ is $d(\lambda)$ as well.)

Now the fact that $A$, of size $n\times n$, is diagonalisable with an eigenvalue$~\lambda$ of multiplicity$~n$ means that $\dim(E_\lambda)=n$, which implies that $E_\lambda$ is the whole space $K^n$ on which $A$ acts. But if $A$ acts on the whole space as scalar multiplication by$~\lambda$ (which is how it acts on $E_\lambda$ by definition), then $A=\lambda I$.
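The eigenspace-dimension argument can be sketched numerically via rank-nullity: $\dim E_\lambda = n - \operatorname{rank}(A - \lambda I)$, which equals $n$ exactly when $A - \lambda I = 0$. (A NumPy illustration, not part of the original answer; `eigenspace_dim` is a hypothetical helper name.)

```python
import numpy as np

def eigenspace_dim(A, lam):
    """Dimension of the eigenspace N(A - lam*I), via rank-nullity."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

lam = 2.0
print(eigenspace_dim(lam * np.eye(3), lam))                     # 3 = n, so A = lam * I
print(eigenspace_dim(np.array([[lam, 1.0], [0.0, lam]]), lam))  # 1 < n: not diagonalizable
```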

