
I'm having trouble with the proof of the following theorem from Commutative Matrices by D.A. Suprunenko and R.I. Tyshkevich.

Theorem 3, Chapter 1: Let $P$ be any field and $M\subseteq M_n(P)$ a set of pairwise commuting matrices. The space $P^n$ can be represented by the direct sum of $M$-invariant subspaces $Q_j$ with $j=1,\cdots, k$ such that the irreducible parts of each restriction $M|_{Q_{i}} =\{ m|_{Q_i} ~:~ m\in M\}$ are equivalent and for $i\neq j$, the irreducible parts of $M|_{Q_i}$ and $M|_{Q_j}$ are not equivalent.

They use a lemma which I understand to prove this theorem. However at the beginning of the proof they make following claim which I do not follow:

(*) The matrices of $M$ can be simultaneously converted to the following form by a similarity transformation: $$\begin{bmatrix} a_1(m) & & & \\ a_{21}(m) & a_2(m) & & \\ \vdots & & \ddots & \\ a_{s1}(m) & a_{s2}(m) & \cdots & a_s(m) \end{bmatrix}$$ where $m\mapsto a_i(m)$ is an irreducible representation of $M$ for $i=1,\cdots,s$, and the matrix has total size $n\times n$.

I know that over an algebraically closed field, commuting matrices can be simultaneously triangularized, in fact the book proves in the same chapter that a sufficiently large finite extension of $P$ will do. However, it is not obvious to me that commuting matrices over an arbitrary field can be simultaneously converted to a block triangular form. Is the claim (*) true, and why?


1 Answer


This actually does not even require the matrices to commute. Let me point out that the mere statement that you can simultaneously block triangularize all the matrices is completely trivial: you can just take a single $n\times n$ block. The real content here is not that you get a block triangularization, but that you can arrange for the diagonal blocks of this triangularization to all be irreducible representations.

To prove such a triangularization exists, let $V\subseteq P^n$ be a minimal nonzero $M$-invariant subspace. By minimality, $V$ is an irreducible representation of $M$. Let $W$ be a linear complement of $V$ in $P^n$; decomposing $P^n$ as $W\oplus V$ represents the elements of $M$ as $2\times 2$ lower-triangular block matrices (since they map $V$ to itself), where the bottom-right block is an irreducible representation of $M$. By induction on $n$, we can now put the upper-left block (i.e., the action of $M$ on the quotient $P^n/V$) in triangular form with irreducible representations on the diagonal, and thus we get a block triangularization of $M$ on all of $P^n$ where all the diagonal blocks are irreducible representations.
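A minimal numerical sketch of the change-of-basis step described above. The matrices and the invariant subspace here are my own hypothetical example, not from the book: $V=\operatorname{span}\{(1,1,1)\}$ is invariant under the all-ones matrix $A$, and $B=A+I$ commutes with $A$. Writing both matrices in a basis ordered as (basis of $W$, basis of $V$) makes the top-right block vanish:

```python
import numpy as np

# Hypothetical example: V = span{(1,1,1)} is invariant under the
# all-ones matrix A (it maps (1,1,1) to 3*(1,1,1)), and B = A + I
# commutes with A.
A = np.ones((3, 3))
B = A + np.eye(3)
assert np.allclose(A @ B, B @ A)  # the matrices commute

# Basis ordered as (w1, w2, v): w1 = e1, w2 = e2 span a linear
# complement W, and v = (1,1,1) spans V, placed LAST.
S = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])
S_inv = np.linalg.inv(S)

for m in (A, B):
    t = S_inv @ m @ S
    # Since m(V) is contained in V, the column belonging to the
    # V-part of the basis has no W-components: the top-right block
    # is zero, i.e. m is block lower triangular in this basis.
    assert np.allclose(t[:2, 2], 0)
```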

(In terms of module theory, this is just saying that if you consider $P^n$ as a module over the algebra generated by $M$, it has a composition series. This is trivial since $P^n$ is finite-dimensional so you can just take any maximal chain of submodules.)
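In the notation of the proof above, the composition series implicit in this remark can be sketched as a maximal chain of submodules $$0 = V_0 \subset V_1 \subset \cdots \subset V_s = P^n,$$ where $V_1 = V$ is the minimal invariant subspace chosen in the proof and each quotient $V_i/V_{i-1}$ is simple, i.e. an irreducible representation of $M$; these quotients are exactly the diagonal blocks $a_i(m)$.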

  • Oh I see! So for the induction step ($2\times 2$ block form), to get the block triangular form would we just write each $m\in M$ with respect to the basis $w_1,\cdots,w_k, v_{k+1},\cdots, v_n$, where $w_1,\cdots,w_k$ is a basis for $W$ and $v_{k+1},\cdots, v_n$ is a basis for $V$? That would give us a matrix whose columns are the coordinate vectors $$\begin{bmatrix} m(w_1) & \cdots & m(w_k) & m(v_{k+1}) & \cdots & m(v_n) \end{bmatrix}$$ Commented Aug 15, 2021 at 17:40
  • That's right. (Though, I would say that is the induction step rather than the base case. The induction here is on $n$ and the base case is just $n=0$, where everything is trivial.) Commented Aug 15, 2021 at 18:43
  • Awesome! I guess I was overthinking it. Thank you so much. Commented Aug 15, 2021 at 22:35
