
Consider an $\large n \times n$ matrix $\large M$ whose $\large (i,j)$-th entries $\large X_{i,j}$ ($\large \forall i,j$) are i.i.d. random variables following $\large \textrm{Bernoulli}(\frac{1}{2})$. Find the expectation of $\large \det(M)$, i.e. find $\large \mathbb{E}(\det(M))$, where $\large \det(M)$ denotes the determinant of $\large M$.

Note: $\large \det(M)$ is a random quantity.

My first approach:

By cofactor expansion along the first row,

$$\left| \mathbf{M}^{n \times n} \right| = \begin{vmatrix} X_{11} & X_{12} & X_{13} & \cdots & X_{1n} \\ X_{21} & X_{22} & X_{23} & \cdots & X_{2n} \\ \vdots & & \ddots & & \vdots \\ X_{(n-1)1} & X_{(n-1)2} & X_{(n-1)3} & \cdots & X_{(n-1)n} \\ X_{n1} & X_{n2} & X_{n3} & \cdots & X_{nn} \end{vmatrix} = \sum_{k=1}^{n} (-1)^{k+1} X_{1k} \left|\mathbf{M}^{\overline{n-1} \times \overline{n-1}}_{1k} \right|,$$

where $\left|\mathbf{M}^{\overline{n-1} \times \overline{n-1}}_{1k}\right|$ is the minor obtained by deleting row $1$ and column $k$. Write $\left| \mathbf{M}^{n \times n} \right|$ as $\mathbf{M}_n$ and $\left|\mathbf{M}^{\overline{n-1} \times \overline{n-1}}_{1k}\right|$ as $\mathbf{M}_{1k}$. Then

$$\mathbf{M}_n = \sum_{k=1}^{n} (-1)^{k+1} X_{1k}\, \mathbf{M}_{1k}$$

$$\implies \mathbb{E}\left(\mathbf{M}_n\right) = \mathbb{E}\left(\sum_{k=1}^{n} (-1)^{k+1} X_{1k}\, \mathbf{M}_{1k}\right) = \sum_{k=1}^{n} (-1)^{k+1}\, \mathbb{E}\left(X_{1k}\, \mathbf{M}_{1k}\right) = \sum_{k=1}^{n} (-1)^{k+1}\, \mathbb{E}\left(X_{1k}\right) \mathbb{E}\left(\mathbf{M}_{1k}\right),$$

where the last step uses that $X_{1k}$ lies in the first row and is therefore independent of the minor $\mathbf{M}_{1k}$, which involves only the remaining rows.

The minors $\mathbf{M}_{1k}$ for $k=1,\dots,n$ are all determinants of $(n-1)\times(n-1)$ matrices with i.i.d. Bernoulli$(\frac{1}{2})$ entries, so $\mathbb{E}\left(\mathbf{M}_{1k}\right)$ is the same for all $k$; write it as $\mathbb{E}_{n-1}$, and write $\mathbb{E}\left(\mathbf{M}_n\right)$ as $\mathbb{E}_n$. Then

$$\mathbb{E}_n = \frac{1}{2}\, \mathbb{E}_{n-1} \sum_{k=1}^{n} (-1)^{k+1} = \underset{\text{when } n=2m}{\underbrace{0}} + \frac{1}{2}\, \mathbb{E}_{n-1} \qquad [\text{when } n-1=2m]$$

$$\implies \mathbb{E}_n = \frac{1}{2}\, \mathbb{E}_{n-1} = \frac{1}{2} \begin{cases} 1, & \text{when } n=2m+1,\ m=0 \\ 0, & \text{when } n=2m+1,\ m\ge 1 \end{cases} \qquad \left[\because\ \forall\, m\ge 1,\ \mathbb{E}_{2m+1} = \tfrac{1}{2}\, \mathbb{E}_{2m} = 0\right]$$
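
As a quick sanity check of the recursion at $n=2$ (this is just the direct $2\times 2$ computation; nothing is used beyond the definitions above):

$$\mathbb{E}_2 = \mathbb{E}\left[X_{11}X_{22} - X_{12}X_{21}\right] = \mathbb{E}[X_{11}]\,\mathbb{E}[X_{22}] - \mathbb{E}[X_{12}]\,\mathbb{E}[X_{21}] = \frac{1}{4} - \frac{1}{4} = 0,$$

which agrees with the even case $\mathbb{E}_{2m}=0$ above.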


My second approach:

The determinant is an alternating multilinear form. This means that swapping two rows of the matrix negates the determinant. For $\large n \ge 2$, we can use the symmetry in the distribution of the matrix $\large M$ by swapping any two rows.

Let $\large P$ be a permutation matrix that swaps any two rows. Then,

$\Large \det(PM) = -\det(M) \\ \Large \implies \mathbb{E}\left[\det(PM) \right] = - \mathbb{E}\left[\det(M) \right] \qquad \longrightarrow (i)$.

Since the entries of $\large M$ are i.i.d. $\large \textsf{Bernoulli}(\frac{1}{2})$, the distribution of $\large M$ is unchanged by swapping rows: the joint distribution is uniform over all $\large 2^{n^2}$ zero-one matrices, and left-multiplication by $\large P$ merely permutes this finite set. In particular, $\large PM$ has the same joint distribution as $\large M$.

Thus, $\Large PM \overset{\textrm{d}}{=} M \Longrightarrow \mathbb{E}[\det(PM)]=\mathbb{E}[\det(M)] \qquad \longrightarrow (ii) \\ $

Therefore from $(i)$ and $(ii)$ we have,

$\Large \mathbb{E}\left[\det(M) \right] = - \mathbb{E}\left[\det(M) \right] \\ \Large \implies 2\mathbb{E}\left[\det(M) \right] = 0 \\ \Large \therefore \mathbb{E}\left[\det(M) \right] = 0 \qquad \forall n \ge 2$

It is obvious that for $\Large n=1, \mathbb{E}\left[\det(M) \right] = \mathbb{E}\left[X_{11} \right] = \frac{1}{2}$
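
As a purely numerical cross-check of both approaches, here is a minimal Monte Carlo sketch (assuming NumPy is available; the function name `mc_expected_det`, the seed and the sample size are arbitrary choices, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the run is reproducible

def mc_expected_det(n, trials=200_000):
    """Monte Carlo estimate of E[det(M)] for an n x n matrix
    with i.i.d. Bernoulli(1/2) entries."""
    # Draw all `trials` matrices at once: shape (trials, n, n), entries in {0, 1}.
    M = rng.integers(0, 2, size=(trials, n, n)).astype(float)
    return np.linalg.det(M).mean()

for n in (1, 2, 3, 4):
    print(n, round(mc_expected_det(n), 4))
# Expected (up to Monte Carlo noise): 0.5 for n = 1 and approximately 0 for n >= 2.
```

The estimates should hover around $\frac{1}{2}$ for $n=1$ and around $0$ for $n \ge 2$, consistent with both derivations.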

My questions

  1. Are the first and second approaches correct, both in reasoning and in notation?
  2. If we want to find the variance, how do we calculate the variance of $\large \det(M)$? (One possible route is sketched below.)
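
Regarding question 2, one possible route (a sketch only; the closed form at the end is my own simplification and should be double-checked) is to expand $\det(M)$ by the Leibniz formula and compute the second moment:

$$\operatorname{Var}(\det M) = \mathbb{E}\left[(\det M)^2\right] - \left(\mathbb{E}[\det M]\right)^2, \qquad \det M = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^{n} X_{i\sigma(i)}.$$

Since the entries are independent, with $\mathbb{E}[X^2] = \frac{1}{2}$ and $\mathbb{E}[X]^2 = \frac{1}{4}$ for $X \sim \textrm{Bernoulli}(\frac{1}{2})$,

$$\mathbb{E}\left[(\det M)^2\right] = \sum_{\sigma,\tau \in S_n} \operatorname{sgn}(\sigma)\operatorname{sgn}(\tau) \left(\frac{1}{2}\right)^{F(\sigma,\tau)} \left(\frac{1}{4}\right)^{n-F(\sigma,\tau)}, \qquad F(\sigma,\tau) = \#\{\, i : \sigma(i)=\tau(i) \,\}.$$

Substituting $\pi = \sigma^{-1}\tau$ (so that $\operatorname{sgn}(\sigma)\operatorname{sgn}(\tau) = \operatorname{sgn}(\pi)$ and $F(\sigma,\tau) = \operatorname{fix}(\pi)$) collapses the double sum to $n!\, 4^{-n} \sum_{\pi \in S_n} \operatorname{sgn}(\pi)\, 2^{\operatorname{fix}(\pi)}$, which appears to simplify to $(n+1)!/4^{n}$. For $n=2$ this gives $\frac{3!}{16} = \frac{3}{8}$, matching the direct computation $\mathbb{E}\left[(X_{11}X_{22}-X_{12}X_{21})^2\right] = \frac{1}{4} + \frac{1}{4} - \frac{2}{16} = \frac{3}{8}$; so for $n \ge 2$, where $\mathbb{E}[\det M] = 0$, the variance would be $(n+1)!/4^{n}$.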

Your help is highly appreciated; all comments, suggestions and answers are welcome. Thanks in advance.

Comments:

  • It seems that you made a miscalculation when giving the recursive formula in approach 1. Should it be "$E_n=\begin{cases}0,&n=2m\\\frac{1}{2}E_{n-1},&n=2m+1\end{cases}$" ($m\in\mathbb{N}_+$)? Then since $E_2=0$, we should have $E_n=0$ ($\forall n\geqslant2$). I do not find any mistakes in approach 2. The right answer should be easy to verify by checking the $n=2$ and $n=3$ cases. Commented Jul 7 at 20:45
  • @JCQ Thank you very much for pointing it out; I need to be more careful. Commented Jul 7 at 21:04
  • As for the variance, there are already similar discussions, for example here. Hopefully this helps. Commented Jul 7 at 21:51
