
This question might be easy, but I need confirmation.

Imagine that we have a vector $(X_1,...,X_n)$ of $n$ uncorrelated (not independent) identically distributed random variables. Now imagine we have one realization $(x_1,...,x_n)$ of these $n$ random variables. Can we use this realization $(x_1,...,x_n)$ to estimate the mean of $X_i,~\forall i$? In other words, can we use samples from uncorrelated identically distributed random variables to estimate the mean (for example, using the sample mean)?


1 Answer


You can estimate the mean using any identically distributed random variables; they don't even have to be uncorrelated. By linearity of expectation (which relies neither on independence nor on absence of correlation), the expectation of $S_n=\frac1n\sum_{i=1}^nX_i$ is the mean of the distribution, and thus $S_n$ is an unbiased estimator of the mean.
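Explicitly, writing $\mu$ for the common mean of the $X_i$ (a symbol not used in the question, introduced here for clarity):

$$E[S_n]=\frac1n\sum_{i=1}^n E[X_i]=\frac1n\cdot n\mu=\mu\;.$$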

The advantage of having independent variables is that it lowers the variance of the estimator. In the extreme case where all the $X_i$ are in fact identical, we have $S_n=X_1$, so the variance of $S_n$ is just the variance of the distribution, with no improvement from the $n$ samples (unsurprisingly). At the other extreme, if the $X_i$ are independent, the variance of $S_n$ is only $\frac1n$ times the variance of the distribution.
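Both extremes can be read off from the general covariance decomposition (a one-line derivation in the same notation):

$$\operatorname{Var}(S_n)=\frac1{n^2}\sum_{i,j}\operatorname{Cov}(X_i,X_j)=\frac{\operatorname{Var}(X_1)}{n}+\frac1{n^2}\sum_{i\neq j}\operatorname{Cov}(X_i,X_j)\;,$$

which equals $\operatorname{Var}(X_1)$ when every covariance is $\operatorname{Var}(X_1)$ (identical variables) and $\frac1n\operatorname{Var}(X_1)$ when all the mixed covariances vanish.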

This variance reduction relies only on the lack of correlation, so it is the same in your case, where the variables are uncorrelated but not necessarily independent:

\begin{align}
\operatorname{Var}\left(\frac1n\sum X_i\right)
&= E\left[\left(\frac1n\sum X_i\right)^2\right]-E\left[\frac1n\sum X_i\right]^2 \\
&= \frac1{n^2}\sum_{i,j}\left(E\left[X_iX_j\right]-E\left[X_i\right]E\left[X_j\right]\right) \\
&= \frac1{n^2}\sum_i\left(E\left[X_i^2\right]-E\left[X_i\right]^2\right) \\
&= \frac1{n^2}\sum_i\operatorname{Var}\left(X_i\right) \\
&= \frac1n\operatorname{Var}\left(X_1\right)\;,
\end{align}

where the step that turns the double sum into a single sum uses the fact that the mixed terms are just the covariances, which are zero for uncorrelated variables.
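As a numerical sanity check, here is a minimal NumPy sketch (my own construction, not part of the original answer): take $X_i = Z S_i$ with $Z\sim N(0,1)$ and independent Rademacher signs $S_i\in\{-1,+1\}$. Each $X_i$ is then $N(0,1)$-distributed and pairwise uncorrelated, since $E[X_iX_j]=E[Z^2]E[S_i]E[S_j]=0$ for $i\neq j$, yet the variables are far from independent ($|X_i|=|Z|$ for all $i$). The names `n` and `trials` below are arbitrary choices for this illustration.

```python
import numpy as np

# X_i = Z * S_i: identically N(0,1)-distributed, pairwise uncorrelated,
# but strongly dependent (all X_i share the same absolute value |Z|).
rng = np.random.default_rng(0)
n = 10            # number of variables per realization (arbitrary)
trials = 200_000  # Monte Carlo repetitions (arbitrary)

z = rng.standard_normal(trials)                # shared factor Z, one per trial
s = rng.choice([-1.0, 1.0], size=(trials, n))  # independent Rademacher signs S_i
x = z[:, None] * s                             # X_i = Z * S_i, shape (trials, n)

sample_mean = x.mean(axis=1)                   # S_n for each trial
print("Var(X_1) ~", x[:, 0].var())             # ~ variance of the distribution = 1
print("Var(S_n) ~", sample_mean.var())         # ~ 1/n = 0.1, as derived above
```

With $n=10$, both empirical variances should match the theory: $\operatorname{Var}(X_1)\approx 1$ and $\operatorname{Var}(S_n)\approx 0.1$, despite the strong dependence between the $X_i$.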

Great answer! This is exactly something I wasn't sure about: whether the variance of the sample mean is the same in both cases (independent vs. uncorrelated). But it is clear to me now. Thanks! Commented Feb 6, 2020 at 21:05
