You can estimate the mean using any identically distributed random variables; they don’t even have to be uncorrelated. By linearity of expectation (which relies neither on independence nor on absence of correlation), the expectation of $S_n=\frac1n\sum_{i=1}^nX_i$ is the mean of the distribution, so $S_n$ is an unbiased estimator of the mean.
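Explicitly, writing $\mu$ for the common mean (a symbol introduced here just for this step):

$$E\left[S_n\right]=\frac1n\sum_{i=1}^nE\left[X_i\right]=\frac1n\cdot n\mu=\mu\;.$$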
The advantage of having independent variables is that they lower the variance of the estimator. In the extreme case where all $X_i$ are in fact identical, we have $S_n=X_1$, so the variance of $S_n$ is just the variance of the distribution, with no improvement from the $n$ samples (unsurprisingly). At the other extreme, if the $X_i$ are independent, the variance of $S_n$ is only $\frac1n$ times the variance of the distribution.
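A quick simulation makes the two extremes tangible; this is just an illustrative sketch (the standard normal distribution, $n$, trial count, and seed are arbitrary choices, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 100, 20_000

# Identical extreme: all n "samples" are copies of one draw, so S_n = X_1.
s_identical = rng.standard_normal(trials)

# Independent extreme: n fresh draws per trial, averaged.
s_independent = rng.standard_normal((trials, n)).mean(axis=1)

print(f"Var(S_n), identical copies:  {s_identical.var():.4f}  (theory: Var(X_1) = 1)")
print(f"Var(S_n), independent draws: {s_independent.var():.4f}  (theory: Var(X_1)/n = {1/n})")
```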
This variance reduction relies only on the absence of correlation, so it carries over unchanged to your case, where the variables are uncorrelated but not assumed independent:
\begin{eqnarray} \operatorname{Var}\left(\frac1n\sum X_i\right) &=& E\left[\left(\frac1n\sum X_i\right)^2\right]-E\left[\frac1n\sum X_i\right]^2 \\ &=& \frac1{n^2}\sum_{i,j}\left(E\left[X_iX_j\right]-E\left[X_i\right]E\left[X_j\right]\right) \\ &=& \frac1{n^2}\sum_i\left(E\left[X_i^2\right]-E\left[X_i\right]^2\right) \\ &=& \frac1{n^2}\sum_i\operatorname{Var}\left(X_i\right) \\ &=& \frac1n\operatorname{Var}\left(X_1\right)\;, \end{eqnarray}
where the step that turns the double sum into a single sum uses the fact that the mixed terms ($i\ne j$) are exactly the covariances, which vanish for uncorrelated variables, and the last step uses that the $X_i$ are identically distributed, so all their variances are equal.
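To see this at work without independence, here is a small simulation sketch using one standard construction of identically distributed, pairwise uncorrelated, yet strongly dependent variables; the construction $X_i=R_iZ$ and all numeric choices below are illustrative, not taken from the question:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 100, 20_000

# X_i = R_i * Z with a single shared Z ~ N(0,1) and independent random
# signs R_i.  Each X_i is N(0,1); Cov(X_i, X_j) = E[R_i]E[R_j]E[Z^2] = 0
# for i != j, yet the X_i are far from independent: all |X_i| coincide.
z = rng.standard_normal((trials, 1))
r = rng.choice([-1.0, 1.0], size=(trials, n))
x = r * z

s_n = x.mean(axis=1)
print(f"Var(S_n) = {s_n.var():.4f}  (theory: Var(X_1)/n = {1/n})")
```

The measured variance comes out near $\frac1n$, matching the derivation: uncorrelatedness alone delivers the full $\frac1n$ reduction.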