
Let $N_n$ be an integer-valued random variable with $\frac{N_n}{n} \xrightarrow[]{P} a$, and let $S_n = X_1 + \cdots + X_n$, where the $X_i$ are i.i.d. with $EX_i = 0$ and $\operatorname{Var} X_i = 1$. Assume that $N_n$ and $\{X_i\}$ are independent. Prove that $$\frac{S_{N_n}}{\sqrt{n}} \xrightarrow[]{d} U \sim \mathcal{N}(0,a).$$

I tried to repeat the last part of the proof of the CLT, which uses characteristic functions:

$$\psi_{\frac{X_1 +\cdots + X_{N_n}}{\sqrt n}} (t) = \psi_{\frac{X_1}{\sqrt n}} (t) \cdots \psi_{\frac{X_{N_n}}{\sqrt n}} (t) = \left(\psi_{\frac{X_1}{\sqrt n}} (t) \right)^{N_n} = \left( 1 - \frac{t^2}{2n} + o\left(\frac{1}{n}\right) \right)^{n \cdot\frac{N_n}{n}} \rightarrow \left(e^{-\frac{t^2}{2}}\right)^{a} = e^{-\frac{at^2}{2}}.$$ Of course, this passage to the limit is not rigorous because $N_n$ is random, but it conveys the idea. I do not see how to use the convergence in probability $\frac{N_n}{n} \xrightarrow[]{P} a$.
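A quick Monte Carlo sanity check supports the claimed limit. The sketch below makes two illustrative assumptions of mine: the $X_i$ are standard normal (so that, given $N_n = m$, $S_m \sim N(0, m)$ can be sampled directly), and $N_n \sim \text{Poisson}(an)$, which is integer-valued, independent of the $X_i$, and satisfies $N_n/n \to_P a$ by the law of large numbers.

```python
import numpy as np

rng = np.random.default_rng(0)
n, a, trials = 10_000, 0.5, 20_000

# N_n ~ Poisson(a*n): integer-valued, independent of the X_i,
# and N_n/n -> a in probability by the law of large numbers.
N = rng.poisson(a * n, size=trials)

# With standard normal X_i, conditionally on N_n = m the sum
# S_m ~ N(0, m), so S_{N_n} can be sampled without summing.
S_N = rng.standard_normal(trials) * np.sqrt(N)

Z = S_N / np.sqrt(n)
print(np.mean(Z), np.var(Z))  # close to 0 and to a
```

The empirical mean and variance of $S_{N_n}/\sqrt{n}$ come out close to $0$ and $a$, consistent with a $\mathcal{N}(0,a)$ limit.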

  • Are $N_n$ and $\{X_n\}$ independent? Commented Feb 18 at 22:42
  • @Zhanxiong Yes, thanks for the question. Commented Feb 19 at 7:34
  • I wonder whether the result is even true without some uniform integrability assumption. Even almost sure convergence of $N_n/n$ to $1$ does not guarantee that the variance converges to $1$: take $N_n = n$ with probability $1 - 1/n$ and $N_n$ astronomically large with probability $1/n$. This gives a very high variance for $S_{N_n}$. Commented Feb 19 at 9:28
  • @JayanthRVarma No. The variance of the limiting distribution and the limit of the variances are completely different things. Consider your own example: $N_n/n$ converges almost surely (hence in probability, hence weakly) to $1$. Does the large variance of $N_n/n$ prevent it from converging to a constant, something with zero variance? Commented Feb 19 at 11:53
  • @Q9y5 You are right. Commented Feb 21 at 4:47

2 Answers


On second thought, the conclusion holds even without the independence assumption between $N_n$ and $\{X_i\}$. The proof goes as follows (the key is to apply Kolmogorov's inequality).

Without loss of generality, we may assume $a = 1$. Rewrite $\frac{S_{N_n}}{\sqrt{n}}$ as \begin{align*} \frac{S_{N_n}}{\sqrt{n}} = \frac{S_n}{\sqrt{n}} + \frac{S_{N_n} - S_n}{\sqrt{n}}. \tag{1}\label{1} \end{align*} By the classical CLT, the first term on the right-hand side of $\eqref{1}$ converges in distribution to $N(0, 1)$. By Slutsky's theorem, it therefore suffices to show that the remainder term satisfies \begin{align*} \frac{S_{N_n} - S_n}{\sqrt{n}} \Rightarrow 0. \tag{2}\label{2} \end{align*}

To prove $\eqref{2}$, note that for any $\epsilon > 0$, \begin{align*} & P[|S_{N_n} - S_n| \geq \sqrt{n}\epsilon] \\ =\ & P[|S_{N_n} - S_n| \geq \sqrt{n}\epsilon, |N_n - n| \geq \epsilon^3 n] + P[|S_{N_n} - S_n| \geq \sqrt{n}\epsilon, |N_n - n| < \epsilon^3 n] \\ \leq\ & P[|N_n - n| \geq \epsilon^3 n] + P\left[\max_{|k - n| < \epsilon^3 n}|S_k - S_n| \geq \sqrt{n}\epsilon\right]. \tag{3}\label{3} \end{align*}

The first term on the right-hand side of $\eqref{3}$ converges to $0$ as $n \to \infty$ because $\frac{N_n}{n} \to_P 1$. To bound the second term, apply Kolmogorov's inequality (or, more precisely, repeat its proof): \begin{align*} P\left[\max_{|k - n| < \epsilon^3 n}|S_k - S_n| \geq \sqrt{n}\epsilon\right] \leq \frac{1}{n\epsilon^2}\sum_{k: |k - n| < \epsilon^3 n}\operatorname{Var}(X_k) \leq \frac{2n\epsilon^3}{n\epsilon^2} = 2\epsilon. \end{align*} Since $\epsilon > 0$ is arbitrary, this completes the proof.
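The final Kolmogorov-type bound can be checked numerically. The sketch below (my own illustration, with standard normal increments and illustrative values of $n$ and $\epsilon$) estimates $P\left[\max_{|k - n| < \epsilon^3 n}|S_k - S_n| \geq \sqrt{n}\epsilon\right]$ by simulating only the increments inside the window, since those are all that $S_k - S_n$ depends on, and compares the frequency with the bound $2\epsilon$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps, trials = 20_000, 0.3, 2_000
w = int(eps**3 * n)  # window half-width, roughly eps^3 * n steps

# Only increments with |k - n| < eps^3 * n affect max_k |S_k - S_n|,
# so simulate just the ~2w steps around index n.
exceed = 0
for _ in range(trials):
    fwd = np.cumsum(rng.standard_normal(w))   # S_{n+j} - S_n, j = 1..w
    bwd = np.cumsum(rng.standard_normal(w))   # S_n - S_{n-j}, j = 1..w
    m = max(np.abs(fwd).max(), np.abs(bwd).max())
    exceed += m >= np.sqrt(n) * eps

print(exceed / trials, "<=", 2 * eps)  # empirical freq vs the 2*eps bound
```

The empirical exceedance frequency sits comfortably below $2\epsilon$, as the inequality predicts.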

  • Then we can conclude: if $N_n$ and the $X_i$ are independent, the statement can be proved as @Q9y5 suggested, without Kolmogorov's inequality, BUT the statement is also true without the independence assumption; one just needs Kolmogorov's inequality. Am I right? Commented Feb 19 at 14:41
  • Nice, $S_n$ is a martingale, so $S_n/\sqrt{n}$ indeed satisfies the uniform continuity required for the convergence; upvoted. Commented Feb 19 at 14:49
  • @Alex Yes, but I cannot endorse that Q9y5's proof is rigorous enough. Commented Feb 19 at 14:50
  • @Zhanxiong One more question: the reduction to the case $a = 1$ does not seem obvious to me. I needed to represent $\frac{S_{N_n}}{\sqrt{n}}$ as $\sqrt{a}\, \frac{S_{N_n}}{\sqrt{\lfloor na\rfloor}} \left( 1 + O\left(\frac{1}{\sqrt{n}}\right) \right)$ and then continue with $\frac{S_{N_n}}{\sqrt{\lfloor na\rfloor}}$. Does that look right? Commented Feb 20 at 21:30
  • @Alex Good question. A more straightforward decomposition is to rewrite $\frac{S_{N_n}}{\sqrt{n}}$ as $\frac{S_{N_n}}{\sqrt{\lfloor na \rfloor}} \times \frac{\sqrt{\lfloor na \rfloor}}{\sqrt{n}}$, then use the $a = 1$ case and Slutsky's theorem (the second factor converges to $\sqrt{a}$ as $n \to \infty$). Commented Feb 20 at 22:22

Update: Since $S_{n}$ is a partial-sum process, the uniform continuity condition required in Aldous (1978) holds automatically by Doob's martingale inequality. Therefore, the result remains true even if $N_{n}$ and $S_{n}$ are not independent. See @Zhanxiong's post for a detailed proof.

However, independence is not completely useless: the convergence $S_{N_{n}}/\sqrt{N_{n}}\to_{\mathcal{L}|N_{n}}\mathcal{N}(0,1)$ together with $N_{n}/n\to_{\mathbb{P}}a$ yields a joint stable convergence in law of $(S_{N_{n}}/\sqrt{N_{n}},N_{n}/n)$. As a result, with independence, the desired convergence still holds even if $a$ is a random variable with $0<a<\infty$ a.s., and the limit would be $\mathcal{MN}(0,a)$, a mixed normal distribution.

For independent $N_{n}$ the argument is straightforward: $N_{n}\to\infty$ with probability approaching $1$, and conditional on this event $S_{N_{n}}/\sqrt{N_{n}}\to_{\mathcal{L}|N_{n}}\mathcal{N}(0,1)$; since the limit does not depend on $N_{n}$, the convergence also holds unconditionally. Applying Slutsky's lemma together with $N_{n}/n\to_{\mathbb{P}} a$ gives the desired result.

For general $N_{n}$ not independent of $S_{n}$, an additional (and also necessary) uniform continuity condition is needed; see Aldous, David J. (1978), "Weak convergence of randomly indexed sequences of random variables," Math. Proc. Camb. Phil. Soc. 83(1), 117-126.
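The mixed-normal limit with random $a$ can be illustrated by simulation. The sketch below uses assumptions of mine purely for illustration: normal $X_i$ (so $S_m$ given $N_n = m$ can be sampled directly), $N_n \sim \text{Poisson}(an)$ given $a$, and a two-point random $a$. Under $\mathcal{MN}(0,a)$ the variance of the limit is $E[a]$ and the kurtosis is $3\,E[a^2]/E[a]^2 > 3$, which distinguishes it from a plain normal with the same variance.

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 10_000, 50_000

# Random limit a: here a = 1 or 4 with equal probability (an
# illustrative choice; any a with 0 < a < inf a.s. works).
a = rng.choice([1.0, 4.0], size=trials)

# N_n ~ Poisson(a*n) given a, independent of the X_i; N_n/n ->_P a.
N = rng.poisson(a * n)
S_N = rng.standard_normal(trials) * np.sqrt(N)  # exact for normal X_i
Z = S_N / np.sqrt(n)

# MN(0, a): variance E[a] = 2.5, kurtosis 3*E[a^2]/E[a]^2 > 3,
# i.e. heavier tails than a plain normal with matched variance.
kurt = np.mean(Z**4) / np.var(Z)**2
print(np.var(Z), kurt)
```

The sample variance lands near $E[a] = 2.5$ and the kurtosis near $3 \cdot 8.5 / 6.25 \approx 4.08$, so the limit is a scale mixture of normals rather than a single normal.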

  • Could you please explain the step where we claim that $S_{N_n}/\sqrt{N_n}$ converges to $\mathcal{N}(0,1)$ in distribution? The difficulty is that $N_n$ converges to $\infty$ only in probability, while the CLT involves ordinary convergence. I cannot follow the trick with conditional convergence. Commented Feb 19 at 21:32
  • Conditioning on $N_{n}$ means treating $N_{n}$ as nonrandom, so your characteristic-function method applies. By the definition of convergence in distribution, for any $\epsilon>0$ there exists $N_{\epsilon}$ such that $|\mathbb{P}(S_{N_{n}}/\sqrt{N_{n}}\leq x\mid N_{n})-\Phi(x)|<\epsilon$ for all $N_{n}\geq N_{\epsilon}$. The law of iterated expectations, $\mathbb{E}\bigl[\mathbb{E}[1_{A}\mid \mathcal{G}]\bigr]=\mathbb{E}[1_{A}]$, shows this inequality holds without conditioning as long as $N_{n}\geq N_{\epsilon}$, and convergence in probability gives $\mathbb{P}(N_{n}\geq N_{\epsilon})\to 1$. Commented Feb 20 at 5:20
  • Ok, put it this way: let $A$ be the event $S_{N_{n}}/\sqrt{N_{n}}\leq x$ and $B$ the event $N_{n}>N_{\epsilon}$. We have $|\mathbb{P}(A\mid B)-\Phi(x)|<\epsilon/2$ and want $|\mathbb{P}(A)-\Phi(x)|<\epsilon$ for all $n$ greater than some $n_{\epsilon}$. By the law of total probability, $\mathbb{P}(A)-\Phi(x)=\mathbb{P}(A\mid B)\mathbb{P}(B)-\Phi(x)\mathbb{P}(B)+\mathbb{P}(A\mid B^{\complement})\mathbb{P}(B^{\complement})-\Phi(x)\mathbb{P}(B^{\complement})<\epsilon/2+\mathbb{P}(B^{\complement})$. Commented Feb 21 at 1:26
  • The probability of $B^{\complement}=\{N_{n}<N_{\epsilon}\}$ can be bounded by $\epsilon/2$ for $n$ sufficiently large; similarly for the other side. Commented Feb 21 at 1:29
  • That is much clearer, thank you! I would add indices $A_n$ and $B_n$, but that is obviously not essential. So we have convergence in distribution of $S_{N_n}/\sqrt{N_n}$ to $\mathcal{N}(0,1)$ (without any conditioning), and it remains to apply Slutsky's lemma. Commented Feb 21 at 11:28
