Questions tagged [probability-limit-theorems]
For questions about limit theorems of probability theory, such as the law of large numbers, the central limit theorem, or the law of the iterated logarithm.
1,647 questions
6 votes
1 answer
206 views
Product of iid random variables with expectation at most 1
Suppose that $X_i$ are iid positive random variables with $\mathbb{E}(X_i) \leq 1$ and $\operatorname{Var}(X_i) > 0$. Let $P_n = \prod_{i=1}^n X_i$. I want to show that almost surely $\lim_{n ...
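A quick numerical sanity check of the claim (my own sketch, not from the question): the standard route is to apply the SLLN to $\log X_i$, and Jensen's inequality (strict, since $\operatorname{Var} > 0$) gives $\mathbb{E}[\log X_1] < \log \mathbb{E}[X_1] \leq 0$, so $P_n \to 0$ a.s. Taking $X_i \sim \mathrm{Exp}(1)$ (mean 1, positive variance) as a concrete instance:

```python
import numpy as np

# X_i ~ Exp(1): E[X_i] = 1 and Var(X_i) = 1 > 0.  By the SLLN applied to
# log X_i, (1/n) log P_n -> E[log X_1], which is strictly negative by
# Jensen (E[log Exp(1)] = -Euler gamma ~ -0.577), so P_n -> 0 a.s.
rng = np.random.default_rng(0)
n = 5000
x = rng.exponential(1.0, size=n)
log_P = np.cumsum(np.log(x))      # log P_n = sum_{i<=n} log X_i
print(log_P[-1] / n)              # close to -0.577
print(np.exp(log_P[-1]))          # underflows to 0.0: the product has collapsed
```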
2 votes
2 answers
131 views
Intuitive Explanation for Convergence in Probability and Convergence in Distribution
Having a bit of trouble with the definitions of convergence in probability and convergence in distribution for random variables. The textbook (DeGroot) defines each as follows: Convergence in ...
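A classic counterexample often used to separate the two notions (my illustration, not from the textbook): take $X \sim N(0,1)$ and $X_n = -X$ for every $n$. Each $X_n$ has exactly the $N(0,1)$ law, so $X_n \to X$ in distribution trivially, yet $|X_n - X| = 2|X|$ never shrinks, so there is no convergence in probability.

```python
import numpy as np

# X ~ N(0,1), X_n = -X: same distribution for every n (convergence in
# distribution is about the laws), but X_n stays far from X pathwise
# (no convergence in probability).
rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
x_n = -x
# The empirical quantiles of X and X_n agree (identical laws):
print(np.quantile(x, [0.25, 0.5, 0.75]))
print(np.quantile(x_n, [0.25, 0.5, 0.75]))
# But P(|X_n - X| > 1) = P(2|X| > 1) ~ 0.617, bounded away from 0:
print(np.mean(np.abs(x_n - x) > 1.0))
```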
4 votes
2 answers
205 views
Weak Law of Large Numbers for Uncorrelated Random Variables Bounded in $L^p$, $p \in [1,2)$
Let $\{X_n\}_{n=1}^{\infty}$ be a sequence of square integrable real random variables defined on a probability space $(\Omega, F, P)$. Assume that the sequence is uncorrelated: $\mathbb{E}[(X_i-\...
0 votes
0 answers
55 views
$\lim_{n \rightarrow \infty} \sum_{j=n}^{\infty} P(A_j) = 0$ where the $A_j$ are disjoint sets.
While going through a proof in probability theory, I got stuck at a point where $\{A_j\}$ are disjoint events in the sample space $S$. By the definition of a probability measure, $P(A_j) \geq 0, \...
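For what it's worth, the usual one-line resolution (assuming the $A_j$ are as described): disjointness plus countable additivity make the series of probabilities convergent, and tails of a convergent series vanish.

```latex
\sum_{j=1}^{\infty} P(A_j) \;=\; P\Big(\bigcup_{j=1}^{\infty} A_j\Big) \;\le\; 1
\quad\Longrightarrow\quad
\lim_{n \to \infty} \sum_{j=n}^{\infty} P(A_j) = 0 .
```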
0 votes
0 answers
41 views
Allowing sign changes in the normalizing constants in the Fisher–Tippett–Gnedenko theorem
Let $(X_i)_{i \ge 1}$ be i.i.d. random variables with distribution function $F$, and define $M_n = \max(X_1, \ldots, X_n)$. The classical Fisher–Tippett–Gnedenko theorem states that if there exist ...
5 votes
3 answers
184 views
Compute die roll cumulative sum hitting probabilities without renewal theory
Suppose we roll a fair die repeatedly and record the cumulative sum. The probability that the cumulative sum ever hits $k$ tends to $u(k) \approx 1/\mu = 2/7$ as $k \rightarrow \infty$, from renewal ...
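The renewal-theory value is easy to confirm by direct Monte Carlo (a sketch of mine, bypassing the theory entirely): for large $k$ the hitting probability should sit near $1/\mu = 2/7 \approx 0.2857$, where $\mu = 3.5$ is the mean roll.

```python
import numpy as np

# Estimate P(cumulative sum of fair die rolls ever equals k) for k = 100.
rng = np.random.default_rng(2)
k, trials = 100, 20_000
hits = 0
for _ in range(trials):
    s = 0
    while s < k:              # roll until the running total reaches or passes k
        s += rng.integers(1, 7)
    hits += (s == k)          # did we land exactly on k?
print(hits / trials)          # roughly 2/7 ~ 0.2857
```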
2 votes
2 answers
131 views
$N$ boxes with 2 balls in each; pick a ball randomly from a nonempty box each time. What is the expected number of boxes still holding 2 balls after $N$ picks?
I recently encountered the following probability problem: suppose there are $N$ boxes and each box contains precisely 2 balls. Each time, we pick one ball randomly from those nonempty boxes, ...
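A simulation sketch under one possible reading of "pick a ball randomly" (my assumption; the question may intend picking a box uniformly instead): each pick chooses a ball uniformly among all balls still present, i.e. $N$ of the $2N$ balls are removed uniformly without replacement. Under that reading a fixed box keeps both balls with probability $\binom{2N-2}{N}/\binom{2N}{N} = \frac{N(N-1)}{2N(2N-1)}$, so the expected count of full boxes is about $N/4$ for large $N$.

```python
import numpy as np

# Monte Carlo check of the "remove N of 2N balls uniformly" reading.
rng = np.random.default_rng(3)
N, trials = 50, 4000
exact = N * (N * (N - 1)) / ((2 * N) * (2 * N - 1))   # ~12.37 for N = 50
counts = []
for _ in range(trials):
    balls = np.repeat(np.arange(N), 2)                # two balls per box, labeled by box
    removed = rng.choice(2 * N, size=N, replace=False)
    left = np.delete(balls, removed)                  # balls surviving the N picks
    full = np.sum(np.bincount(left, minlength=N) == 2)
    counts.append(full)
print(np.mean(counts), exact)
```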
8 votes
2 answers
172 views
Lower limit of the means of independent random variables
Let $X_1,X_2,\cdots$ be a sequence of independent random variables with $$\mathbb{P}(X_n=-1)=\frac{1}{2},\quad \mathbb{P}(X_n=4n) = \frac{1}{8n},\quad \mathbb{P}(X_n=0)=\frac{1}{2}-\frac{1}{8n}$$ Let $...
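A quick sanity check on this setup (my own remark): each $X_n$ has mean zero, while the large values $4n$ still occur infinitely often almost surely.

```latex
\mathbb{E}[X_n]
  = (-1)\cdot\tfrac{1}{2} \;+\; 4n\cdot\tfrac{1}{8n}
    \;+\; 0\cdot\Big(\tfrac{1}{2}-\tfrac{1}{8n}\Big)
  = -\tfrac{1}{2} + \tfrac{1}{2} = 0,
\qquad
\sum_{n\ge 1} \mathbb{P}(X_n = 4n) = \sum_{n\ge 1} \tfrac{1}{8n} = \infty,
```

so by the second Borel--Cantelli lemma (the $X_n$ are independent) the event $\{X_n = 4n\}$ happens infinitely often a.s., which is what makes the lower limit of the means delicate.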
2 votes
0 answers
128 views
Central limit theorem (Berry-Esseen theorem) for a sum of a random number of random variables - from centered to non-centered variables?
The following Berry-Esseen theorem was obtained by Stein's method: Theorem (Chaidee and Keammanee, 2008, Theorem 2.1). Let $X_1, X_2, \dots$ be independent, identically-distributed random variables ...
1 vote
0 answers
25 views
Conditions for convergence in distribution for a sum of random variables
I have an $N \times N$ matrix $Q$ of independent Bernoulli-weighted exponential random variables with means $0\leq\lambda_{ij} < \infty$, $$ Q_{ij} \sim \textrm{Bernoulli}(\mu_{ij}) \times \textrm{...
0 votes
1 answer
42 views
GMM parameter estimation (Inverse problem of the EM algorithm)
We have a GMM model $\sum_{k=1}^N \alpha_k \mathcal{N}(x; \mu_k, \sigma_k^2 I)$. Assume that the GMM is a single Gaussian distribution $\mathcal{N}(x; \mu_\theta, \sigma_\theta^2 I)$. It means that we ...
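One common way to collapse a mixture into a single Gaussian is moment matching (a sketch of mine; the question may intend something else, e.g. an EM-based fit). In one dimension, the single Gaussian agreeing with the mixture's first two moments has $\mu = \sum_k \alpha_k \mu_k$ and, by the law of total variance, $\sigma^2 = \sum_k \alpha_k (\sigma_k^2 + \mu_k^2) - \mu^2$.

```python
import numpy as np

def collapse(alpha, mu, sigma2):
    """Moment-match a 1-D Gaussian mixture to a single Gaussian.

    alpha: mixture weights, mu: component means, sigma2: component variances.
    Returns (mean, variance) of the matched Gaussian (helper name is mine).
    """
    alpha, mu, sigma2 = map(np.asarray, (alpha, mu, sigma2))
    m = np.sum(alpha * mu)
    v = np.sum(alpha * (sigma2 + mu**2)) - m**2   # law of total variance
    return m, v

m, v = collapse([0.5, 0.5], [0.0, 2.0], [1.0, 1.0])
print(m, v)   # 1.0 2.0
```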
-1 votes
1 answer
39 views
A seeming contradiction in the definition of a transient state in a Markov chain
My book says that, by the definition of a transient state $i$ in a Markov chain: let (the first return time) $T_i=\inf\{n \ge 1: X_n=i \mid X_0=i\}$; then $P(T_i < \infty \mid X_0=i) <$ ...
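A concrete transient chain may help ground the definition (my toy example, not the book's): a random walk on the integers stepping $+1$ with probability $0.7$ and $-1$ with probability $0.3$. State $0$ is transient because the return probability is $1 - |p - q| = 0.6 < 1$, i.e. with probability $0.4$ the chain drifts away and never comes back.

```python
import numpy as np

# Estimate P(T_0 < infinity) for a +1/-1 walk with up-probability 0.7.
# Returns after the step cap are exponentially unlikely (upward drift),
# so truncation barely biases the estimate.
rng = np.random.default_rng(4)
trials, cap = 5_000, 1_000
returned = 0
for _ in range(trials):
    pos = 0
    for _ in range(cap):
        pos += 1 if rng.random() < 0.7 else -1
        if pos == 0:          # first return to the start
            returned += 1
            break
print(returned / trials)      # close to 1 - |0.7 - 0.3| = 0.6
```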
3 votes
1 answer
99 views
Central limit theorem for dependent Bernoullis on regular graphs
I am trying to determine whether a set of (slightly) dependent, negatively correlated Bernoulli random variables satisfies a Central Limit Theorem (CLT). Let $\mathcal{G}_{reg}$ denote the uniform ...
1 vote
1 answer
84 views
Limiting distribution for a sum of fat-tailed (power law) random variables?
Takács (1959, Example 2) states: Let $\{ \xi_n \}$ be a sequence of independent and identically distributed positive random variables, with $ \xi_0 = 0 $. Let $\zeta_n = \xi_0 + \xi_1 + \dots + \xi_n$...
1 vote
1 answer
58 views
Integral condition for tightness of probability measures.
In the book Gradient Flows by Ambrosio, Gigli and Savaré, there is a useful condition (Remark $5.1.5$, an integral condition for tightness). More precisely, to check that a set $K\subseteq \mathbb{P}(X)$ is ...
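For reference, the condition as I recall it (check the book for the precise statement): $K \subseteq \mathbb{P}(X)$ is tight if and only if there exists $\varphi : X \to [0,+\infty]$ whose sublevel sets $\{\varphi \le c\}$ are compact and

```latex
\sup_{\mu \in K} \int_X \varphi(x) \, d\mu(x) \;<\; +\infty .
```

The "if" direction is just Markov's inequality applied to $\varphi$: the sets $\{\varphi \le c\}$ are compact and carry mass at least $1 - \varepsilon$ uniformly in $\mu \in K$ once $c$ is large.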