
Questions tagged [random-variables]

Questions about measurable maps from a probability space to a measurable space.

2 votes · 1 answer · 46 views

Suppose we have the sum of independent random variables $$ Z_{ij} = X_i + Y_{ij}, $$ where $X_i \sim \text{Uniform}(-d, d)$ and $Y_{ij} \sim \text{Normal}(\mu, \sigma)$. Given that only one sample of ...
asked by Roy Smart
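A quick way to sanity-check the moments of such a sum is simulation; the parameter values below are illustrative assumptions, not taken from the question:

```python
import numpy as np

# Hypothetical parameters; the question does not fix d, mu, or sigma.
rng = np.random.default_rng(0)
d, mu, sigma, n = 2.0, 1.0, 0.5, 200_000

x = rng.uniform(-d, d, n)     # X_i ~ Uniform(-d, d)
y = rng.normal(mu, sigma, n)  # Y_ij ~ Normal(mu, sigma)
z = x + y                     # Z_ij = X_i + Y_ij

# For independent summands: E[Z] = mu, Var[Z] = d**2/3 + sigma**2
print(z.mean(), z.var())
```

The empirical mean and variance should match $\mu$ and $d^2/3 + \sigma^2$, since means and variances of independent summands add.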
0 votes · 0 answers · 75 views

I have heard that the original proof of Chebyshev's inequality did not use Markov's inequality. I would like to know whether this is accurate, and whether there is any alternative proof of Chebyshev's ...
asked by Eric_
3 votes · 1 answer · 356 views

Recently, I read some materials on tightness of random variables and probability measures. There are two definitions: Definition 1. A sequence of random variables $(X_n)_{n \ge 1}$ is tight if for ...
asked by FactorY
3 votes · 2 answers · 288 views

I've noticed that there is a strange, unexplained property of the Pearson correlation coefficient $$\rho(X,Y) = \frac{\operatorname{Cov} (X,Y)}{\sqrt {\operatorname{Var}X} \sqrt {\operatorname{Var}Y} }$$ ...
asked by Alex
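One frequently surprising property of $\rho$ worth checking numerically is its exact invariance under positive affine transformations of either variable; the data below are an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = 0.6 * x + rng.normal(size=100_000)  # y correlated with x

def pearson(a, b):
    # rho(X, Y) = Cov(X, Y) / sqrt(Var X * Var Y)
    return np.cov(a, b)[0, 1] / np.sqrt(a.var(ddof=1) * b.var(ddof=1))

r1 = pearson(x, y)
r2 = pearson(3 * x + 7, y)  # positive affine map: rho is unchanged
print(r1, r2)
```

The scaling factor 3 appears in both the covariance and the standard deviation, so it cancels exactly; here the population value is $0.6/\sqrt{1.36} \approx 0.515$.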
2 votes · 1 answer · 44 views

Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space and let $(\mathcal{X}, \mathcal{F})$ be a measurable space and $X:(\Omega, \mathcal{A})\rightarrow (\mathcal{X}, \mathcal{F})$ a random ...
asked by guest1
6 votes · 4 answers · 405 views

Let $X,Y$ be two i.i.d. random variables. I am trying to bound the expected distance between them, that is, $E[|X-Y|]$. I know that $$ E[X-Y] = E[X]- E[Y] = 0,$$ but what about $|X-Y|$? One ...
asked by Eric_
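For one concrete choice of distribution (standard normal, an assumption; the question leaves the law open), $E|X-Y|$ has a closed form and a simple Cauchy–Schwarz upper bound, both easy to check by Monte Carlo:

```python
import numpy as np

# Assumed concrete case: X, Y i.i.d. standard normal, so X - Y ~ N(0, 2)
# and E|X - Y| = sqrt(2) * sqrt(2/pi) = 2/sqrt(pi).
rng = np.random.default_rng(2)
n = 500_000
x, y = rng.normal(size=n), rng.normal(size=n)

est = np.abs(x - y).mean()
# Cauchy-Schwarz: E|X-Y| <= sqrt(E[(X-Y)^2]) = sqrt(2 * Var(X))
bound = np.sqrt(2 * x.var())
exact = 2 / np.sqrt(np.pi)
print(est, exact, bound)
```

The general bound $E|X-Y| \le \sqrt{2\operatorname{Var}X}$ holds for any i.i.d. pair with finite variance; the closed form is specific to the normal case.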
3 votes · 1 answer · 80 views

I am working on the following exercise. Let $$X_1 \sim \mathrm{Exp}\left(\tfrac12\right), \qquad X_2 \sim \mathrm{Exp}\left(\tfrac12\right),$$ independent. Define $$Y_1 = X_1 + 2X_2, \qquad Y_2 = 2X_1 ...
asked by Pizza
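The excerpt truncates before $Y_2$ is fully defined, so the sketch below only illustrates $Y_1 = X_1 + 2X_2$, under the assumption that $\mathrm{Exp}(\tfrac12)$ means rate $\tfrac12$ (mean 2):

```python
import numpy as np

# Assumption: Exp(1/2) denotes rate 1/2, i.e. mean 2 (NumPy uses the scale
# parameterization, scale = 1/rate = 2). Y2 is truncated in the question,
# so only Y1 is simulated.
rng = np.random.default_rng(3)
n = 400_000
x1 = rng.exponential(scale=2.0, size=n)
x2 = rng.exponential(scale=2.0, size=n)
y1 = x1 + 2 * x2

# Linearity:    E[Y1]   = E[X1] + 2 E[X2]       = 2 + 4  = 6
# Independence: Var[Y1] = Var[X1] + 4 Var[X2]   = 4 + 16 = 20
print(y1.mean(), y1.var())
```

Checking the first two moments against linearity of expectation is a quick way to validate the change-of-variables setup before computing the joint density.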
2 votes · 2 answers · 131 views

I'm having a bit of trouble with the definitions of convergence in probability and convergence in distribution for random variables. The textbook (DeGroot) defines each as follows: Convergence in ...
asked by miscast
0 votes · 1 answer · 68 views

In the following, we assume a two-dimensional discrete random variable $\vec{X}=[X_1,X_2]$ on $\mathbb{R}^2$, where the range of values of both $X_1$ and $X_2$ is countably infinite, and they are ...
asked by user1405622
5 votes · 1 answer · 296 views

Let $X$ be a real-valued random variable, and define its moment generating function (MGF) as $$ M_X(s) = \mathbb{E}[e^{sX}], $$ where $\mathbb{E}[\cdot]$ denotes the expected value of the random ...
asked by Anne
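A concrete instance (an assumption, since the question keeps $X$ general): for $X \sim N(0,1)$ the MGF has the closed form $M_X(s) = e^{s^2/2}$, which a Monte Carlo average of $e^{sX}$ reproduces:

```python
import numpy as np

# Assumed case: X ~ N(0, 1), for which M_X(s) = exp(s**2 / 2).
rng = np.random.default_rng(4)
x = rng.normal(size=1_000_000)
s = 0.5
mc = np.exp(s * x).mean()   # Monte Carlo estimate of E[e^{sX}]
closed = np.exp(s**2 / 2)   # known closed form for the standard normal
print(mc, closed)
```

Note that the Monte Carlo estimator has finite variance here only because the normal MGF exists for all $s$; for heavy-tailed $X$ the expectation may be infinite.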
3 votes · 1 answer · 109 views

I am trying to rigorously derive the diffusion equation, given by $$ \frac{\partial u}{\partial t} = D\,\frac{\partial^2 u}{\partial x^2}, \qquad D = \frac{h^2}{2\tau}, $$ from a simple one-...
asked by sam wolfe
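Assuming the usual random-walk model behind this derivation (steps of size $\pm h$ every $\tau$ seconds), the key identity is that the position variance after $N$ steps equals $Nh^2 = 2Dt$ with $t = N\tau$, which simulation confirms:

```python
import numpy as np

# Assumed model: symmetric walk, step +-h every tau seconds. After N steps
# (time t = N*tau) the position variance is N*h**2 = 2*D*t, with
# D = h**2 / (2*tau) as in the diffusion equation.
rng = np.random.default_rng(5)
h, tau, N, walkers = 0.1, 0.01, 1000, 200_000

k = rng.binomial(N, 0.5, size=walkers)  # number of +h steps out of N
positions = h * (2 * k - N)             # final displacement of each walker

D = h**2 / (2 * tau)
t = N * tau
print(positions.var(), 2 * D * t)       # both close to N * h**2 = 10.0
```

This matching of variances is exactly what survives the $h, \tau \to 0$ limit with $h^2/\tau$ held fixed, giving the diffusion coefficient $D$.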
0 votes · 0 answers · 60 views

This question may be a little trivial, but I was wondering whether we can construct a bivariate (or multivariate) probability distribution function such that it mixes a singular and an ...
asked by Lucas
6 votes · 1 answer · 624 views

I start with \$1. After one iteration of a game, one of the following $m$ outcomes occurs: With probability $p_1$, my wealth multiplies by $r_1$; With probability $p_2$, my wealth multiplies by $r_2$;...
asked by Andrés Mejía
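For a hypothetical two-outcome instance of this game (the question leaves the $p_i, r_i$ general), the expected wealth after $n$ independent rounds is $(\sum_i p_i r_i)^n$, while the almost-sure long-run growth is governed by $\sum_i p_i \log r_i$:

```python
import numpy as np

# Hypothetical example: double with probability 0.5, halve with probability 0.5.
p = np.array([0.5, 0.5])
r = np.array([2.0, 0.5])

n = 10
expected_wealth = (p @ r) ** n  # E[W_n] = (sum_i p_i r_i)^n by independence
growth_rate = p @ np.log(r)     # per-round log growth (law of large numbers)
print(expected_wealth, growth_rate)
```

This example highlights the gap between the two notions: the expectation grows like $1.25^n$, yet the log growth rate is $0$, so a typical trajectory does not grow at all.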
0 votes · 1 answer · 64 views

Is this conjecture correct? If not, can it be modified into a correct one: Let $X,Y$ be continuous RVs with joint PDF $f(x,y)$. Then $X,Y$ are independent iff there exist functions $g, h$ such that $$...
asked by SRobertJames
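One direction of the conjecture is easy to probe numerically: for a joint density that factors, $f(x,y) = g(x)h(y)$, the empirical joint CDF should match the product of the marginal CDFs (a consequence, not a proof; the distributions below are assumptions):

```python
import numpy as np

# Sketch for a factorizable joint: f(x, y) = e^{-x} * e^{-y} on the positive
# quadrant, i.e. X and Y independent Exp(1).
rng = np.random.default_rng(7)
n = 200_000
x = rng.exponential(size=n)  # g(x) = e^{-x}
y = rng.exponential(size=n)  # h(y) = e^{-y}

a, b = 1.0, 2.0
joint = np.mean((x <= a) & (y <= b))          # P(X <= a, Y <= b)
product = np.mean(x <= a) * np.mean(y <= b)   # P(X <= a) * P(Y <= b)
print(joint, product)
```

The subtlety in the conjecture itself is the "for almost all $(x,y)$" qualifier: densities are only defined up to null sets, so the factorization need only hold almost everywhere.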
1 vote · 1 answer · 123 views

I'm not too familiar with random matrix theory so I cannot find a suitable reference for this question. Consider a set of matrices $\{A_i\}_{i=1}^k\subseteq M_{d\times d}$ over the complex field and ...
asked by Another User
