
In Williams' "Probability with Martingales" he defines a regular conditional probability given $\mathcal{G} \subset \mathcal{F}$ as the map $$ \mathbf{P}(\cdot, \cdot) : \Omega \times \mathcal{F} \to [0,1] $$ such that \begin{align*} &1.\quad \forall F \in \mathcal{F}, \omega \mapsto \mathbf{P}(\omega,F) \text{ is a version of } \mathbf{P}(F \mid \mathcal{G}),\\ &2.\quad\text{ for almost every } \omega \in \Omega, F \mapsto \mathbf{P}(\omega,F) \text{ is a probability measure on } (\Omega,\mathcal{F}). \end{align*} Property 1 tells us that for any $F \in \mathcal{F}$ and any $G \in \mathcal{G}$, \begin{align*} \int_G \mathbf{P}(\omega, F) \, \mathrm{d}P(\omega) & = \int_G \mathbf{P}(F \mid \mathcal{G})(\omega) \, \mathrm{d}P(\omega) \\ & = \int_G E(\mathbb{I}_F \mid \mathcal{G})(\omega) \, \mathrm{d}P(\omega) \\ & = \int_G \mathbb{I}_F(\omega) \, \mathrm{d}P(\omega) \\ & = P(F \cap G). \end{align*}
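For intuition, here is the standard discrete example (my own sanity check, not from Williams' text): if $\mathcal{G}$ is generated by a countable partition $\{G_1, G_2, \dots\}$ of $\Omega$ with each $P(G_i) > 0$, then $$ \mathbf{P}(\omega, F) = \frac{P(F \cap G_i)}{P(G_i)} \quad \text{for } \omega \in G_i $$ works: for fixed $F$, the map $\omega \mapsto \mathbf{P}(\omega,F)$ is constant on each cell, hence $\mathcal{G}$-measurable, and integrating it over $G = G_i$ gives $P(F \cap G_i)$, which is exactly the defining equation; for fixed $\omega$, $F \mapsto \mathbf{P}(\omega,F)$ is plainly a probability measure.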

Fair enough, but I'm trying to get a handle on the regular conditional probability given a random variable $X: \Omega \to \mathbb{R}$. It seems like we should define it as the map $$ \mathbf{P}(\cdot, \cdot) : \mathbb{R} \times \mathcal{F} \to [0,1] $$ such that \begin{align*} &1.\quad \forall F \in \mathcal{F}, x \mapsto \mathbf{P}(x,F) \text{ is a version of } \mathbf{P}(F \mid \sigma(X)),\\ &2.\quad\text{ for almost every } x \in \mathbb{R}, F \mapsto \mathbf{P}(x,F) \text{ is a probability measure on } (\Omega, \mathcal{F}). \end{align*} However, I'm not sure how to get the "defining equation" as above, but here's my shot. Let $F \in \mathcal{F}$ and $C \in \sigma(X) \subset \mathcal{F}$. Note $C = X^{-1}(B)$ for some $B \in \mathcal{B}$. Letting $\Lambda_X$ be the distribution of $X$, by property 1 we get \begin{align*} \int_B \mathbf{P}(x,F) \, \mathrm{d}\Lambda_X(x) & = \int_C \mathbf{P}(X(\omega), F) \, \mathrm{d} P(\omega) \\ & = \int_C \mathbf{P}(F \mid \sigma(X))(\omega) \, \mathrm{d} P(\omega) \\ & = \int_C E(\mathbb{I}_F \mid \sigma(X))(\omega) \, \mathrm{d} P(\omega) \\ & = \int_C \mathbb{I}_F(\omega) \, \mathrm{d} P(\omega) \\ & = P(F \cap \{X \in B\}). \end{align*} Does this seem correct?

Update: Based on Conrado Costa's answer, I think the proper definition for the regular conditional probability given $X$ should be as follows. Below, let $X: (\Omega, \mathcal{F}) \to (E, \mathcal{E})$.

Assume the regular conditional probability given $\sigma(X)$ exists. That is, there exists a function $\mathbf{P}^2 : \Omega \times \mathcal{F} \to [0,1]$ such that

  1. $\forall F \in \mathcal{F}$, $\omega \mapsto \mathbf{P}^2(\omega,F)$ is a version of $\mathbf{P}(F \mid \sigma({X}))$,
  2. $\forall \omega \in \Omega$, $F \mapsto \mathbf{P}^2(\omega, F)$ is a probability measure on $(\Omega, \mathcal{F})$.

In particular, property 1 tells us that $\forall F \in \mathcal{F}$, $\omega \mapsto \mathbf{P}^2(\omega,F)$ is $(\sigma(X), \mathcal{B}[0,1])$-measurable. Then, by Theorem 18 listed below in Conrado Costa's answer, there exists a function $\mathbf{P}^1: E \times \mathcal{F} \to [0,1]$ that is $(\mathcal{E}, \mathcal{B}[0,1])$-measurable such that $\forall F \in \mathcal{F}$, $\omega \mapsto \mathbf{P}^2(\omega,F) = \mathbf{P}^1(X(\omega),F)$.

So, for any $F \in \mathcal{F}$ and any $C \in \sigma(X)$, writing $C = X^{-1}(B)$ for some $B \in \mathcal{E}$, by the change of variables theorem we get \begin{align*} \int_{B} \mathbf{P}^1(x,F) \, d\Lambda_X(x) & = \int_C \mathbf{P}^1(X(\omega),F) \, dP(\omega) \\ & = \int_C \mathbf{P}^2(\omega,F) \, dP(\omega) \\ & = \int_C \mathbf{P}(F \mid \sigma(X))(\omega) \, dP(\omega) \\ & = \int_C E(\mathbb{I}_F \mid \sigma(X))(\omega) \, dP(\omega) \\ & = \int_C \mathbb{I}_F(\omega) \, dP(\omega) \\ & = P(F \cap \{X \in B\}). \end{align*}
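As a sanity check, this defining identity can be verified numerically on a finite probability space, where $\mathbf{P}^1(x,F)$ reduces to the elementary conditional probability $P(F \mid X = x)$. This is a toy example of my own (two fair coin flips, with $X$, $F$, and $B$ chosen for illustration), not a construction from the book:

```python
from fractions import Fraction
from itertools import product

# Toy finite probability space: two fair coin flips, X = number of heads.
# We build P^1(x, F) = P(F | X = x) by hand and check the defining identity
#     sum_{x in B} P^1(x, F) * Lambda_X({x}) = P(F and {X in B}).
omega = list(product("HT", repeat=2))        # sample space: 4 equally likely outcomes
prob = {w: Fraction(1, 4) for w in omega}    # the uniform measure P

def X(w):
    return w.count("H")                      # X(omega) = number of heads

F = {w for w in omega if w[0] == "H"}        # event F: the first flip is heads

def law_X(x):
    """Lambda_X({x}) = P(X = x), the distribution of X."""
    return sum(prob[w] for w in omega if X(w) == x)

def P1(x, event):
    """P^1(x, F) = P(F | X = x), the elementary conditional probability."""
    return sum(prob[w] for w in event if X(w) == x) / law_X(x)

B = {1, 2}                                   # a set in the range of X
lhs = sum(P1(x, F) * law_X(x) for x in B)    # integral of P^1(., F) over B w.r.t. Lambda_X
rhs = sum(prob[w] for w in F if X(w) in B)   # P(F and {X in B})
assert lhs == rhs == Fraction(1, 2)
```

Here $\mathbf{P}^1(1,F) = 1/2$ and $\mathbf{P}^1(2,F) = 1$, so both sides equal $\tfrac12 \cdot \tfrac12 + 1 \cdot \tfrac14 = \tfrac12$, matching $P(\text{first flip heads}) \cdot$ restricted to $\{X \ge 1\}$.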


1 Answer


This is a subtle question.

You wrote: \begin{align*} &1.\quad \forall F \in \mathcal{F}, x \mapsto \mathbf{P}(x,F) \text{ is a version of } \mathbf{P}(F \mid \sigma(X)),\\ &2.\quad\text{ for almost every } x \in \mathbb{R}, F \mapsto \mathbf{P}(x,F) \text{ is a probability measure on } (\Omega, \mathcal{F}). \end{align*}

For 1, it is important to note that the first $\mathbf{P}$ (the one in $\mathbf{P}(x,F)$) is a function from $\Bbb{R} \times \mathcal{F}$ into $[0,1]$, while the second $\mathbf{P}$ (the one in $\mathbf{P}(F \mid \sigma(X))$) is a function from $\Omega \times \mathcal{F}$ into $[0,1]$.

So let's give them different names: call the first one $\mathbf{P}^1(x,F)$ and the second one $\mathbf{P}^2(\omega, F)$.

The second one is readily defined from your previous considerations, as you already know how to deal with regular conditional probability distributions.

The first one needs the following theorem (taken from Meyer's Probability and Potentials, page 10):

(Image: Theorem 18, the factorization lemma: if $g$ is $\sigma(X)$-measurable, then $g = h \circ X$ for some measurable $h$.)

It tells us that $\mathbf{P}^2(\omega,F) = \mathbf{P}^1(X(\omega),F)$. Use this $\mathbf{P}^1(x,F)$ as your object, and then you can safely say that

\begin{align*} \int_B \mathbf{P}^1(x,F) \, \mathrm{d}\Lambda_X(x) & = \int_C \mathbf{P}^1(X(\omega), F) \, \mathrm{d} P(\omega) = \int_C\mathbf{P}^2(\omega, F) \, \mathrm{d} P(\omega) \; (\ldots)\\ \end{align*}
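For what it's worth (an immediate consequence, not spelled out in the answer): taking $C = \Omega$, i.e. $B = \Bbb{R}$, the resulting identity specializes to $$ P(F) = \int_{\Bbb{R}} \mathbf{P}^1(x, F) \, \mathrm{d}\Lambda_X(x), $$ which is the continuous analogue of the law of total probability, with $\mathbf{P}^1(x,F)$ playing the role of $P(F \mid X = x)$.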

  • Thanks for the great explanation. Do you have a good definition for the regular conditional probability given $X$, then? Commented Sep 15, 2015 at 23:54
  • :) The best definition I know is in the book by Karatzas, Brownian Motion and Stochastic Calculus, on page 307. Should you need more detail, just ask. Commented Sep 15, 2015 at 23:57
  • I updated my question based on your answer; would you mind seeing if my understanding is correct? Commented Sep 20, 2015 at 15:37
  • I.e., $\mathbf{P}^1$ is the $h$ in Theorem 18, which is guaranteed to exist as soon as $\mathbf{P}^2$ (which would be $g$ in the theorem) exists? Commented Sep 20, 2015 at 15:46
  • Your understanding is correct. Commented Sep 20, 2015 at 19:26
