Let $Y_i$ be noisy measurements of a normal random variable $X$ with zero mean and variance $\sigma_X^2$, such that

$$Y_i = X + W_i$$

The noise terms $W_i$ are mutually independent, independent of $X$, and zero mean with variance $\sigma_W^2$. I am trying to find the minimum mean square error (MMSE) estimate of $X$ and the maximum a posteriori (MAP) estimate of $X$. My question is: how do I calculate the conditional or joint density of $X$ and $Y$, given that they are not independent? The MMSE and MAP estimates will be straightforward once I can properly calculate the conditional and joint densities.

If $X$ and $Y$ were independent, I could simply calculate $f_{x,y}(x,y)=f_x(x)f_y(y)$ then calculate $f_{x|y}(x|y) = \frac{f_{x,y}(x,y)}{f_y(y)}$.

Or, if I could calculate $f_{y|x}(y|x)$, then I could simply calculate $f_{x|y}(x|y) = \frac{f_{y|x}(y|x)f_x(x)}{f_y(y)}$.
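Whichever route is taken, the result can be sanity-checked numerically. Below is a quick Monte Carlo sketch (with made-up parameters $\sigma_X = 2$, $\sigma_W = 1$ and a single measurement, not values from the question) comparing the empirical conditional mean $E[X \mid Y \approx y_0]$ against the closed-form posterior mean for jointly Gaussian $(X, Y)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed (illustrative) parameters: one measurement Y = X + W
sigma_X, sigma_W = 2.0, 1.0
n = 1_000_000

x = rng.normal(0.0, sigma_X, n)   # X ~ N(0, sigma_X^2)
w = rng.normal(0.0, sigma_W, n)   # W ~ N(0, sigma_W^2), independent of X
y = x + w                         # Y = X + W

# Empirical E[X | Y ~ y0]: average X over samples with Y in a narrow window
y0, h = 1.5, 0.05
mask = np.abs(y - y0) < h
empirical = x[mask].mean()

# Closed-form posterior mean for jointly Gaussian (X, Y):
# E[X | Y = y] = sigma_X^2 / (sigma_X^2 + sigma_W^2) * y
closed_form = sigma_X**2 / (sigma_X**2 + sigma_W**2) * y0

print(empirical, closed_form)
```

With a million samples the two numbers agree to within a couple of hundredths, which supports the linear-in-$y$ form of the conditional mean.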


1 Answer


Note that if $X$ and $W$ are independent, then $f_{w|x} = f_w$. Also, since $Y = X + W$, we have $f_{y|x}(y|x) = f_{w|x}(y-x \mid x) = f_{w}(y-x)$.
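For completeness, here is one way this observation can be carried through (a sketch for a single measurement $Y = X + W$; the $n$-measurement case is analogous). By Bayes' rule,

$$f_{x|y}(x|y) = \frac{f_w(y-x)\,f_x(x)}{f_y(y)} \propto \exp\!\left(-\frac{(y-x)^2}{2\sigma_W^2}-\frac{x^2}{2\sigma_X^2}\right),$$

which, after completing the square in $x$, is a Gaussian density with mean $\frac{\sigma_X^2}{\sigma_X^2+\sigma_W^2}\,y$ and variance $\frac{\sigma_X^2\sigma_W^2}{\sigma_X^2+\sigma_W^2}$. Since a Gaussian posterior has its mean and mode at the same point, the MMSE and MAP estimates coincide here.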

  • Why does $f_{y|x}(y|x) = f_{w|x}(y-x)$? Commented Nov 10, 2016 at 18:40
  • Please explain, I don't understand why these are equivalent. Commented Nov 10, 2016 at 19:08
  • It follows from the fact that $f_{y,x}(y,x) = f_{w,x}(y-x,x)$. For details, I would recommend Probability, Random Variables and Stochastic Processes by Athanasios Papoulis and S. U. Pillai; specifically, have a look at Example 6-24 of the fourth edition. Commented Nov 10, 2016 at 21:24
