--

**Multiple Continuous Random Variables:**

Let X and Y be jointly continuous random variables with joint PDF $f_{X,Y}$.

The joint, marginal, and conditional PDFs are related as follows:

$f_{X,Y}(x,y)$ = $f_Y(y)f_{X|Y}(x|y)$

$f_X(x)$ = $\int_{-\infty}^{\infty}f_{X,Y}(x,y)dy$ = $\int_{-\infty}^{\infty}f_Y(y)f_{X|Y}(x|y)dy$

The conditional PDF $f_{X|Y}(x|y)$ is defined only for those y for which $f_Y(y) > 0$.

These PDFs can be used to calculate probabilities as follows:

$P((X,Y) \in B)$ = $\int \int_{(x,y) \in B} f_{X,Y}(x,y)dxdy$

$P(X \in A)$ = $\int_{x \in A} f_X(x)dx$

$P(X \in A | Y=y)$ = $\int_{x \in A} f_{X|Y}(x|y)dx$
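The formulas above can be checked numerically. A minimal sketch, using the illustrative joint PDF $f_{X,Y}(x,y) = x + y$ on the unit square (the PDF choice and function names are assumptions, not from the source):

```python
# Illustrative joint PDF: f_{X,Y}(x, y) = x + y on [0,1]^2, zero elsewhere.
def f_xy(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

N = 1000
dx = dy = 1.0 / N

# Marginal f_X(x) = integral of f_{X,Y}(x, y) dy, via a midpoint Riemann sum
def f_x(x):
    return sum(f_xy(x, (j + 0.5) * dy) * dy for j in range(N))

# P(X <= 0.5) = integral of f_X(x) over x <= 0.5; exact value is 0.375
p = sum(f_x((i + 0.5) * dx) * dx for i in range(N // 2))
print(round(p, 3))  # ≈ 0.375
```

Here the marginal is computed exactly as in the formula $f_X(x) = \int f_{X,Y}(x,y)dy$, then integrated over the event $\{X \leq 0.5\}$.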

They can also be used to calculate expectations:

E[g(X)] = $\int g(x)f_X(x)dx$

E[g(X,Y)] = $\int \int g(x,y)f_{X,Y}(x,y)dxdy$

E[g(X)|Y=y] = $\int g(x)f_{X|Y}(x|y)dx$

E[g(X,Y)|Y=y] = $\int g(x,y)f_{X|Y}(x|y)dx$
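The expectation formula $E[g(X)] = \int g(x)f_X(x)dx$ can likewise be evaluated numerically. A sketch under an assumed distribution (X ~ Exponential(1), $g(x) = x^2$, for which the exact answer is 2):

```python
import math

# Assumed example: X ~ Exponential(1), so f_X(x) = e^{-x} for x >= 0
f_X = lambda x: math.exp(-x)
g = lambda x: x * x               # g(X) = X^2; E[X^2] = 2 for Exponential(1)

# Midpoint Riemann sum on [0, 40]; the tail beyond 40 is negligible
N, hi = 200_000, 40.0
h = hi / N
E = sum(g((i + 0.5) * h) * f_X((i + 0.5) * h) * h for i in range(N))
print(round(E, 4))  # ≈ 2.0
```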

For any event A, we have the following version of the total probability theorem:

P(A) = $\int P(A | Y=y)f_Y(y)dy$

And there are the following versions of the total expectation theorem:

E[X] = $\int E[X|Y=y]f_Y(y)dy$

E[g(X)] = $\int E[g(X)|Y=y]f_Y(y)dy$

E[g(X,Y)] = $\int E[g(X,Y)|Y=y]f_Y(y)dy$
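The total expectation theorem can be verified numerically for a concrete joint PDF. A sketch using the illustrative $f_{X,Y}(x,y) = x + y$ on $[0,1]^2$, where the marginal is $f_Y(y) = y + 1/2$ and (working $\int_0^1 x(x+y)dx = 1/3 + y/2$ by hand) $E[X|Y=y] = (1/3 + y/2)/(y + 1/2)$:

```python
# Compare E[X] computed directly against the total expectation theorem
# E[X] = integral of E[X|Y=y] f_Y(y) dy, for f_{X,Y}(x,y) = x + y on [0,1]^2.
N = 500
h = 1.0 / N
pts = [(i + 0.5) * h for i in range(N)]

f_Y = lambda y: y + 0.5                          # marginal of Y
E_X_given = lambda y: (1/3 + y/2) / (y + 0.5)    # E[X|Y=y], derived by hand

# Right-hand side of the theorem
rhs = sum(E_X_given(y) * f_Y(y) * h for y in pts)

# Direct double integral: E[X] = double integral of x * f_{X,Y}(x,y) dx dy
lhs = sum(x * (x + y) * h * h for x in pts for y in pts)

print(round(lhs, 3), round(rhs, 3))  # both ≈ 0.583 (= 7/12)
```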

**Independence:**

Continuous random variables X and Y are independent iff

$f_{X,Y}(x,y)$ = $f_X(x)f_Y(y)$ for all x,y

and the following properties hold for independent X and Y:

E[XY] = E[X]E[Y]

E[g(X)h(Y)] = E[g(X)]E[h(Y)]

var(X + Y) = var(X) + var(Y)
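When the joint PDF factors as $f_X(x)f_Y(y)$, the product rule $E[XY] = E[X]E[Y]$ follows by splitting the double integral. A quick numeric sketch with assumed distributions (X ~ Uniform(0,1), Y ~ Uniform(0,2)):

```python
# Numeric check that E[XY] = E[X]E[Y] when f_{X,Y}(x,y) = f_X(x) f_Y(y).
N = 1000
hx, hy = 1.0 / N, 2.0 / N
xs = [(i + 0.5) * hx for i in range(N)]
ys = [(j + 0.5) * hy for j in range(N)]

f_X = lambda x: 1.0        # Uniform(0,1) PDF
f_Y = lambda y: 0.5        # Uniform(0,2) PDF

E_X = sum(x * f_X(x) * hx for x in xs)            # exact value 0.5
E_Y = sum(y * f_Y(y) * hy for y in ys)            # exact value 1.0
E_XY = sum(x * y * f_X(x) * f_Y(y) * hx * hy for x in xs for y in ys)

print(round(E_XY, 3), round(E_X * E_Y, 3))  # both ≈ 0.5
```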

--

**Bayes' Rule for Continuous Random Variables:**

Let Y be a continuous random variable. If X is also a continuous random variable, then

$f_{X|Y}(x|y)$ = $\frac{f_X(x)f_{Y|X}(y|x)}{f_Y(y)}$ = $\frac{f_X(x)f_{Y|X}(y|x)}{\int f_X(t)f_{Y|X}(y|t)dt}$
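A numeric sketch of this formula, with an assumed prior and likelihood (X ~ Uniform(0,1) and (Y | X = x) ~ Normal(x, 0.01); these choices are illustrative, not from the source):

```python
import math

# Assumed prior: X ~ Uniform(0,1)
prior = lambda x: 1.0 if 0 <= x <= 1 else 0.0
# Assumed likelihood: (Y | X = x) ~ Normal(mean x, variance 0.01)
lik = lambda y, x: math.exp(-(y - x) ** 2 / 0.02) / math.sqrt(0.02 * math.pi)

y_obs = 0.3
N = 2000
h = 1.0 / N
xs = [(i + 0.5) * h for i in range(N)]

# Denominator f_Y(y) = integral of f_X(t) f_{Y|X}(y|t) dt
evidence = sum(prior(t) * lik(y_obs, t) * h for t in xs)
posterior = [prior(x) * lik(y_obs, x) / evidence for x in xs]

# The posterior is a valid PDF: it integrates to 1
print(round(sum(q * h for q in posterior), 3))  # 1.0
```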

If A is an event, we have

P(A|Y=y)$f_Y(y)$ = P(A)$f_{Y|A}(y)$

So, P(A|Y=y) = $\frac{P(A)f_{Y|A}(y)}{P(A)f_{Y|A}(y) + P(A^c)f_{Y|A^c}(y)}$

If N is a discrete random variable, we can choose A to be the event N = n, which gives

P(N=n|Y=y)$f_Y(y)$ = $p_N(n)f_{Y|N}(y|n)$

So, P(N=n|Y=y) = $\frac{p_N(n)f_{Y|N}(y|n)}{\sum_i p_N(i)f_{Y|N}(y|i)}$
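A sketch of the discrete-N case under assumed distributions (N is a fair coin flip taking values 0 or 1, and (Y | N = n) ~ Normal(n, 1)):

```python
import math

p_N = {0: 0.5, 1: 0.5}   # assumed PMF: fair coin flip
# Standard normal PDF shifted to mean mu (variance 1)
norm_pdf = lambda y, mu: math.exp(-(y - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

def posterior(n, y):
    # P(N=n|Y=y) = p_N(n) f_{Y|N}(y|n) / sum_i p_N(i) f_{Y|N}(y|i)
    num = p_N[n] * norm_pdf(y, n)
    den = sum(p_N[i] * norm_pdf(y, i) for i in p_N)
    return num / den

# At y = 0.5, exactly between the two means, both values are equally likely
print(round(posterior(1, 0.5), 3))  # 0.5
```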

--

**Derived Distributions:**

This describes how to calculate the PDF of Y, given Y = g(X) and the PDF of X.

The approach is to calculate the CDF of Y and then differentiate it:

$F_Y(y)$ = $P(g(X) \leq y)$ = $\int_{x|g(x) \leq y} f_X(x) dx$

and, $f_Y(y)$ = $\frac{dF_Y}{dy}(y)$

When Y = aX + b with $a \neq 0$, then

$f_Y(y)$ = $\frac{1}{|a|}f_X(\frac{y-b}{a})$
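The linear case can be checked against a known distribution. A sketch assuming X ~ Normal(0,1) and Y = 2X + 1, so Y ~ Normal(1,4) and the formula should reproduce the Normal(1,4) density:

```python
import math

# Standard normal PDF
f_X = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
a, b = 2.0, 1.0

# The linear-change formula: f_Y(y) = (1/|a|) f_X((y - b)/a)
f_Y = lambda y: f_X((y - b) / a) / abs(a)

# Normal(mean 1, variance 4) PDF, written out directly for comparison
normal14 = lambda y: math.exp(-(y - 1) ** 2 / 8) / math.sqrt(8 * math.pi)

print(round(f_Y(0.0), 6), round(normal14(0.0), 6))  # identical values
```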

When Y = g(X) and g is *strictly* monotonic, there always exists a function h such that

y = g(x) iff x = h(y)

Assuming h is differentiable, we have

$f_Y(y)$ = $f_X(h(y))|\frac{dh}{dy}(y)|$
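A sketch of the monotonic case under an assumed setup: X ~ Exponential(1) and $Y = e^X$, so $h(y) = \ln y$ and the formula gives $f_Y(y) = f_X(\ln y) \cdot |1/y| = 1/y^2$ for $y > 1$:

```python
import math

# Exponential(1) PDF
f_X = lambda x: math.exp(-x) if x >= 0 else 0.0

h = math.log                # inverse of g(x) = e^x
dh = lambda y: 1.0 / y      # derivative of h

# The monotonic-transformation formula: f_Y(y) = f_X(h(y)) |h'(y)|
f_Y = lambda y: f_X(h(y)) * abs(dh(y))

print(round(f_Y(2.0), 6))  # 0.25, matching 1/y^2 at y = 2
```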
