Monday, September 20, 2010

Introduction to Probability, Ch. 3 (General Random Variables) notes - Part 1

A "continuous" random variable can take any value on the real number line unlike a discrete random variable.

The PDF (Probability Density Function) of a continuous random variable X, the counterpart of the PMF of a discrete random variable, is a nonnegative function $f_X$ s.t.
P($X \in B$) = $\int_B f_X(x)dx$
for every subset B of the real line. In particular, the probability that the value of X falls within an interval is
P($a \leq X \leq b$) = $\int_a^b f_X(x)dx$.
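
As a quick numerical check of this formula, here is a minimal Python sketch (the exponential PDF, the rate $\lambda = 2$, and the interval [0.5, 1.5] are arbitrary illustrative choices; scipy does the integration):

from scipy.integrate import quad
import math

lam = 2.0  # rate of an exponential PDF, chosen arbitrarily

def f_X(x):
    # exponential PDF: lam*exp(-lam*x) for x >= 0, and 0 otherwise
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

a, b = 0.5, 1.5
p, _ = quad(f_X, a, b)  # P(a <= X <= b) = integral of f_X over [a, b]
print(p)                                        # ~0.3181
print(math.exp(-lam * a) - math.exp(-lam * b))  # closed form, should match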

Note that for a single value a, P(X = a) = $\int_a^a f_X(x)dx$ = 0. Hence, excluding or including the endpoints of an interval has no effect on its probability. That is,
P($a \leq X \leq b$) = P($a < X < b$) = P($a \leq X < b$) = P($a < X \leq b$)

A PDF also has to satisfy the normalization property: $\int_{-\infty}^{\infty} f_X(x)dx$ = 1

Also, note that the value of a PDF at a point is not the probability of any particular event, and hence it is not restricted to be less than or equal to 1. It just has to be nonnegative and satisfy the normalization property stated above.
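
For example, if X is uniform on the interval [0, 1/2], then $f_X(x) = 2$ for $0 \leq x \leq 1/2$ (and 0 elsewhere), which exceeds 1 everywhere on that interval, yet $\int_0^{1/2} 2dx$ = 1 as required.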

-- Expectation and Variance --
Let X be a continuous random variable with PDF $f_X(x)$.

E[X] = $\int_{-\infty}^{\infty} xf_X(x)dx$
E[g(X)] = $\int_{-\infty}^{\infty} g(x)f_X(x)dx$

var(X) = E[(X - E[X])^2] = $\int_{-\infty}^{\infty} (x - E[X])^2f_X(x)dx$ = $E[X^2] - (E[X])^2$

If Y = aX + b, then
E[Y] = aE[X] + b
var(Y) = $a^2$var(X)
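
For example, if X is uniform on [0, 1], then $f_X(x) = 1$ on [0, 1], so E[X] = $\int_0^1 x dx$ = 1/2, $E[X^2]$ = $\int_0^1 x^2 dx$ = 1/3, and var(X) = $\frac{1}{3} - (\frac{1}{2})^2$ = 1/12. For Y = 2X + 3, E[Y] = 2(1/2) + 3 = 4 and var(Y) = $2^2 \cdot \frac{1}{12}$ = 1/3.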


-- Cumulative Distribution Function --
The CDF is a unified concept for dealing with discrete as well as continuous random variables. The CDF of X is denoted by $F_X$ and is defined by
$F_X(x)$ = P($X \leq x$) = $\sum_{k \leq x} p_X(k)$ , if X is discrete
$F_X(x)$ = P($X \leq x$) = $\int_{-\infty}^{x} f_X(t)dt$, if X is continuous

Conceptually, $F_X(x)$ "accumulates" probability "up to" the value x.
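
For example, for an exponential random variable with $f_X(x) = \lambda e^{-\lambda x}$ for $x \geq 0$ (and 0 for x < 0), we get $F_X(x)$ = $\int_0^x \lambda e^{-\lambda t}dt$ = $1 - e^{-\lambda x}$ for $x \geq 0$, and $F_X(x) = 0$ for x < 0.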

If X is discrete and integer-valued, then we can obtain the PMF of X from the CDF of X as follows:
$p_X(k)$ = $P(X \leq k)$ - $P(X \leq k-1)$ = $F_X(k)$ - $F_X(k-1)$, for all integers k
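
For example, if X is the roll of a fair six-sided die, then $F_X(k) = k/6$ for k = 1, ..., 6, so $p_X(k)$ = $k/6 - (k-1)/6$ = 1/6 for each k.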

If X is continuous, then the PDF can be obtained by differentiating the CDF (assuming the derivative exists):
$f_X(x)$ = $\frac{dF_X}{dx}(x)$
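
Continuing the exponential example above, differentiating $F_X(x) = 1 - e^{-\lambda x}$ for x > 0 gives back $f_X(x) = \lambda e^{-\lambda x}$.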

-- Conditioning on an event --
Let A be some event with P(A) > 0. Then the conditional PDF of X given A, denoted $f_{X|A}$, is defined so that
$P(X \in B | A)$ = $\int_B f_{X|A}(x)dx$

In the special case when the conditioning event is of the form $X \in A$, for some subset A of the real line, we have
$P(X \in B | X \in A)$ = $\frac{P(X \in B, X \in A)}{P(X \in A)}$ = $\frac{\int_{A \cap B}f_X(x)dx}{P(X \in A)}$

So, $\int_B f_{X|A}(x)dx$ = $\frac{\int_{A \cap B}f_X(x)dx}{P(X \in A)}$

Hence, for conditioning events of the type $X \in A$,
$f_{X|A}(x)$ = $\frac{f_X(x)}{P(X \in A)}$, if $x \in A$, and 0 otherwise
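
For example, if X is exponential with parameter $\lambda$ and the conditioning event is A = {$X > t$}, then $P(X \in A) = e^{-\lambda t}$, so
$f_{X|A}(x)$ = $\frac{\lambda e^{-\lambda x}}{e^{-\lambda t}}$ = $\lambda e^{-\lambda (x-t)}$, for x > t
i.e., the conditional distribution of X given X > t is a shifted exponential. This is the memorylessness property of the exponential.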

A version of total probability theorem:
Let $A_1, A_2, ..., A_n$ form a partition of the sample space, with $P(A_i) > 0$ for each i. Then
$f_X(x)$ = $\sum_{i=1}^{n} P(A_i)f_{X|A_i}(x)$
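
For example, suppose $A_1$ and $A_2$ each have probability 1/2, X is uniform on [0, 1] given $A_1$, and X is uniform on [1, 2] given $A_2$. Then $f_X(x)$ = $\frac{1}{2} \cdot 1 + \frac{1}{2} \cdot 0$ = $\frac{1}{2}$ for $x \in [0, 1]$, and similarly 1/2 for $x \in [1, 2]$, i.e., X is uniform on [0, 2].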

Conditional expectation:
E[X|A] = $\int_{-\infty}^{\infty} xf_{X|A}(x)dx$
E[g(X)|A] = $\int_{-\infty}^{\infty} g(x)f_{X|A}(x)dx$

Total expectation theorem:
E[X] = $\sum_{i=1}^{n} P(A_i)E[X|A_i]$
E[g(X)] = $\sum_{i=1}^{n} P(A_i)E[g(X)|A_i]$
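
In the two-event example above, E[X|$A_1$] = 1/2 and E[X|$A_2$] = 3/2, so E[X] = $\frac{1}{2} \cdot \frac{1}{2} + \frac{1}{2} \cdot \frac{3}{2}$ = 1, which is indeed the mean of a uniform distribution on [0, 2]. A quick simulation check of the total expectation theorem (a minimal Python sketch of the same example; the sample size is an arbitrary choice):

import random

def sample_X():
    # pick A1 or A2 with probability 1/2 each, then draw X from the
    # corresponding conditional distribution
    if random.random() < 0.5:
        return random.uniform(0.0, 1.0)  # X given A1 ~ uniform on [0, 1]
    return random.uniform(1.0, 2.0)      # X given A2 ~ uniform on [1, 2]

n = 100_000
est = sum(sample_X() for _ in range(n)) / n
print(est)  # should be close to 0.5*0.5 + 0.5*1.5 = 1.0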
