Sunday, September 19, 2010

Introduction to Probability, Ch. 2 (Discrete Random Variables) notes

A random variable is a real-valued function of the outcome of an experiment. It is called discrete if the range of that function is finite or countably infinite.
A discrete random variable has an associated PMF (probability mass function), which gives the probability of each numerical value that the random variable can take. If x is any possible value of a random variable X, the probability mass of x, denoted by $p_X(x)$, is the probability of the event {X = x} consisting of all the outcomes that give rise to a value of X equal to x, so
$p_X(x)$ = P({X = x})
Note that, by the normalization axiom of probability, $\sum_{x}p_X(x)$ = 1. Also, for any set S of possible values of X, we have P($X \in S$) = $\sum_{x \in S}p_X(x)$.
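As a quick sanity check, here is a minimal Python sketch of these definitions (the fair six-sided die and the dictionary representation are my own choices, not from the book):

```python
# PMF of a fair six-sided die, stored as {value: probability}.
p_X = {x: 1/6 for x in range(1, 7)}

# Normalization axiom: the probabilities sum to 1 (up to float rounding).
assert abs(sum(p_X.values()) - 1.0) < 1e-12

# P(X in S) for S = {2, 4, 6}: sum the PMF over S.
S = {2, 4, 6}
print(sum(p_X[x] for x in S))  # ≈ 0.5
```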

-- Expectation/Mean --
E[X] = $\sum_{x}xp_X(x)$
and for any function g(X) of X, E[g(X)] = $\sum_{x}g(x)p_X(x)$
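Continuing the same die sketch (my own example, not from the book), E[X] and E[g(X)] for g(x) = $x^2$ come straight from the PMF:

```python
# E[X] and E[g(X)] for the fair-die PMF, with g(x) = x**2.
p_X = {x: 1/6 for x in range(1, 7)}

E_X = sum(x * p for x, p in p_X.items())        # E[X] = 3.5
E_gX = sum(x**2 * p for x, p in p_X.items())    # E[X^2] = 91/6 ≈ 15.17
print(E_X, E_gX)
```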

-- Variance --
var(X) = E[$(X-E[X])^2$] = E[$X^2$] - ${(E[X])}^2$
Standard Deviation, $\sigma_X$ = $\sqrt{var(X)}$

-- nth moment of X --
nth moment of X = E[$X^n$]
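A small sketch checking that both expressions for var(X) agree on the die PMF, with a generic nth-moment helper (`nth_moment` is just an illustrative name of mine):

```python
# var(X) computed both ways, plus a generic nth-moment helper.
p_X = {x: 1/6 for x in range(1, 7)}
E_X = sum(x * p for x, p in p_X.items())

var_def = sum((x - E_X)**2 * p for x, p in p_X.items())     # E[(X - E[X])^2]
var_alt = sum(x**2 * p for x, p in p_X.items()) - E_X**2    # E[X^2] - (E[X])^2
assert abs(var_def - var_alt) < 1e-9                        # both equal 35/12 ≈ 2.917

def nth_moment(pmf, n):
    """E[X^n] for a PMF given as {value: probability}."""
    return sum(x**n * p for x, p in pmf.items())

sigma_X = var_def ** 0.5                                    # standard deviation
```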

Mean and Variance of a Linear Function of a Random Variable:
Let X and Y be two random variables s.t. Y = aX + b, where a and b are given scalars. Then
E[Y] = aE[X] + b
var(Y) = $a^2$var(X)
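These two rules can be verified numerically; this sketch uses the same die PMF and arbitrary values a = 2, b = 3 (my own choices):

```python
# Check E[aX + b] = aE[X] + b and var(aX + b) = a^2 var(X) for Y = 2X + 3.
p_X = {x: 1/6 for x in range(1, 7)}
a, b = 2, 3

E_X = sum(x * p for x, p in p_X.items())
var_X = sum((x - E_X)**2 * p for x, p in p_X.items())

# Y = aX + b has PMF p_Y(a*x + b) = p_X(x).
p_Y = {a * x + b: p for x, p in p_X.items()}
E_Y = sum(y * p for y, p in p_Y.items())
var_Y = sum((y - E_Y)**2 * p for y, p in p_Y.items())

assert abs(E_Y - (a * E_X + b)) < 1e-9
assert abs(var_Y - a**2 * var_X) < 1e-9
```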

-- Joint PMFs: --
Let X and Y be random variables associated with the same experiment.

The joint PMF $p_{X,Y}$ of X and Y is defined by
$p_{X,Y}(x,y)$ = P(X = x, Y = y)

The marginal PMFs of X and Y respectively are
$p_X(x)$ = $\sum_{y}p_{X,Y}(x,y)$
$p_Y(y)$ = $\sum_{x}p_{X,Y}(x,y)$

E[g(X,Y)] = $\sum_{x}\sum_{y}g(x,y)p_{X,Y}(x,y)$
In the special case where g(X,Y) is linear, i.e., g(X,Y) = aX + bY + c, we have E[aX + bY + c] = aE[X] + bE[Y] + c
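Here is a sketch of these formulas on a small made-up joint PMF over {0, 1} × {0, 1} (the numbers are illustrative, not from the book); the marginals come from summing over the other variable, and E[XY] from summing over all pairs:

```python
# A made-up joint PMF on {0,1} x {0,1}, stored as {(x, y): probability}.
p_XY = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}
xs = {x for x, _ in p_XY}
ys = {y for _, y in p_XY}

# Marginals: sum the joint PMF over the other variable.
p_X = {x: sum(p_XY[(x, y)] for y in ys) for x in xs}   # {0: 0.4, 1: 0.6}
p_Y = {y: sum(p_XY[(x, y)] for x in xs) for y in ys}   # {0: 0.3, 1: 0.7}

# E[g(X,Y)] for g(x, y) = x*y.
E_XY = sum(x * y * p for (x, y), p in p_XY.items())    # 0.4
```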

-- Conditional PMFs: --
The conditional PMF of X given an event A with P(A) > 0 is defined by
$p_{X|A}(x)$ = P(X = x | A)
It satisfies all the probability axioms, including normalization, that is, $\sum_{x}p_{X|A}(x)$ = 1

The conditional PMF of X, given the value y of another random variable Y (with $p_Y(y)$ > 0), is related to the joint PMF by
$p_{X,Y}(x,y)$ = $p_Y(y)p_{X|Y}(x|y)$

and this can be used to calculate the marginal PMF of X (similar to the total probability theorem)
$p_X(x)$ = $\sum_{y}p_Y(y)p_{X|Y}(x|y)$
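A sketch on the same made-up joint PMF, deriving $p_{X|Y}(x|y)$ by dividing by $p_Y(y)$ and recovering the marginal of X via the formula above:

```python
# Conditional PMF p_{X|Y}(x|y) = p_{X,Y}(x,y) / p_Y(y), and the marginal recovered from it.
p_XY = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}
xs = {x for x, _ in p_XY}
ys = {y for _, y in p_XY}
p_Y = {y: sum(p_XY[(x, y)] for x in xs) for y in ys}

p_X_given_Y = {(x, y): p_XY[(x, y)] / p_Y[y] for (x, y) in p_XY}

# Marginal of X via p_X(x) = sum_y p_Y(y) * p_{X|Y}(x|y).
p_X = {x: sum(p_Y[y] * p_X_given_Y[(x, y)] for y in ys) for x in xs}
assert abs(sum(p_X.values()) - 1.0) < 1e-12
```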

-- Conditional Expectation: --
The conditional expectation of X given an event A with positive probability is defined by
E[X|A] = $\sum_{x}xp_{X|A}(x)$
and for a function g(X), we have
E[g(X)|A] = $\sum_{x}g(x)p_{X|A}(x)$

The conditional expectation of X given a value y of Y is defined by
E[X|Y=y] = $\sum_{x}xp_{X|Y}(x|y)$

And this leads to the total expectation theorem
E[X] = $\sum_{y}p_Y(y)E[X|Y=y]$

Let $A_1, A_2, \ldots, A_n$ be positive-probability events that form a partition of the sample space. Then
E[X] = $\sum_{i=1}^{n}P(A_i)E[X|A_i]$

Let $A_1, A_2, \ldots, A_n$ be a partition of an event B, with $P(A_i \cap B)$ > 0 for all i. Then
E[X|B] = $\sum_{i=1}^{n}P(A_i|B)E[X|A_i \cap B]$
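The total expectation theorem can be checked on the same made-up joint PMF; the direct computation of E[X] and the weighted sum of conditional expectations give the same number:

```python
# Total expectation theorem: E[X] = sum_y p_Y(y) * E[X | Y = y].
p_XY = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}
xs = {x for x, _ in p_XY}
ys = {y for _, y in p_XY}
p_Y = {y: sum(p_XY[(x, y)] for x in xs) for y in ys}

# E[X | Y = y] computed from the conditional PMF for each y.
E_X_given_y = {y: sum(x * p_XY[(x, y)] / p_Y[y] for x in xs) for y in ys}

E_X_total = sum(p_Y[y] * E_X_given_y[y] for y in ys)
E_X_direct = sum(x * p for (x, y), p in p_XY.items())
assert abs(E_X_total - E_X_direct) < 1e-12   # both equal 0.6
```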


-- Independent Random Variables: --
Let A be an event, with P(A) > 0, and let X and Y be random variables associated with the same experiment. Then

X is independent of the event A if $p_{X|A}(x)$ = $p_X(x)$ for all x.

X and Y are independent if
$p_{X,Y}(x,y)$ = $p_X(x)p_Y(y)$ for all x, y
and in that case
E[XY] = E[X]E[Y]
E[g(X)h(Y)] = E[g(X)]E[h(Y)]
var(X+Y) = var(X) + var(Y)
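Finally, a sketch of the independence conditions: the joint PMF below is built as a product of marginals (my own numbers), so the factorization holds for every pair and E[XY] = E[X]E[Y] follows:

```python
# X and Y are independent iff p_{X,Y}(x,y) = p_X(x) * p_Y(y) for all x, y.
p_X = {0: 0.4, 1: 0.6}
p_Y = {0: 0.3, 1: 0.7}
p_XY = {(x, y): p_X[x] * p_Y[y] for x in p_X for y in p_Y}

independent = all(abs(p_XY[(x, y)] - p_X[x] * p_Y[y]) < 1e-12
                  for x in p_X for y in p_Y)

# Consequence of independence: E[XY] = E[X] * E[Y].
E_X = sum(x * p for x, p in p_X.items())
E_Y = sum(y * p for y, p in p_Y.items())
E_XY = sum(x * y * p for (x, y), p in p_XY.items())
assert independent and abs(E_XY - E_X * E_Y) < 1e-12
```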
