
Conditional Probability and Expectation

 

Conditional probability:

\Pr(X\mid Y) = \displaystyle \frac{\Pr(X \cap Y)}{\Pr(Y)}
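
For example, with a fair six-sided die, let X be the event "the roll is a 2" and Y the event "the roll is even"; then \Pr(X \cap Y) = 1/6, \Pr(Y) = 1/2, and

\Pr(X\mid Y) = \displaystyle \frac{1/6}{1/2} = \frac{1}{3}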

Bayes' Theorem:

\Pr(A\mid B) = \displaystyle \frac{\Pr(B \mid A)\Pr(A)}{\Pr(B)}
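
Reusing the die example with A = "the roll is a 2" and B = "the roll is even", so that \Pr(B\mid A) = 1:

\Pr(A\mid B) = \displaystyle \frac{(1)(1/6)}{1/2} = \frac{1}{3}

which agrees with the direct computation above.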

For continuous distributions:

f_X(x\mid y) = \displaystyle \frac{f_Y(y \mid x)f_X(x)}{f_Y(y)}
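
As a standard illustration (not from the original post), take X \sim N(0,1) and Y\mid X = x \sim N(x,1); then, dropping every factor that does not involve x,

f_X(x\mid y) \propto f_Y(y\mid x)f_X(x) \propto e^{-(y-x)^2/2}\,e^{-x^2/2} \propto e^{-(x - y/2)^2}

so X\mid Y = y \sim N(y/2, 1/2), with f_Y(y) supplying only the normalizing constant.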

Recall that for a joint density function f(x,y), the marginal density of X is

f_X(x) = \displaystyle \int_{-\infty}^\infty {f(x,y)dy}
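
For instance (an illustrative density not in the original post), if f(x,y) = x + y on the unit square 0 \le x, y \le 1, then

f_X(x) = \displaystyle \int_0^1 (x + y)\,dy = x + \tfrac{1}{2}, \qquad 0 \le x \le 1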

Law of Total Probability:  Suppose the events B_1, \dots, B_n satisfy \displaystyle \sum_{i=1}^n \Pr(B_i) = 1 and \Pr(B_i \cap B_j) = 0 for i \ne j (that is, the B_i partition the sample space); then for any event A,

\begin{array}{rl} \Pr(A) &= \displaystyle \sum_{i=1}^n \Pr(A \cap B_i) \\ &= \displaystyle \sum_{i=1}^n \Pr(B_i)\Pr(A\mid B_i) \end{array}
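
As a quick illustration (an urn setup not in the original post): a fair coin flip picks urn 1, holding 2 red and 1 blue ball, or urn 2, holding 1 red and 2 blue. With B_i the event that urn i is chosen and A the event that a red ball is drawn,

\Pr(A) = \Pr(B_1)\Pr(A\mid B_1) + \Pr(B_2)\Pr(A\mid B_2) = \tfrac{1}{2}\cdot\tfrac{2}{3} + \tfrac{1}{2}\cdot\tfrac{1}{3} = \tfrac{1}{2}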

In many cases, you will need to use the Law of Total Probability in conjunction with Bayes' Theorem to find \Pr(A) or \Pr(B), most often the denominator.
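
Continuing the urn example, the probability that urn 1 was chosen, given that a red ball was drawn, is

\Pr(B_1\mid A) = \displaystyle \frac{\Pr(A\mid B_1)\Pr(B_1)}{\Pr(A)} = \frac{(2/3)(1/2)}{1/2} = \frac{2}{3}

with the denominator \Pr(A) = 1/2 supplied by the Law of Total Probability above.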

For a continuous random variable X with density f(x):

\Pr(A) = \displaystyle \int_{-\infty}^\infty \Pr(A\mid X = x)\,f(x)\,dx
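
For example (a setup not in the original post), if the probability of heads X of a coin is uniform on (0,1), so f(x) = 1 there, and A is the event that a single toss lands heads, then

\Pr(A) = \displaystyle \int_0^1 x \, dx = \frac{1}{2}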

Conditional Mean (the double expectation formula):

E_X[X] = E_Y[E_X[X\mid Y]]
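
Returning to the urn example, let X be the indicator that a red ball is drawn and Y the urn chosen, so E_X[X\mid Y=1] = 2/3 and E_X[X\mid Y=2] = 1/3; then

E_Y[E_X[X\mid Y]] = \displaystyle \tfrac{1}{2}\cdot\tfrac{2}{3} + \tfrac{1}{2}\cdot\tfrac{1}{3} = \tfrac{1}{2} = E_X[X]

which matches \Pr(A) for the urn example above.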

