
The Bernoulli Shortcut

If X has a standard Bernoulli distribution, then it can take only the values 1 and 0, with probabilities q and 1-q respectively.  Any random variable that can take only two values is a scaled and translated version of the standard Bernoulli distribution.

Expected Value and Variance:

For a standard Bernoulli distribution, E[X] = q and Var(X) = q(1-q).  If Y is a random variable that can take only the values a and b, with probabilities q and (1-q) respectively, then

\begin{array}{rl} Y &= (a-b)X + b \\ E[Y] &= (a-b)E[X] + b \\ &= (a-b)q + b \\ Var(Y) &= (a-b)^2Var(X) \\ &= (a-b)^2q(1-q) \end{array}
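As a quick sanity check, here is a short Python sketch comparing the shortcut against the direct definitions of mean and variance (the values a = 5, b = 2, q = 0.3 are made-up numbers for illustration, not from the original post):

# Hypothetical example: Y takes the value a = 5 with probability q = 0.3
# and the value b = 2 with probability 1 - q = 0.7.
a, b, q = 5.0, 2.0, 0.3

# Bernoulli shortcut: E[Y] = (a - b)q + b and Var(Y) = (a - b)^2 q(1 - q)
mean_shortcut = (a - b) * q + b
var_shortcut = (a - b) ** 2 * q * (1 - q)

# Direct computation from the definitions of expectation and variance
mean_direct = a * q + b * (1 - q)
var_direct = (a - mean_direct) ** 2 * q + (b - mean_direct) ** 2 * (1 - q)

print(mean_shortcut, mean_direct)  # both approximately 2.9
print(var_shortcut, var_direct)    # both approximately 1.89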




Filed under Probability

Normal Approximation

If a random variable Y is normal with mean \mu_y and standard deviation \sigma_y, you can relate it to a standard normal random variable X (useful for finding probabilities in the standard normal table) by the following relationship:

Y = \mu_y + \sigma_y X, \qquad \text{or equivalently} \qquad X = \frac{Y - \mu_y}{\sigma_y}

Example 1:  Y is normal with E[Y] = 100 and Var(Y) = 49.  Then

\begin{array}{rl} P(Y \leq 111.515) &= P(X \leq \frac{111.515 - 100}{\sqrt{49}}) \\ &= P(X \leq 1.645) \\ &= 0.95 \end{array}

Example 2:  Y has the same distribution as in Example 1.  Then P(Y \leq y) = 0.9 implies

P(X \leq \frac{y - 100}{\sqrt{49}}) = 0.9

Which implies:

\frac{y - 100}{\sqrt{49}} = 1.2816

Hence y = 100 + 7(1.2816) = 108.9712.
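If you would rather check the table lookups in software, here is a small Python sketch using SciPy (the library choice is mine, not part of the original post):

from scipy.stats import norm

mu, sigma = 100, 49 ** 0.5  # E[Y] = 100, Var(Y) = 49, so sigma = 7

# Example 1: P(Y <= 111.515) = P(X <= 1.645)
print(norm.cdf((111.515 - mu) / sigma))  # approximately 0.95

# Example 2: the 90th percentile of Y
z = norm.ppf(0.90)      # approximately 1.2816
print(mu + sigma * z)   # approximately 108.97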

With regard to the Central Limit Theorem:

By the Central Limit Theorem, the distribution of a sum of n iid random variables, each with mean \mu and finite variance \sigma^2, approaches a normal distribution as n increases.  This means that if n is sufficiently large, we can approximate probabilities for the sum using a normal distribution with mean n\mu and variance n\sigma^2.
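As a rough illustration of the approximation (the exponential distribution and the sample sizes below are my own choices, not from the original post), here is a Python sketch comparing a simulated sum against its normal approximation:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 100  # number of iid terms in each sum

# Each term is exponential with mean 2 (so variance 4); by the CLT the sum
# of n such terms is approximately Normal(2n, 4n).
sums = rng.exponential(scale=2.0, size=(50000, n)).sum(axis=1)

threshold = 220.0
print((sums <= threshold).mean())                      # simulated P(S <= 220)
print(norm.cdf((threshold - 2 * n) / (4 * n) ** 0.5))  # normal approximation, about 0.84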



Filed under Probability