**Linearity of Expected Value:** Suppose $X$ and $Y$ are random variables and $a$ and $b$ are scalars. The following relationships hold:

$E[aX] = aE[X] \qquad E[X + Y] = E[X] + E[Y] \qquad E[aX + bY] = aE[X] + bE[Y]$

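Linearity is easy to see in a quick simulation. The sketch below uses made-up distributions and scalars (none of these numbers come from the post); note that linearity holds regardless of whether $X$ and $Y$ are independent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two random variables and two scalars (illustration values only).
X = rng.normal(loc=2.0, scale=1.0, size=n)   # E[X] = 2
Y = rng.exponential(scale=3.0, size=n)       # E[Y] = 3
a, b = 4.0, -2.0

lhs = np.mean(a * X + b * Y)                  # sample estimate of E[aX + bY]
rhs = a * np.mean(X) + b * np.mean(Y)         # a E[X] + b E[Y]
print(lhs, rhs)                               # both near a*2 + b*3 = 2
```

The two printed values agree to floating-point precision, because the sample mean is itself a linear operation.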
**Variance:**

Suppose $X_i$ for $i = 1, \dots, n$ are independent identically distributed (iid) random variables. Then for $E[X_i] = \mu$ and $\mathrm{Var}(X_i) = \sigma^2$:

$E\left[\sum_{i=1}^{n} X_i\right] = n\mu \qquad \mathrm{Var}\left(\sum_{i=1}^{n} X_i\right) = n\sigma^2$

**Example:**

$X_i$ is the stock price of AAPL at market close on day $i$. $S = X_1 + X_2 + X_3 + X_4 + X_5$ is the sum of closing AAPL stock prices for 5 days. Then

$\mathrm{Var}(S) = \mathrm{Var}(X_1) + \dots + \mathrm{Var}(X_5) = 5\sigma^2$.

Contrast this with the variance of $5X$. In other words, $5X$ is a random variable that takes a value of 5 times the price of AAPL at the close of any given day. Then

$\mathrm{Var}(5X) = 5^2\,\mathrm{Var}(X) = 25\sigma^2$

The distinction between $\sum_{i=1}^{5} X_i$ and $5X$ is subtle but very important.

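A simulation makes the distinction concrete. This sketch assumes a made-up mean closing price of 150 and daily standard deviation $\sigma = 2$ (normality here is just a convenient simulation choice; the variance identities do not require it):

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 500_000
sigma = 2.0  # assumed daily standard deviation (illustration value)

# S: sum of 5 independent daily closes -- one fresh draw per day.
daily = rng.normal(loc=150.0, scale=sigma, size=(n_trials, 5))
S = daily.sum(axis=1)

# 5X: five times a SINGLE day's close -- one draw, scaled.
five_X = 5 * rng.normal(loc=150.0, scale=sigma, size=n_trials)

print(S.var())       # ~ 5 * sigma**2  = 20
print(five_X.var())  # ~ 25 * sigma**2 = 100
```

Both variables have the same expected value, but scaling a single draw by 5 produces five times the variance of summing five independent draws.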
**Variance of a Sample Mean:**

In situations where the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ is a random variable over $n$ iid observations (e.g. the average price of AAPL over 5 days), the following formula applies:

$\mathrm{Var}(\bar{X}) = \frac{\mathrm{Var}(X)}{n} = \frac{\sigma^2}{n}$

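The same simulation setup checks this formula for averaging instead of summing (again with assumed illustration values of 150 for the mean price and $\sigma = 2$):

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, n = 500_000, 5
sigma = 2.0  # assumed daily standard deviation (illustration value)

# Each row is one 5-day window; each row mean is one draw of the sample mean.
closes = rng.normal(loc=150.0, scale=sigma, size=(n_trials, n))
xbar = closes.mean(axis=1)

print(xbar.var())  # ~ sigma**2 / n = 0.8
```

Averaging shrinks the variance by a factor of $n$, which is why the sample mean becomes a more stable estimate as more observations are collected.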

I understand that E(XY) = E(X)E(Y) only if X and Y are independent. Suppose you had 3 independent random variables X, Y, and Z and you wanted E[X(Y+Z)]. Does this obey a distributive law, i.e. does E[X(Y+Z)] = E(X)E(Y)+E(X)E(Z)?

If so, what if you wanted:

E{(aX+b)[(cY+d)+(mZ+n)]},

could you just multiply through like in regular algebra?

Yes, you can distribute inside the brackets to get E[XY + XZ]. Then by linearity, this is equal to E[XY] + E[XZ]. If X and Y are independent, then E[XY] = E[X]E[Y]. And similarly if X and Z are independent, E[XZ] = E[X]E[Z].
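This answer can be spot-checked numerically. The sketch below uses arbitrary made-up distributions for three mutually independent variables (expected values 2, 1, and 2 respectively, so the true answer is 2·1 + 2·2 = 6):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Three mutually independent random variables (illustration choices).
X = rng.exponential(scale=2.0, size=n)      # E[X] = 2
Y = rng.normal(loc=1.0, scale=1.0, size=n)  # E[Y] = 1
Z = rng.uniform(0.0, 4.0, size=n)           # E[Z] = 2

lhs = np.mean(X * (Y + Z))                               # E[X(Y + Z)]
rhs = np.mean(X) * np.mean(Y) + np.mean(X) * np.mean(Z)  # E[X]E[Y] + E[X]E[Z]
print(lhs, rhs)  # both near 2*1 + 2*2 = 6
```

Independence is what licenses replacing E[XY] with E[X]E[Y] after the distributive step; the distribution and linearity steps themselves hold for any random variables.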