# Tag Archives: Frequency

## Approximating Aggregate Losses

An aggregate loss $S$ is the sum of all losses in a certain period of time.  There are an unknown number $N$ of losses that may occur and each loss is an unknown amount $X$.  $N$ is called the frequency random variable and $X$ is called the severity.  This situation can be modeled using a compound distribution of $N$ and $X$.  The model is specified by:

$\displaystyle S = \sum_{n=1}^N X_n$

where $N$ is the random variable for frequency and the $X_n$'s are IID random variables for severity.  This type of structure is called a collective risk model.

An alternative way to model aggregate loss is to model each risk using a different distribution appropriate to that risk.  For example, in a portfolio of risks, one may be modeled using a Pareto distribution and another with an exponential distribution.  The expected aggregate loss would be the sum of the individual expected losses.  This is called an individual risk model and is given by:

$\displaystyle S = \sum_{i=1}^n X_i$

where $n$ is the number of individual risks in the portfolio and the $X_i$'s are random variables for the individual losses.  The $X_i$'s are NOT IID, and $n$ is known.

Both of these models are tested in the exam; however, the individual risk model is usually tested in combination with the collective risk model.  An example of a problem structure that combines the two is given below.

Example 1: Your company sells car insurance policies.  The in-force policies are categorized into high-risk and low-risk groups.  In the high-risk group, the number of claims in a year is Poisson with a mean of 30.  The number of claims for the low-risk group is Poisson with a mean of 10.  The amount of each claim is exponentially distributed with mean $\theta = 200$.
Analysis: Being able to see the structure of the problem is a very important first step in solving it.  In this situation, you would model the aggregate loss as an individual risk model.  There are two individual risks: high and low.  For each group, you would model the aggregate loss using a collective risk model.  For the high-risk group, the frequency is Poisson with mean 30 and the severity is exponential with mean 200.  For the low-risk group, the frequency is Poisson with mean 10 and the severity is exponential with the same mean.

For these problems, you will need to know how to:

1. Find the expected aggregate loss.
2. Find the variance of aggregate loss.
3. Approximate the probability that the aggregate loss will be above or below a certain amount using a normal distribution.
Example: what is the probability that aggregate losses are below $5,000?
4. Determine how many risks would need to be in a portfolio for the aggregate loss to stay below a given amount with a given level of certainty.
Example: how many policies should you underwrite so that the aggregate loss is less than the expected aggregate loss with a 95% degree of certainty?
5. Determine how long your risk exposure should last for the aggregate loss to stay below a given amount with a given level of certainty.

Problems that require you to determine probabilities for the aggregate loss will usually state that you should use a normal approximation.  This requires calculating the expected aggregate loss and the variance of the aggregate loss.

MEMORIZE
Expected aggregate loss for a collective risk model is given by:
$E[S] = E[N]E[X]$
For the individual risk model, it is
$\displaystyle E[S] = \sum_{i=1}^n E[X_i]$
Variance under the collective risk model is a conditional variance, conditioning on the frequency $N$:
$Var(S) = E[Var(S|N)] + Var(E[S|N])$
When frequency and severity are independent, the following shortcut, called the compound variance, is valid:
$Var(S) = E[N]Var(X) + Var(N)E[X]^2$
Variance under the individual risk model is additive:
$\displaystyle Var(S) = \sum_{i=1}^n Var(X_i)$

Example 2: Continuing from Example 1, calculate the mean and variance of the aggregate loss.  Assume frequency and severity are independent.
Answer: This is done by
1. Calculating the expected aggregate loss and variance in the high-risk group.
2. Calculating the expected aggregate loss and variance in the low-risk group.
3. Adding the expected values from both groups to get the total expected aggregate loss.
4. Adding the variances from both groups to get the total variance.
I will use subscripts $H$ and $L$ to denote the high- and low-risk groups respectively.

$E[S_H] = E[N_H]E[X_H] = 30\times 200 = 6,000$

$\begin{array}{rll} Var(S_H) &=& E[N_H]Var(X_H) + Var(N_H)E[X_H]^2 \\ &=& 30 \times 40,000 + 30 \times 200^2 \\ &=& 2,400,000 \end{array}$

$E[S_L] = E[N_L]E[X_L] = 10 \times 200 = 2,000$

$\begin{array}{rll} Var(S_L) &=& 10 \times 40,000 + 10 \times 200^2 \\ &=& 800,000 \end{array}$

Add the expected values to get $E[S] = 6,000 + 2,000 = 8,000$.

Add the variances to get $Var(S) = 2,400,000 + 800,000 = 3,200,000$.

Once the mean and variance of the aggregate loss have been calculated, you can use them to approximate probabilities for aggregate losses using a normal distribution.

Example 3: Continuing from Example 2, use a normal approximation for aggregate loss to calculate the probability that losses exceed $12,000.
Answer:  To solve this, you will need to calculate a $z$ value for the normal distribution using the expected value and variance found in Example 2.

$\begin{array}{rll} \Pr(S > 12,000) &=& 1- \Pr(S< 12,000) \\ \\ &=& \displaystyle 1-\Phi\left(\frac{12,000 - 8,000}{\sqrt{3,200,000}}\right) \\ \\ &=& 1 - \Phi(2.24) \\ \\ &=& 0.0125 \end{array}$
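As a sanity check, Examples 2 and 3 can be reproduced in a few lines of Python.  This is only a sketch using the severity moments from Example 2 ($E[X] = 200$, $Var(X) = 40{,}000$); the helper name is my own.

```python
from statistics import NormalDist

# Collective risk (compound Poisson) moments:
#   E[S] = lam * E[X],  Var(S) = lam * Var(X) + lam * E[X]**2
# since E[N] = Var(N) = lam for a Poisson frequency.
def compound_poisson_moments(lam, ex, varx):
    return lam * ex, lam * varx + lam * ex ** 2

mean_h, var_h = compound_poisson_moments(30, 200, 40_000)  # high-risk group
mean_l, var_l = compound_poisson_moments(10, 200, 40_000)  # low-risk group

# Individual risk model: the group means and variances simply add.
mean_s = mean_h + mean_l     # 8,000
var_s = var_h + var_l        # 3,200,000

# Normal approximation for Pr(S > 12,000).  Computing with the exact
# z = 2.236 gives about 0.0127; the table answer rounds z to 2.24
# and gets 0.0125.
p = 1 - NormalDist(mean_s, var_s ** 0.5).cdf(12_000)
print(mean_s, var_s, p)
```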

CONTINUITY CORRECTION
Suppose in the above examples the severity $X$ is discrete, for example Poisson.  Then the aggregate loss $S$ is also discrete, and we need to add 0.5 to 12,000 in the calculation of $\Pr(S > 12,000)$; that is, we would instead calculate $\Pr(S > 12,000.5)$.  This is called a continuity correction and applies whenever the severity random variable is discrete.  If we were interested in $\Pr(S<12,000)$, we would subtract 0.5 instead.  The correction has a greater effect when the domain of possible values is smaller.

Another type of problem I’ve encountered in the samples is constructed as follows:

Example 4: You drive a 1992 Honda Prelude Si piece-of-crap-mobile (no, that's my old car and you are driving it because I sold it to you to buy my Mercedes).  The failure rate per year is Poisson with mean 2.  The average cost of repair for each breakdown is $500 with a standard deviation of $1,000.  How many years do you have to continue driving the car so that the probability of the total maintenance cost exceeding 120% of the expected total maintenance cost is less than 10%?  (Assume the car is so crappy that it cannot deteriorate any further, so the failure rates and average repair costs remain constant every year.)

$E[S_1] = 1,000$

$\begin{array}{rll} Var(S_1) &=& 2 \times 1,000^2 + 2 \times 500^2 \\ &=& 2,500,000 \end{array}$

For $n$ years, we have

$E[S] = 1,000n$

$Var(S) = 2,500,000n$

According to the problem, we are interested in $S$ such that $\Pr(S > 1,200n) = 0.1$.  Under normal approximation, this implies

$\begin{array}{rll} \Pr(S>1,200n) &=& 1-\Pr(S<1,200n) \\ \\ &=& \displaystyle 1- \Phi\left(\frac{1,200n - 1,000n}{\sqrt{2,500,000n}}\right) \end{array}$

Which implies

$\displaystyle \Phi\left(\frac{200n}{\sqrt{2,500,000n}}\right) = 0.9$

The probability $0.9$ corresponds to a $z$ value of 1.28.  This implies

$\displaystyle \frac{200n}{\sqrt{2,500,000n}} = 1.28$

Solving for $n$ we have $n \approx 102.4$, so you would have to keep driving the car for 103 years.  LOL!
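The same solve can be scripted; a sketch under the assumptions of Example 4.  Using the exact 90th percentile $z \approx 1.2816$ instead of the table value 1.28 gives $n \approx 102.7$ rather than 102.4, but either way you round up to 103 years.

```python
from math import ceil
from statistics import NormalDist

# Per-year compound Poisson moments from Example 4:
# lam = 2 breakdowns/year, E[X] = 500, Var(X) = 1,000**2.
lam, ex, varx = 2, 500, 1_000_000
mean1 = lam * ex                      # E[S_1] = 1,000
var1 = lam * varx + lam * ex ** 2     # Var(S_1) = 2,500,000

# Over n years the moments scale linearly, so the requirement
#   Pr(S > 1.2 * mean1 * n) <= 0.10
# reduces to 0.2 * mean1 * sqrt(n) / sqrt(var1) >= z.
z = NormalDist().inv_cdf(0.90)        # ~1.2816 (tables round to 1.28)
n = (z * var1 ** 0.5 / (0.2 * mean1)) ** 2
print(n, ceil(n))
```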

## Frequency with Respect to Exposure and Coverage Modifications

In a portfolio of risks, there are two types of modifications which can influence the frequency distribution of payments.

1. Exposure Modification (not in syllabus) — increasing or decreasing the number of risks or time periods of coverage in the portfolio
2. Coverage Modification — applying limits, deductible or adjusting for inflation in each individual risk
EXPOSURE MODIFICATION
If there is an exposure modification, you would adjust the frequency distribution by scaling the appropriate parameter to reflect the change in exposure.  The following list provides the appropriate parameter to adjust for each distribution:
1. Poisson:  $\lambda$
2. Negative Binomial:  $r$
(Geometric is a Negative Binomial with $r = 1$)
3. Binomial:  $m$
(Only valid if the new value remains an integer)
Example 1:  You own a portfolio of 10 risks.  You model the frequency of claims with a negative binomial having parameters $r = 2$ and $\beta = 0.5$.  The number of risks in your portfolio increases to 15.  What are the parameters for the new distribution?
Answer:  The frequency distribution now has parameters $r = 3$ and $\beta = 0.5$.  Note that since the mean and variance are $r\beta$ and $r\beta(1+\beta)$ respectively, the new mean and variance are both multiplied by 1.5.
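The scaling in Example 1 can be verified directly from the negative binomial moment formulas; a minimal sketch:

```python
# Exposure modification for a negative binomial: multiplying r by the
# exposure ratio scales both the mean r*beta and the variance
# r*beta*(1+beta) by that same ratio.
r_old, beta = 2, 0.5
r_new = r_old * 15 / 10               # exposure grows from 10 to 15 risks
mean_old, var_old = r_old * beta, r_old * beta * (1 + beta)
mean_new, var_new = r_new * beta, r_new * beta * (1 + beta)
print(r_new, mean_new, var_new)       # 3.0 1.5 2.25
```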
COVERAGE MODIFICATION
Coverage modifications shift, censor, or scale the individual risks, usually in the presence of a deductible or claim limit, and they change the conditions that trigger a payment.  For example, adding a deductible $d$ is a coverage modification: losses below the deductible no longer qualify for payment, so if the risk is represented by random variable $X$, the payment random variable becomes $(X-d)_+$.  Because these modifications change the probability that a loss produces a payment, they also affect the frequency distribution.  The following list gives the parameter affected by a coverage modification for each distribution:
1. Poisson:  $\lambda$
2. Negative Binomial:  $\beta$
3. Binomial:  $q$
4. Geometric:  $\beta$
These parameters are scaled by the probability that a payment occurs.
Example 2:  The frequency of loss is modeled as a Poisson distribution with parameter $\lambda = 5$.  A deductible is imposed so that only 80% of losses result in payments.  What is the new distribution?
Answer:  It is Poisson with $\lambda = 0.8(5) = 4$.
Example 3:  The frequency of payment $N$ is modeled as a negative binomial with parameters $r = 3$ and $\beta = 0.5$.  Losses $X$ are Pareto distributed with parameters $\alpha = 2$ and $\theta = 100$.  The deductible is changed from $d=10$ to $d=15$.  What are the new parameters of the frequency distribution?
Answer:  First, note that $N$ is the frequency of payment, so it already reflects the current deductible of 10.  If you wanted the distribution of $N$ with no deductible, you would divide $\beta$ by $\Pr(X>10)$.  To find the distribution of $N$ with the deductible of 15, you then multiply $\beta$ by $\Pr(X>15)$.  To summarize:
$\begin{array}{rll} \beta_{new} &=& \displaystyle \beta_{old}\times\frac{\Pr(X>15)}{\Pr(X>10)} \\ \\ &=& \displaystyle 0.5\times \frac{0.75614367}{0.82644628} \\ \\ &=& 0.45746692 \end{array}$
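The rescaling is easy to check numerically; a sketch using the Pareto survival function $S(x) = \left(\theta/(\theta+x)\right)^\alpha$ (the helper name is my own):

```python
# Survival function of a two-parameter Pareto: Pr(X > x).
def pareto_sf(x, alpha, theta):
    return (theta / (theta + x)) ** alpha

alpha, theta = 2, 100
beta_old = 0.5
# Raising the deductible from 10 to 15 rescales beta by the ratio of
# payment probabilities Pr(X > 15) / Pr(X > 10).
beta_new = beta_old * pareto_sf(15, alpha, theta) / pareto_sf(10, alpha, theta)
print(round(beta_new, 8))             # ~0.45746692
```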

## Frequency Models

Frequency models count the number of times an event occurs.

1. The number of customers to arrive each hour.
2. The number of coins lucky Tom finds on his way home from school.
3. How many scientists a Tyrannosaur eats on a certain day.
4. Etc.
This is in contrast to a severity model which measures the magnitude of an event.
1. How much a customer spends.
2. The value of a coin that lucky Tom finds.
3. The number of calories each scientist provides.
4. Etc.
The following distributions are used to model event frequency.  For notation, $p_n$ means $\Pr(N=n)$.

## Poisson:

$\begin{array}{lr}\displaystyle p_n = e^{-\lambda} \frac{\lambda^n}{n!} & \lambda > 0 \end{array}$
Properties:
1. Parameter is $\lambda$.
2. Mean is $\lambda$.
3. Variance is $\lambda$.
4. If $N_1, N_2, ..., N_i$ are Poisson with parameters $\lambda_1, \lambda_2, ..., \lambda_i$, then $N = N_1 + N_2 + ... + N_i$ is Poisson with parameter $\lambda = \lambda_1 + \lambda_2 + ... + \lambda_i$.

## Negative Binomial:

$\begin{array}{lr} \displaystyle p_n = {{n+r-1}\choose{n}}\left(\frac{1}{1+\beta}\right)^r\left(\frac{\beta}{1+\beta}\right)^n & \beta>0, r>0 \end{array}$
Properties:
1. Parameters are $r$ and $\beta$.
2. Mean is $r\beta$.
3. Variance is $r\beta\left(1+\beta\right)$.
4. Variance is always greater than the mean.
5. Is equal to a Geometric distribution when $r=1$.
6. If $N_1, N_2, ..., N_i$ are negative binomial with parameters $\beta_1 = \beta_2 = ... = \beta_i$ and $r_1, r_2, ..., r_i$, then the sum $N = N_1 + N_2 + ... + N_i$ is negative binomial and has parameters $\beta = \beta_1$ and $r = r_1+r_2+...+r_i$.  Note: the $\beta$'s must be the same.

## Geometric:

$\begin{array}{lr} \displaystyle p_n = \frac{\beta^n}{\left(1+\beta\right)^{n+1}} & \beta>0 \end{array}$
Properties:
1. Parameter is $\beta$.
2. Mean is $\beta$.
3. Variance is $\beta\left(1+\beta\right)$.
4. If $N_1, N_2, ..., N_i$ are geometric with parameter $\beta$, then the sum $N = N_1+N_2+...+N_i$ is negative binomial with parameters $\beta$ and $r = i$.

## Binomial:

$\displaystyle p_n = {{m} \choose {n}}q^n\left(1-q\right)^{m-n}$
where $m$ is a positive integer and $0 < q < 1$.
Properties:
1. Parameters are $m$ and $q$.
2. Mean is $mq$.
3. Variance is $mq\left(1-q\right)$.
4. Variance is always less than mean.
5. If $N_1, N_2, ..., N_i$ is binomial with parameters $q$ and $m_1, m_2, ..., m_i$, then the sum $N=N_1+N_2+...+N_i$ is binomial with parameters $q$ and $m = m_1+m_2+...+m_i$.

## The (a,b,0) recursion:

These distributions can be reparameterized into a recursive formula with parameters $a$ and $b$.  When reparameterized, they all have the same recursive format.
$\displaystyle p_k = \left(a+ \frac{b}{k}\right)p_{k-1}$
It is more common to write
$\displaystyle \frac{p_k}{p_{k-1}} = a+\frac{b}{k}$
The parameters $a$ and $b$ are different for each distribution.
1. Poisson:
$a = 0$ and $b =\lambda$.
2. Negative Binomial:
$\displaystyle a = \frac{\beta}{1+\beta}$ and $\displaystyle b = \left(r-1\right)\frac{\beta}{1+\beta}$.
3. Geometric:
$\displaystyle a = \frac{\beta}{1+\beta}$ and $\displaystyle b = 0$.
4. Binomial:
$\displaystyle a = -\frac{q}{1-q}$ and $\displaystyle b = \left(m+1\right)\frac{q}{1-q}$.
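The table of $(a, b)$ values is easy to verify numerically; the sketch below checks the recursion $p_k = \left(a + b/k\right)p_{k-1}$ against closed-form Poisson and binomial probabilities (the parameter choices $\lambda = 3$, $m = 5$, $q = 0.3$ are arbitrary).

```python
from math import exp, factorial, comb

# Poisson(lam): a = 0, b = lam.
lam = 3.0
a, b = 0.0, lam
p = exp(-lam)                          # p_0 = e^{-lam}
for k in range(1, 8):
    p = (a + b / k) * p
    assert abs(p - exp(-lam) * lam ** k / factorial(k)) < 1e-12

# Binomial(m, q): a = -q/(1-q), b = (m+1)q/(1-q).
m, q = 5, 0.3
a = -q / (1 - q)
b = (m + 1) * q / (1 - q)
p = (1 - q) ** m                       # p_0 = (1-q)^m
for k in range(1, m + 1):
    p = (a + b / k) * p
    assert abs(p - comb(m, k) * q ** k * (1 - q) ** (m - k)) < 1e-12
print("recursion matches the closed forms")
```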
Pop Quiz!
1. A frequency distribution has $a = 0.8$ and $b = 1.2$.  What distribution is this?
Answer: Negative binomial, because $a$ and $b$ are both positive.
2. A frequency distribution has mean 1 and variance 0.5.  What distribution is this?
Answer: Binomial, because the variance is less than the mean.