Friedland11.FreqSev
Reading: Friedland, J.F., Estimating Unpaid Claims Using Basic Techniques, Casualty Actuarial Society, Third Version, July 2010. The Appendices are excluded.
Chapter 11: Frequency-Severity Methods
Pop Quiz
If the reported CDF24→Ult = 2.50 then what is the %unreported at 24 months? Click for Answer
Study Tips
VIDEO: F-11 (001) Frequency-Severity Methods → 5:00 Forum
There are 3 frequency-severity methods in this chapter and they are more complicated than methods from previous chapters. This is the top-ranked chapter from the reserving material so learn it well. You can follow the same study strategy as in previous chapters but it will probably take a little longer.
- Study the examples. Do the BattleActs practice problems. Memorize the concepts. Work the old exam problems.
Estimated study time: 1 week (not including subsequent review time)
BattleTable
Based on past exams, the main things you need to know (in rough order of importance) are:
- counts x severities - calculate ultimate/unpaid amounts, adjustments for seasonality
- disposal rate method - calculate ultimate/unpaid amounts with trending and adjustments for court decisions
| reference | part (a) | part (b) | part (c) | part (d) |
|---|---|---|---|---|
| E (2019.Fall #20) | disposal rate method: calc unpaid | Freq/Sev vs Paid Devlpt: why Freq/Sev better | | |
| E (2019.Spring #15) | counts x severities: calculate unpaid | identify scenario: rptd claims down | | |
| E (2019.Spring #20) | tail severity: provide an estimate 1 | tail severity: selection of maturity age | | |
| (2018.Spring #18) | BattleActs PowerPack | | | |
| E (2017.Fall #18) | counts x severities: calc ultimate (seasonality) | trend in counts: explain | seasonality: suggest diagnostic | |
| E (2017.Fall #20) | ultimate frequency: select & justify | ultimate severity: select & justify | ultimate: B-F & Freq-Sev methods | |
| E (2017.Fall #23) | tail severity: provide an estimate | tail severity: selection of maturity age | | |
| E (2017.Spring #16) | IBNR: Freq-Sev method | ads/disads: Freq-Sev method | | |
| E (2017.Spring #17) | counts x severities: calc ultimate (law change) | | | |
| E (2016.Fall #20) | disposal rates - good: Freq-Sev method | disposal rates - not good: Freq-Sev method | adjustments: to Freq-Sev method | |
| E (2016.Fall #22) | settlement rate change: provide evidence | counts x severities: calc ultimate | | |
| E (2016.Spring #19) | does Freq-Sev work?: long-tailed lines | does Freq-Sev work?: many reopened claims | does Freq-Sev work?: spike in high-severity clms | does Freq-Sev work?: change in case reserving |
| E (2015.Fall #20) | does Freq-Sev work?: general liability | does Freq-Sev work?: reduced deductibles | unpaid claim estimate: suggest improvement | |
| E (2015.Spring #20) | ultimate: use several methods | do methods work: discuss | | |
| E (2015.Spring #21) | disposal rate method: ultimate | disposal rate method: adjust for severity spike | | |
| E (2014.Fall #16) | estimate of counts: assess reasonableness | | | |
| E (2014.Fall #17) | disposal rate method: unpaid | legislative reform: assess impact | | |
| E (2014.Spring #16) | ultimate counts: reins. attachment points | | | |
| E (2013.Fall #16) | counts x severities: calc IBNR | | | |
| E (2013.Spring #19) | IBNR: Freq-Sev method | | | |

1 For a detailed explanation of the solution to this problem, see the forum discussion for (2019.Spring #20a).
Full BattleQuiz You must be logged in or this will not work.
In Plain English!
Example A: Intro to FS Methods
Alice likes to use "FS" as an abbreviation for Frequency-Severity so we'll follow her lead. The concept behind FS methods is super-simple. Recall:
- frequency = counts / exposures
- severity = losses / counts
I'm going to use the term "losses" in this chapter, not "claims". (The reason is that it's too easy to get "claims" confused with "counts".) So, we'll refer to losses and counts (not claims and counts). Recall also that in practice paid counts usually mean the same thing as closed counts. That ignores the possibility of partial payments, where a claim could have payments but not technically be closed, but let's agree for our purposes here that paid & closed mean the same thing. (The assumption of no partial payments is touched upon in sample answer 2 of E (2016.Fall #20) and discussed further in this forum post.)
The simplest possible example of the FS method is this:
- ultimate counts for AY 2025 = 100
- ultimate severity for AY 2025 = $2,000
Then,
- ultimate loss for AY 2025 = 100 x $2,000 = $200,000.
This works because severity = losses / counts, so multiplying counts by severity cancels the counts and leaves losses. Some people ask Alice why this isn't called a "count-severity" method instead of a "frequency-severity" method. Well, it's just one of those mysteries of the universe.
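If you like seeing arithmetic as code, here's a minimal Python sketch of Example A. The numbers are just the ones from the example above:

```python
# Minimal sketch of the core FS idea, using the Example A numbers.
ultimate_counts = 100          # ultimate counts for AY 2025
ultimate_severity = 2_000.0    # ultimate severity (average $ per count)

# counts x (losses / counts) = losses, so the counts cancel out
ultimate_loss = ultimate_counts * ultimate_severity
print(ultimate_loss)           # 200000.0
```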
Example B: Method #1 - Developing Counts and Severities
Let's build on the idea from the previous section. Here, instead of being given the ultimate count and ultimate severity directly, let's go back one step and start with the reported count and reported loss triangles. Here are the steps in calculating the ultimate, but you could probably guess them without being told. It's pretty much just common sense.
- develop counts to ultimate
- calculate the severity triangle as losses/counts
- develop severities to ultimate
- calculate the ultimate loss as (ultimate count) x (ultimate severity)
That's all there is to it. There's really nothing new - it's just the development method applied to triangles of counts and severities. Note that you can perform this method just as well with paid count and paid loss triangles instead. Recall that if settlement rates have changed then it's better to use reported data because reported data is not affected by changes in settlement rates. Conversely, if there has been a change in case strength/adequacy then it's better to use paid data because paid data is not affected by changes in case strength.
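Here's a minimal sketch of Method #1 in Python. The triangles are invented for illustration (they are not from the source text or any exam problem), volume-weighted LDFs are used for both the count and severity development, and a tail factor of 1.0 is assumed:

```python
# Sketch of Method #1: develop counts and severities separately, then multiply.
# Reported triangles with ages 12/24/36; each row is an AY, shorter rows are less mature.
rep_counts = [
    [100, 120, 125],            # oldest AY
    [110, 132],
    [105],                      # most recent AY
]
rep_losses = [
    [200_000, 260_000, 275_000],
    [220_000, 290_000],
    [230_000],
]

def vw_ldfs(tri):
    """Volume-weighted age-to-age factors from a cumulative triangle."""
    n_ages = max(len(row) for row in tri)
    factors = []
    for j in range(n_ages - 1):
        rows = [row for row in tri if len(row) > j + 1]
        factors.append(sum(row[j + 1] for row in rows) / sum(row[j] for row in rows))
    return factors

def cdfs(ldfs, tail=1.0):
    """Age-to-ultimate factors, oldest age first, assuming a tail factor of 1.0."""
    out, cum = [], tail
    for f in reversed(ldfs):
        cum *= f
        out.append(cum)
    return list(reversed(out)) + [tail]

# severity triangle = losses / counts, cell by cell
sev = [[l / c for l, c in zip(lrow, crow)] for lrow, crow in zip(rep_losses, rep_counts)]

# develop counts and severities to ultimate for the most recent AY
count_cdf = cdfs(vw_ldfs(rep_counts))
sev_cdf = cdfs(vw_ldfs(sev))
ay = len(rep_counts) - 1                    # most recent AY
age = len(rep_counts[ay]) - 1               # its current age index
ult_counts = rep_counts[ay][age] * count_cdf[age]
ult_severity = sev[ay][age] * sev_cdf[age]

# ultimate loss = (ultimate counts) x (ultimate severity)
print(round(ult_counts * ult_severity))
```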
Here's an exam problem based on this idea along with Alice's solution and a couple of practice problems.
- E (2019.Spring #15)
Since Alice has been through all of this before, she has a couple of pro tips for you:
Alice's Pro Tip for Maximizing Your Exam Score: For reserving questions, pay attention to whether they ask for ultimate or IBNR or unpaid amounts.
Comment: If the question asks for the unpaid amount but you stop after calculating the ultimate, you would likely lose 0.25 pts. This is obviously an unnecessary loss of points but under the pressure of the exam it's very easy to miss details like this. Train yourself to take 5 seconds and check that you gave them what they asked for.
Alice's Pro Tip for Optimal Time Management: You are usually asked about 1 specific AY. Save time by only doing calculations needed for that AY.
Comment: This pro tip is a little trickier to apply but can save you oodles of time. In real reserving work, you would calculate the ultimate loss for all years but on the exam they usually only ask for 1 year, often the most recent year. If you are observant you'll notice there are calculations you can skip because they pertain only to years you're not asked about.
Keeping Alice's tips in mind, see how you do with this one:
- E (2013.Fall #16)
I'm going to now ask you to look at another problem that uses the counts x severities idea, but there's a twist because the data triangle is organized by AHYs, or Accident Half-Years, and you have to take seasonality into account. So instead of rows in the triangle representing the 12-month period January-December, they now represent the 6-month periods January-June and July-December. The development method doesn't specifically treat AHY triangles any differently from AY triangles, but sometimes the January-June rows develop differently from the July-December rows. And we know from earlier discussions on the development method that changing development patterns can affect the accuracy of the results.
The source text has a very detailed discussion of seasonality, but I think this exam problem demonstrates the idea more clearly. The seasonality twist is this:
- After you calculate the LDF triangles for counts and severities, it's very obvious that the January-June AHYs develop differently from the July-December AHYs. That means you have to make 2 LDF selections for each development period: one based on January-June LDFs, the other based on July-December LDFs. You then get 2 corresponding sets of CDFs. If you don't do that then one half-year will get LDFs that are too high and the other will get LDFs that are too low.
It is not a hard problem but is very good for demonstrating seasonality. The second problem below also uses seasonality.
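To make the seasonality idea concrete, here's a tiny Python sketch with invented LDFs (not taken from the exam problem). The only change from the usual selection process is that January-June and July-December rows get their own selections:

```python
# Tiny sketch of seasonal LDF selections for an AHY triangle (illustrative numbers).
# Each entry is an observed 6-to-12 month LDF for one accident half-year row.
observed_6_12 = {
    "2022-H1": 1.60, "2022-H2": 1.10,
    "2023-H1": 1.58, "2023-H2": 1.12,
}

# Select separately for H1 (Jan-Jun) and H2 (Jul-Dec) rows instead of averaging them all.
h1_ldfs = [f for ahy, f in observed_6_12.items() if ahy.endswith("H1")]
h2_ldfs = [f for ahy, f in observed_6_12.items() if ahy.endswith("H2")]

selected = {"H1": sum(h1_ldfs) / len(h1_ldfs),   # apply to Jan-Jun rows
            "H2": sum(h2_ldfs) / len(h2_ldfs)}   # apply to Jul-Dec rows
print(selected)   # {'H1': 1.59, 'H2': 1.11}
```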
This next problem requires an adjustment for a new law that will reduce claims by 25%, but that's something we've covered in earlier chapters. You just have to make sure you restate older data to take into account the benefit change.
- E (2017.Spring #17)
And last one for now, promise. Here, you have to decide whether to develop the paid count or reported count triangle to get the ultimate counts. In practice, you might develop both triangles to ultimate then make a selection based on a weighted average of the results, but they don't want you to do that here. There is a good reason to use only one of the paid or reported count triangles and not the other.
- E (2016.Fall #22)
These exam problems are also included in the quiz.
mini BattleQuiz 1 You must be logged in or this will not work.
Example C: Method #2 - Using Exposures and Inflation Rates
Of the 3 frequency-severity methods, the one we discuss in this section is the only one that actually uses frequency. The method from the previous section, as well as the method from the next section, uses counts, not frequencies. Just skim the next few paragraphs to get a general sense for the method, then look at the exam problems and Alice's solutions.
Anyway, here's the key reason for this version of the frequency-severity method:
Reason for Frequency-Severity Method #2: CDFs (Cumulative Development Factors) for more recent AYs can be highly leveraged.
Ok, but why is that a problem? Well, think back to Frequency-Severity Method #1 from the previous section. We used the development technique to develop counts and severities separately. We then multiplied them to get total loss dollars. But if the LDFs, and hence CDFs, for the most recent year or two are highly leveraged then the results may not be accurate. For method #2, we'll instead rely more heavily on older years which are more developed. The development method should be more accurate for these older years so starting with those older years (and then trending forward) should be more reliable than directly performing calculations on the more recent years.
Key Concept for Frequency-Severity Method #2: Estimate older AYs using the development method then trend the results forward to the more recent AYs.
Doing this avoids using CDFs for the most recent years, which as we mentioned could be highly leveraged. Of course, that means we've now traded reliance on development factors for reliance on selected trends, but at least it gives us a choice. If we have more trust in our trends than in our CDFs then this frequency-severity method should work better. The 3 relevant trend rates are:
- exposure trend
- count trend
- severity trend
The new element here is exposures. Recall the frequency and severity formulas from earlier:
- frequency = counts / exposures
- severity = losses / counts
There are many different types of exposures bases available but here are 2 common examples:
- payroll:
- → used in WC (Workers' Compensation)
- → frequency = counts / (payroll dollars)
- earned premium:
- → used in PPA (Private Passenger Auto)
- → frequency = counts / (on-level earned premium)
Calculating trended frequency involves trending both the numerator and denominator separately. (Note that the source text refers to the 3 trends as frequency trend, exposure trend, and severity trend, but in most of the exam problems related to this method, you're given a count trend rather than a frequency trend.)
Note on trending exposures: Sometimes exposures need to be trended and sometimes they don't. This is explained in Pricing - Chapter 4 - Exposure Trends
Anyway, here are the basic steps for calculating the ultimate with frequency-severity method #2. (Values are trended to the same year, usually the most recent year, but not always.) A small numerical sketch follows the steps.
Step 1
- trend counts for each year
- trend exposure for each year
- calculate trended frequency for each year
- = (trended counts) / (trended exposures)
- select a single trended frequency value
Step 2
- trend severity for each year
- select a single trended severity value
Step 3
- calculate ultimate
- = (trended exposure) x (selected frequency) x (selected severity)
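Here's a minimal numerical sketch of those three steps in Python. Every number (exposures, counts, severities, trend rates) is invented for illustration; only the mechanics follow the steps above:

```python
# Sketch of Method #2: trend mature years forward, select, then multiply out.
exposures = {2022: 50_000, 2023: 52_000, 2024: 55_000}   # e.g. payroll ($000s)
ult_counts = {2022: 400, 2023: 410}                      # developed ultimates, mature AYs only
ult_severity = {2022: 5_000, 2023: 5_200}

count_trend, exposure_trend, severity_trend = 0.02, 0.03, 0.05
target_year = 2024                                       # trend everything to this AY

def trend(value, rate, from_year, to_year=target_year):
    return value * (1 + rate) ** (to_year - from_year)

# Step 1: trended frequency for each mature year, then a selection (simple average)
trended_freq = {yr: trend(ult_counts[yr], count_trend, yr) /
                    trend(exposures[yr], exposure_trend, yr)
                for yr in ult_counts}
selected_freq = sum(trended_freq.values()) / len(trended_freq)

# Step 2: trended severity for each mature year, then a selection
trended_sev = {yr: trend(ult_severity[yr], severity_trend, yr) for yr in ult_severity}
selected_sev = sum(trended_sev.values()) / len(trended_sev)

# Step 3: ultimate = (target-year exposure) x (selected frequency) x (selected severity)
print(round(exposures[target_year] * selected_freq * selected_sev))
```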
Here's an exam problem that demonstrates the method. This problem uses payroll as the exposure base:
- E (2017.Fall #20)
And here's a minor variation on the method that uses earned premium as the exposure base. Trending is not done to the most recent year for this problem. You have to trend back one year to the next-most-recent year.
- E (2017.Spring #16)
And here are a few more similar types of frequency-severity method problems:
- E (2015.Spring #20)
- E (2013.Spring #19)
These exam problems are also included in the quiz.
mini BattleQuiz 2 You must be logged in or this will not work.
Example D: Method #3 - Disposal Rate Method
The disposal rate method has a lot of steps. You just have to keep practicing it. I've provided a set of practice problems for you further down in PDF format. You can also get Excel versions of the disposal rate method in the BattleActs PowerPack. I think it helps to first solve them with pencil and paper using the PDF version even though this might not be how you'll have to do it on the exam. Once you understand it well, you can practice doing it in Excel. I suggest first printing out my solution and trying to follow it with the notes I've provided below. Then you can try the practice problems at the end of this section.
Here's the problem and solution from the examiner's report:
- E (2019.Fall #20)
And here's a more detailed solution than in the examiner's report. (There are also some practice problems further down.)
First, I want to define some abbreviations. That will make it easier to grasp and remember how the method works:
- CPC = Cumulative Paid Counts
- IPC = Incremental Paid Counts
- UC = Ultimate Counts
- CPL = Cumulative Paid Loss
- IPL = Incremental Paid Loss
- CDR = Claims Disposal Rate
- IPS = Incremental Paid Severities
For reference, here are the steps as I've laid them out in my solution. (A small numerical sketch of Steps 2 and 3 follows after the step descriptions.)
Step 1 is easy because you're just setting things up. Note that you're often given either the cumulative triangles or incremental triangles and then have to calculate the other.
- Step 1a: calculate CPC & IPC triangles
- Step 1b: calculate CPL & IPL triangles
- Step 1c: calculate UC values (by developing CPC triangle to ultimate) *
- * Step 1c may not be necessary if the question provides ultimate counts. In the exam problem solved below, they do in fact provide this and that saves you a lot of time.
Step 2 is why this is called the disposal rate method. We use the disposal rate concept to project counts. The mechanics of this step are laid out in my solution. Calculating the CDR in step 2a is easy but projecting IPC (Incremental Paid Counts) in step 2b is messy. This is the most likely place for you to make a mistake. Practice it!
- Step 2a: calculate the CDR (Claims Disposal Rate) triangle, then make selections for each development period
- → CDR = CPC / UC
- → selected CDR = judgmental selection for each column
- Step 2b: project IPC (to the lower right portion of the incremental count triangle)
- → The projections are done by distributing the remaining counts proportionately using the disposal rate you selected in the previous step.
Step 3 is the final step where we project severities. You can see now that the pattern for this method is the same as for other versions of the frequency-severity method. It all boils down to multiplying counts by severities.
- Step 3a: calculate the trended IPS triangle (Incremental Paid Severities) then make a trended severity selection for each column (you have to trend to the year the question asked about - usually it's the most recent year)
- → trended IPS = IPL / IPC x (1 + trend)^(trend year - data year)
- → selected severity = judgmental selection for each column
- Step 3b: calculate final unpaid amounts for each development period
- → unpaid = (selected counts from Step 2b) x (selected severities from Step 3a)
Note that this problem has an extra step that is not part of the disposal rate method: you had to adjust the unpaid amounts to take into account a court ruling that's expected to increase future claims payments by 20%. In this problem that adjustment was easy.
This particular exam problem also has no severity trend, which makes it quicker to solve. If you had been asked to incorporate a severity trend, you would likely have been given the trend percentage (as I've done in the practice problems). A harder version is having you estimate the trend percentage yourself. You would have to do something like examine year-over-year changes down the columns of the IPS triangle and make a judgmental selection for the trend. This sort of trend estimate is discussed more in Chapter 13 - Berquist-Sherman Method.
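And here is the minimal Python sketch of Steps 2 and 3 mentioned above. The triangles, ultimate counts, and selections (straight averages, no severity trend) are all invented for illustration; the point is just the flow from CDR selections to projected counts to unpaid dollars:

```python
# Sketch of the disposal rate method, Steps 2-3, with made-up numbers.
ages = [12, 24, 36]
CPC = [[40, 70, 80],                 # cumulative paid (closed) counts, oldest AY first
       [45, 72],
       [42]]
CPL = [[80_000, 150_000, 176_000],   # cumulative paid losses
       [95_000, 160_000],
       [92_000]]
UC = [80, 90, 85]                    # ultimate counts (given, as in the exam problem)
target_ay = len(CPC) - 1             # most recent AY

# Step 2a: CDR triangle (CPC / UC) and a selection per age (straight average here)
CDR = [[c / UC[i] for c in row] for i, row in enumerate(CPC)]
sel_CDR = [sum(row[j] for row in CDR if len(row) > j) /
           len([row for row in CDR if len(row) > j]) for j in range(len(ages))]
sel_CDR[-1] = 1.0                    # assume all counts are closed by the final age

# Step 2b: project future incremental paid counts for the target AY by distributing
# its remaining open counts in proportion to the selected disposal rates
cur_age = len(CPC[target_ay]) - 1
remaining = UC[target_ay] - CPC[target_ay][cur_age]
future_IPC = {j: (sel_CDR[j] - sel_CDR[j - 1]) / (1.0 - sel_CDR[cur_age]) * remaining
              for j in range(cur_age + 1, len(ages))}

# Step 3a: incremental paid severities (IPL / IPC), one untrended selection per age
IPC = [[row[0]] + [row[j] - row[j - 1] for j in range(1, len(row))] for row in CPC]
IPL = [[row[0]] + [row[j] - row[j - 1] for j in range(1, len(row))] for row in CPL]
IPS = [[l / c for l, c in zip(lrow, crow)] for lrow, crow in zip(IPL, IPC)]
sel_IPS = [sum(row[j] for row in IPS if len(row) > j) /
           len([row for row in IPS if len(row) > j]) for j in range(len(ages))]

# Step 3b: unpaid = sum of (projected counts) x (selected severity) over future ages
unpaid = sum(cnt * sel_IPS[j] for j, cnt in future_IPC.items())
print(round(unpaid))
```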
Here are 2 practice problems on the disposal rate method.
Here's another old exam problem on the disposal rate method. The basic method is the same as above but in this problem they ask you to calculate the ultimate loss for all AYs, not just the most recent. That means you have to complete the whole lower right triangle for IPC and also detrend your IPS selections back to prior years. There's also another "trick" in how they want you to make your disposal rate and severity selections, which you can see from the examiner's report after you've attempted the problem yourself.
This last disposal rate problem is more challenging for a couple of reasons. A key piece of information is that the disposal and inflation rates observed in the most recent year will continue. That means you don't have to make disposal rate or severity selections - you just use the latest diagonal from the CDR and IPS triangles respectively. The second challenging aspect of this problem is that you have to come up with the severity trend yourself. Since they tell you inflation continues as in the most recent year, you have to calculate the year-over-year changes down the columns of the IPS triangle to see what the inflation rate is. (Note that here you first calculate the untrended IPS triangle for all ages to determine the appropriate severity trend. Then you have to recalculate the IPS triangle with trend applied.)
These exam problems are also included in the quiz.
mini BattleQuiz 3 You must be logged in or this will not work.
FS Method Concepts
As always, we need to know the assumptions underlying the method. If the assumptions aren't satisfied then the method may give inaccurate results.
FS Methods Key Assumptions: Development is stable, Count definition is consistent, Types of claims are homogeneous
- Development is stable means the counts and severities will have stable development triangles. This means the LDFs within columns are roughly similar - no increasing/decreasing trends, no outliers.
- Count definition is consistent means that counts are tallied the same way over time, whether by # of claimants or by # of claims. For example, if car accident A had 1 person injured and car accident B had 3 people injured, then the # of claimants is 4 but the # of claims (# of accidents) is only 2. Obviously, if an insurer changed the definition, this could have a big impact on the count triangle.
- Types of claims are homogeneous means, for example, that all claims are liability claims, or all claims are physical damage claims, or the mix between these types of claims is consistent. (Remember from Chapter 7 - Scenarios 5 & 6 that when mix of business changes, development triangles may no longer be stable and methods that rely on stable development may not work.)
The frequency-severity methods are more complicated than some of the other methods we've studied (development, ECR, BF, Cape Cod) so we need to have a good reason to bother with them.
Question: identify advantages of the FS methods
- Insight into the claims process since frequency, severity, and disposal rates are explicitly considered
- Paid data only may be used which is an advantage if case strength/adequacy is changing (paid data is not affected by changes in case strength)
- Inflation can be explicitly reflected (thx MG!)
Question: identify disadvantages of the FS methods
- Complexity - more calculations means small inaccuracies can propagate, and also judgmental selections are required in multiple steps
- Paid data - disadvantage if settlement rates are changing because paid data may be distorted
- Inflation - final result is very sensitive to inflation assumption
And here's the quiz. Yay. :-(
mini BattleQuiz 4 You must be logged in or this will not work.
Full BattleQuiz You must be logged in or this will not work.
POP QUIZ ANSWERS
- %unreported = (1 - 1 / 2.50) = 60%
Bonus Question Answer
- Answer: severity trend = 3%. You'll then get almost exactly the correct ultimate loss for AY 2025 of 5,040.
- To see where this value of 3% comes from, first calculate the untrended IPS triangle (Incremental Paid Severity).
| AY | 12 | 24 | 36 | 48 |
|---|---|---|---|---|
| 2020 | 384 | 384 | 384 | 384 |
| 2021 | 396 | 396 | 396 | |
| 2022 | 408 | 408 | | |
| 2023 | 420 | | | |
- Then calculate the year-over-year changes down each column.
| AY | 12 | 24 | 36 | 48 |
|---|---|---|---|---|
| 2020 | -- | -- | -- | -- |
| 2021 | 3.1% | 3.1% | 3.1% | |
| 2022 | 3.0% | 3.0% | | |
| 2023 | 2.9% | | | |
- The trend is not quite constant over time but it's obvious that 3% is a better choice than the 5% given in the problem. (Must have been Ian-the-Intern who originally came up with the 5%.)