April 29th, 2019

Methodological Considerations in the Statistical Modeling of Catastrophe Bond Prices

Posted at 1:00 AM ET

John Major, Director of Actuarial Research


John Major has authored an article that aims to help actuaries, financial analysts, statisticians, data scientists and their clients better investigate how property catastrophe risk, and catastrophe bonds in particular, are priced.

As in any technical discipline, statistical modeling has its customs, templates and default modes of operation, which are typically learned early in a practitioner’s education. These practices persist as “standard operating procedures,” to be modified as needs arise. The purpose of this article is to challenge those default modes as they have appeared in 21st century-published catastrophe bond pricing research.

The article addresses, from a methodological perspective, the problem of specifying and fitting a statistical model of the pricing of property catastrophe risk, and frames that problem in a business context. The framing provides a normative foundation for evaluating strategic and tactical decisions in the modeling process. Undesirable consequences emerge from a naïve application of ordinary least squares regression, a widespread default choice.

Mr. Major provides an example: “Consider a model that tries to predict two observed prices: 1 percent and 5 percent rate-on-line, but is slightly off, predicting 2 percent and 6 percent, respectively. Will the decision maker relying on the model consider these as equal errors? Default assumptions say, yes; they both represent a 1 percent error and so are the same. But many people would say the first was off by a factor of two and the second only by 20 percent, so they are not equal errors at all. The decision as to how to make this type of comparison needs to be thought through and made explicit, not just by taking a default position.”
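The contrast Mr. Major describes, between equal absolute errors and very unequal multiplicative errors, can be sketched numerically. This is a minimal illustration of the general point, not code from the article:

```python
import math

# Observed and predicted rate-on-line, in percent (the example above)
observed = [1.0, 5.0]
predicted = [2.0, 6.0]

for obs, pred in zip(observed, predicted):
    abs_err = pred - obs            # absolute error, in percentage points
    ratio = pred / obs              # multiplicative error
    log_err = math.log(ratio)       # log-scale error, symmetric for over/under-prediction
    print(f"obs={obs}%  pred={pred}%  "
          f"absolute={abs_err:+.1f}pt  ratio={ratio:.2f}x  log={log_err:.3f}")

# Both predictions miss by the same 1 percentage point, but on a
# multiplicative scale the first is off by a factor of 2 while the
# second is off by only 20 percent.
```

Fitting on a log scale (or otherwise choosing the error metric deliberately) is one way to make this comparison explicit rather than inheriting it from the default squared-error objective.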

Additionally, the article covers such issues as measuring, comparing and aggregating errors; reflecting domain knowledge; the economic interpretation of model components; the treatment of time; random effects; and extrapolating beyond the data.

The article offers alternatives to default approaches, including weighted least squares with weights inversely proportional to capital requirements and alternative functional forms.
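The weighted-least-squares idea can be sketched as follows. The data, the linear functional form, and the capital figures below are all hypothetical placeholders for illustration; only the weighting principle (weights inversely proportional to capital requirements) comes from the article:

```python
import numpy as np

# Hypothetical tranches: expected loss (EL) and observed rate-on-line (ROL),
# both as fractions. These numbers are invented for illustration.
el = np.array([0.01, 0.02, 0.03, 0.05, 0.08])
rol = np.array([0.035, 0.055, 0.07, 0.10, 0.145])

# Illustrative capital requirement per tranche. Weighting observations
# inversely to capital means tranches tying up more capital carry
# proportionally less weight in the fit.
capital = np.array([0.10, 0.18, 0.25, 0.40, 0.60])
weights = 1.0 / capital

# Weighted least squares fit of ROL = a + b * EL via the normal equations.
X = np.column_stack([np.ones_like(el), el])
W = np.diag(weights)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ rol)
a, b = beta
print(f"intercept = {a:.4f}, slope = {b:.3f}")
```

An ordinary least squares fit is the special case where all weights are equal, which is precisely the default the article asks practitioners to question.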

Click here to read the article as it appears in Risk Management and Insurance Review, published by the American Risk and Insurance Association in its Spring 2019 issue.

Click here for additional material from Guy Carpenter >>
