October 23rd, 2012

Capital Models – What Lies Beneath

Posted at 1:00 AM ET

Andrew Cox, Head of Advisory - EMEA

A robust capital model can be a great tool to help run a (re)insurance business. It is a given that capital models rely on a huge wealth of assumptions, and it is the quality of these assumptions that determines how useful the model is. The emphasis tends to fall on those assumptions that are explicit, for example, catastrophe model outputs, premium rates and reserve volatility. But there is another type of assumption - that which is implicit. These assumptions can have a very material impact on the model results.

Implicit assumptions arise because any model is a simplification of reality - the complexity of an insurer is being reduced to relatively few numbers. Different modeling software packages will have different preferred approaches to making the necessary simplifications.

An example of an implicit assumption is the approach to modeling dependencies and diversification. Diversification is the fundamental concept that underlies insurance - not everything will go wrong at once. There is value in modeling dependencies because doing so examines the outcomes in which diversification has broken down - where several things do go wrong at once. An obvious example is a natural catastrophe that hits several lines of business simultaneously. Understanding these potential aggregations is vital.
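To make the diversification concept concrete, here is a minimal simulation sketch. It compares the 1-in-200 annual loss of a combined book against the sum of the standalone figures for two hypothetical, independent lines of business; all distributions and parameters are illustrative assumptions, not outputs of any Guy Carpenter model.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n_sims = 100_000

# Two hypothetical, independent lines of business with lognormal annual
# losses. Parameters are purely illustrative.
property_losses = rng.lognormal(mean=15.0, sigma=0.6, size=n_sims)
casualty_losses = rng.lognormal(mean=15.0, sigma=0.8, size=n_sims)

def var_99_5(losses):
    """99.5th percentile of simulated annual losses (a 1-in-200 measure)."""
    return np.percentile(losses, 99.5)

standalone_sum = var_99_5(property_losses) + var_99_5(casualty_losses)
combined = var_99_5(property_losses + casualty_losses)

print(f"Sum of standalone 1-in-200 losses:  {standalone_sum:,.0f}")
print(f"1-in-200 loss of the combined book: {combined:,.0f}")
print(f"Diversification benefit: {1 - combined / standalone_sum:.1%}")
```

Because the two lines are independent here, the combined figure comes in well below the sum of the parts. That gap is the diversification benefit, and it is exactly what erodes when dependencies bite.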

There are two broad approaches to modeling dependencies: the statistical approach and the causal approach. The statistical approach relies on tools such as correlation matrices or copulas. The causal approach seeks to model the “cause and effect” temporal structure of the world through networks of interrelated events. Neither approach is inherently “right” or “wrong,” but the two will usually lead to different answers.
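The sketch below contrasts the two approaches on a toy portfolio: a Gaussian copula with a single correlation parameter versus a causal structure in which a shared catastrophe event drives both lines at once. Every number is an illustrative assumption, and neither branch represents how any particular software package, MetaRisk included, is actually implemented.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_sims = 100_000

# --- Statistical approach: Gaussian copula between two lognormal lines ---
# A single correlation parameter (0.4 here, purely illustrative) ties the
# lines together across ALL scenarios, benign and extreme alike.
rho = 0.4
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_sims)
stat_line_a = np.exp(15.0 + 0.6 * z[:, 0])
stat_line_b = np.exp(15.0 + 0.6 * z[:, 1])

# --- Causal approach: a shared catastrophe event drives both lines ---
# Lines are independent in ordinary years; a 1-in-20 event hits both at once.
attritional_a = rng.lognormal(mean=15.0, sigma=0.5, size=n_sims)
attritional_b = rng.lognormal(mean=15.0, sigma=0.5, size=n_sims)
cat_occurs = rng.random(n_sims) < 0.05
cat_severity = rng.lognormal(mean=16.5, sigma=0.7, size=n_sims)
causal_line_a = attritional_a + cat_occurs * 0.6 * cat_severity
causal_line_b = attritional_b + cat_occurs * 0.4 * cat_severity

for label, total in [("statistical", stat_line_a + stat_line_b),
                     ("causal", causal_line_a + causal_line_b)]:
    print(f"{label:>12}: mean={total.mean():,.0f}  "
          f"99.5% VaR={np.percentile(total, 99.5):,.0f}")
```

The point is not which numbers come out, but that the two constructions encode different tail behavior from similar marginal ingredients. The causal version concentrates its dependency in catastrophe years, which is precisely where capital requirements are decided.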

Every software package sits somewhere on the spectrum between fully statistical and fully causal. For example, MetaRisk®, Guy Carpenter’s capital modeling software, places more emphasis on causal modeling. We believe the causal approach is easier to understand and explain. It is also better suited to modeling contingent events, such as those following a large catastrophe.

So how should companies get a handle on the issue? By definition, it is difficult to assess the inbuilt assumptions a model makes within that model itself. The best solution is to build a new model, following a different approach in a different framework. This is not as expensive as it sounds - the second model will re-use most of the explicit assumptions that have already been made.

If this second model gives good agreement with the first, there can be more confidence that decisions informed by modeling are the right ones. If the two models disagree wildly, users may conclude that any previous confidence was misplaced, and that the capital modeling team needs to work harder to ensure the models accurately reflect reality.
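A minimal sketch of that comparison step follows. The function names, the 10 percent tolerance and the synthetic inputs are all hypothetical choices for illustration; in practice, the acceptable divergence between two models is a judgment call for the capital modeling team.

```python
import numpy as np

def capital_estimate(simulated_losses, level=99.5):
    """1-in-200 capital figure implied by a set of simulated annual losses."""
    return np.percentile(simulated_losses, level)

def compare_models(losses_model_1, losses_model_2, tolerance=0.10):
    """Check whether two independently built models agree on required capital.

    `tolerance` is an illustrative threshold, not an industry standard.
    """
    c1 = capital_estimate(losses_model_1)
    c2 = capital_estimate(losses_model_2)
    divergence = abs(c1 - c2) / max(c1, c2)
    return c1, c2, divergence, divergence <= tolerance

# Stand-ins for the outputs of two real models built in different frameworks.
rng = np.random.default_rng(seed=7)
model_1 = rng.lognormal(mean=15.0, sigma=0.70, size=100_000)  # e.g. copula-based
model_2 = rng.lognormal(mean=15.0, sigma=0.75, size=100_000)  # e.g. causal

c1, c2, divergence, agree = compare_models(model_1, model_2)
print(f"Model 1: {c1:,.0f}  Model 2: {c2:,.0f}  "
      f"divergence: {divergence:.1%}  within tolerance: {agree}")
```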

It is for this reason that we are advising clients to go down the internal modeling framework route, where they have multiple models that individually focus on particular areas but collectively act as a series of checks and balances on each other. This is the next evolutionary stage in capital modeling.
