Model Suitability Analysis
Recognizing how important it is for our clients to understand and apply catastrophe models robustly, GC Analytics has devised a unique service proposition: model suitability analysis (MSA). With this service, clients gain a better-informed position when deciding how to reflect a portfolio's catastrophe risk within their risk management frameworks. The service can be comprehensive and generally includes:
- Our detailed understanding of each vendor model, including its strengths and weaknesses, combined with our in-depth understanding of each client’s portfolio, to ensure the results are fit for purpose.
- Sophisticated analysis and modeling techniques to help move from a “vendor” result to a client’s bespoke curve. This could involve recalibrating lower return period results against historic experience; including un-modeled perils in the results; building in assumptions for growth or other portfolio changes; and, to the extent advisable, “turning the dials” on frequency, severity or other catastrophe modeling assumptions (including the impact of event clustering).
- Helping clients take greater ownership of how the results are used, including stress and scenario testing of the available model options, settings and results.
- Detailed model change and year-on-year bridging analyses.
- Advice on introducing the effects of uncertainty into the risk management decision process.
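The move from a vendor result to a bespoke curve described above can be illustrated with a minimal sketch. Assuming the vendor output is expressed as losses at standard return periods, one simple approach is a credibility blend that favors historical experience at low return periods and the vendor model further out in the tail. The function name, weighting scheme and all figures here are hypothetical, not GC's actual methodology:

```python
import numpy as np

def blend_with_experience(return_periods, modeled, historical, full_credibility_rp=25.0):
    """Credibility-weight historical losses against modeled losses.

    The weight on experience falls linearly from 1 at a return period of 1
    to 0 at full_credibility_rp, beyond which the vendor model is used alone.
    """
    rp = np.asarray(return_periods, dtype=float)
    w = np.clip((full_credibility_rp - rp) / (full_credibility_rp - 1.0), 0.0, 1.0)
    return w * np.asarray(historical) + (1.0 - w) * np.asarray(modeled)

rps = [2, 5, 10, 25, 100, 200]                      # standard return periods
modeled = [10.0, 25.0, 45.0, 80.0, 150.0, 200.0]    # vendor curve (illustrative, USD m)
hist =    [8.0, 22.0, 40.0, 80.0, 150.0, 200.0]     # experience-based (illustrative, USD m)

blended = blend_with_experience(rps, modeled, hist)
```

At the 2-year return period the blend sits close to experience, while at 25 years and beyond it reproduces the vendor model unchanged. A real recalibration would of course be driven by the credibility of the client's loss history rather than a fixed linear weight.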
The integration of our MSA process allows clients to understand how different models respond to their portfolios, what changes have occurred and why, what the implications are for capital and how the process can be improved going forward using a leading assessment of catastrophe risk.
The result is a bespoke view of catastrophe exposure that is more specific to the portfolio, less directly reliant on any one vendor model, and easily understood and justified to management and the regulator.
Comprehensive data auditing and cleaning are essential in a Solvency II environment. Prior to modeling, our data audit and cleaning is an iterative process in which we work closely with our clients to ensure all parties understand the extent of data quality and completeness, as well as the possible implications for model outputs. We can employ a number of checks to ensure the data is cleansed and presented in the most appropriate way, including comparisons with industry databases, peer reviews against similar companies, automated logic checks and using satellite technology to refine information on high-value locations. Through Guy Carpenter’s collaborative approach, our clients still own the process and sign off on the data and modeling assumptions in advance of modeling.
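The automated logic checks mentioned above can be sketched as simple per-record validation rules. The field names (sum_insured, latitude, occupancy and so on) and thresholds below are hypothetical; a production audit would also compare records against industry databases and peer portfolios:

```python
# Illustrative sketch of automated logic checks on exposure data prior to
# modeling. All field names and thresholds are hypothetical examples.

def logic_checks(location):
    """Return a list of data-quality flags for one exposure record."""
    flags = []
    if location.get("sum_insured", 0) <= 0:
        flags.append("non-positive sum insured")
    lat, lon = location.get("latitude"), location.get("longitude")
    if lat is None or lon is None:
        flags.append("missing geocode")
    elif not (-90 <= lat <= 90 and -180 <= lon <= 180):
        flags.append("geocode out of range")
    if not location.get("occupancy"):
        flags.append("missing occupancy code")
    return flags

portfolio = [
    {"sum_insured": 5000000, "latitude": 51.5, "longitude": -0.1, "occupancy": "COM"},
    {"sum_insured": 0, "latitude": 95.0, "longitude": -0.1, "occupancy": ""},
]
report = {i: logic_checks(loc) for i, loc in enumerate(portfolio)}
```

Each flagged record would then feed the iterative dialogue with the client: correct the data, document the assumption, or quantify the implication for model output.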
The catastrophe modeling team in GC Analytics has been helping clients with data issues since catastrophe models first came into use. The increased focus on data driven by Solvency II has led many companies to review the underlying causes of data issues and implement plans for improvement. We offer a data troubleshooting service that includes a comprehensive review of all stages in the data chain, from the policy proposal stage to system extraction and use in catastrophe models. The deliverable is a detailed report with recommendations for change. Indications of the costs and benefits of any changes are included, based on data importance hierarchies derived from sensitivity testing of the likely effect of changes on model output.
Education and Training
We have been helping our clients understand catastrophe models for many years. The demand for catastrophe model training, undoubtedly driven by Solvency II preparation, has increased over the last couple of years, leading us to formalize our training program into a well-defined curriculum. In 2012, we will also offer post-training certification from the GC Academy. The program is split into three levels according to the detail required: senior management, technician and practitioner.
For senior managers the one-day or half-day program can be delivered in person or via webinar. The course content focuses on high-level principles including (but not limited to):
- Catastrophe modeling history and evolution
- The importance of data in the context of catastrophe modeling
- The operation of catastrophe models, including the constituent modules
- The differences among the available vendor models
- Model results drivers
- Typical catastrophe model output and how to interpret it
- How to go from numbers to decisions
- How and why models change and how to incorporate this into decision making
- The limitations of and uncertainties within catastrophe models, un-modeled perils and how to account for them
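As a flavor of the "model output and how to interpret it" item above, the sketch below reads an occurrence exceedance probability (OEP) off an event loss table, assuming Poisson event occurrence. The rates and losses are illustrative, not drawn from any vendor model:

```python
import numpy as np

def oep(event_rates, event_losses, threshold):
    """Annual probability that at least one event exceeds `threshold`.

    Sums the annual rates of all events breaching the threshold and applies
    the Poisson relationship P(N >= 1) = 1 - exp(-lambda).
    """
    rate = sum(r for r, l in zip(event_rates, event_losses) if l > threshold)
    return 1.0 - np.exp(-rate)

rates = [0.02, 0.01, 0.005]     # annual event frequencies (illustrative)
losses = [50.0, 120.0, 300.0]   # event losses in USD m (illustrative)

p = oep(rates, losses, 100.0)   # probability of a year with a loss above USD 100m
rp = 1.0 / p                    # the corresponding return period, roughly 67 years
```

Understanding this mechanical link between event rates, exceedance probabilities and return periods is what lets course participants move "from numbers to decisions."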
For technicians, the two-to-three-day training program covers the above in more detail and also includes:
- Analysis of the underlying hazard science
- Model sensitivity analysis
- Detailed comparisons of the vendor models
- Insight into the financial module calculations, including primary and secondary uncertainty
- Portfolio correlation
- Model calibration and validation
- Incorporating catastrophe model output into capital modeling software
For practitioners at companies that will actually license and run the models, we supplement the training provided by the vendor modeling companies and offer hands-on experience running models in-house in tandem with GC Analytics team members.
Catastrophe Model Documentation from Third Party Vendors
As a major customer of the vendor modeling companies, we are currently in discussions with them about providing documentation in advance of Solvency II implementation. Some vendors remain unclear about what will be released and about whether brokers may share information with non-license holders, which is not helpful to our clients’ preparations; we will lobby on behalf of our clients to achieve a practical solution acceptable to all.
Catastrophe Modeling Documentation
Guy Carpenter has developed a framework for documenting catastrophe modeling in the context of Solvency II. This is by no means final, as the guidelines and requirements continue to evolve. It is useful, however, as an outline when we are working with our clients to prepare for the regime.
The comprehensive start-to-finish description of the process is of course bespoke to each client but in general contains:
- Project plans, process systems and controls
- Data policy statement and audit reports
- Data quality assessment and a description of the implications for the modeling process
- Model options and setting assumptions and rationale
- Model results
- Model interpretation and validation
- Multi-model approach and rationale
- Conversion to internal model inputs
- Description of application within the internal model
We are well-positioned to provide a peer review of the documentation to support the catastrophe modeling process and use of the results in an internal model.
In addition, the ability to index, version-control, mine and present this information at varying levels of detail is vital. To this end, we have been running a pilot exercise for catastrophe modeling within the Author-it® documentation platform. Author-it is a documentation database platform originally developed for sophisticated manufacturing processes and machinery user manuals. It allows documentation to be built up from small, discrete pieces of information contributed by many users, with full audit tracking. The database structure allows the underlying information to be summarized and compiled at many levels and published to many media platforms. In partnership with Author-it, Guy Carpenter is building off-the-shelf Solvency II documentation solutions; catastrophe modeling is the first such project available to our clients.
See also: Guy Carpenter: Managing Catastrophe Model Uncertainty: Issues and Challenges, December 2011.