The modeling of emerging and casualty catastrophe risks remains challenging, and the models continue to vary in their approach, level of development and industry acceptance. Because the potential scenarios are numerous, diverse and constantly changing, no single model or approach can contemplate all of them. Furthermore, the various disaster scenarios with which carriers are increasingly confronted need to be prioritized and synthesized within their enterprise risk management frameworks. By their very definition, these risks may offer limited data on which to base any modeling. As a result, much of the industry continues to rely on multiple models and actuarial approaches that encompass model applications, probable maximum loss (PML) estimates, realistic disaster scenarios and experience and exposure ratings to create a broad set of scenarios and deterministic views.
In addition to peril- and scenario-based commercially available catastrophe models, niche data best practices and models are being developed, to varying degrees, to meet demand in the technological category. Here, new data and modeling applications are being synthesized and adapted within existing model frameworks, allowing carriers to better underwrite and manage these risks. Other applications involve the identification and quantification of emerging “aggregating” exposure concentrations, such as those resulting from global supply chain dynamics. Still other niche models, such as Guy Carpenter’s MetaRisk® Reserve™, focus on various “crystallizing” emerging threats emanating from the accumulation of systemic reserves over multiple years.
The Oasis Loss Modeling platform, of which Guy Carpenter is a member and supporter, will help facilitate the further development of additional niche property catastrophe models by allowing independent developers to create and input various hazard, vulnerability and exposure elements. We believe that open-source platforms such as Oasis will lower the barrier to entry for academics and small specialist teams to innovate and develop models, creating more credible views of overall risk and of the ever-increasing number of emerging perils and catastrophe risks.
The mapping and deterministic modeling of emerging risk scenarios has played, and will continue to play, an important role in this area. Lloyd’s approach to emerging liability risks is in some ways no different from what its syndicates have long been required to report for well-established property risks. Syndicates must quantify and model specific realistic disaster scenarios (RDS) for earthquake, windstorm and even terrorism event footprints through a combination of licensed software (AIR, EQECAT, RMS), internal models or maximum line estimates. With relatively few of these options, and little data, available for professional, non-professional and multiple public and products-based liability RDS losses, the industry has typically relied on simpler market share or premium-derived PMLs based on de minimis approaches.
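The market share approach mentioned above can be sketched in a few lines. This is a minimal illustration, not an actual market methodology: it assumes a carrier’s share of an industry-wide scenario loss is simply proportional to its premium-based market share, and all figures are invented.

```python
# Minimal sketch of a market-share-derived PML estimate for a liability
# scenario. The proportional allocation and all figures are illustrative
# assumptions, not an established industry formula.

def market_share_pml(carrier_premium: float,
                     industry_premium: float,
                     industry_loss_estimate: float) -> float:
    """Allocate an industry-wide scenario loss to a carrier in
    proportion to its premium-based market share."""
    market_share = carrier_premium / industry_premium
    return market_share * industry_loss_estimate

# Example: a carrier writing 250M of a 10B industry line, measured
# against a 2B industry-wide scenario loss estimate.
pml = market_share_pml(250e6, 10e9, 2e9)
print(f"Estimated carrier PML: {pml:,.0f}")
```

The appeal of this approach is that it needs no insured-level data, which is precisely why it has persisted where granular liability exposure data is scarce; its weakness is that it assumes the carrier’s book mirrors the industry’s exposure mix.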
However, as the sophistication of tools for deterministic modeling increases, the next question involves the more challenging leap toward a probabilistic and holistic modeling approach. It is important to note that the A.M. Best rating agency introduced deterministic casualty catastrophe loss scenario questions into its 2014 Supplemental Rating Questionnaire (SRQ). A.M. Best defines casualty catastrophes as “events, activities or products that result in a number of lawsuits from multiple plaintiffs alleging damages that impact multiple insureds, coverages and/or time periods.” Each carrier must identify scenarios uniquely, based on its own view of its exposure to emerging casualty risks. The expectation is that more sophisticated data, modeling and responses will be required going forward.
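A carrier-defined deterministic scenario of the kind the SRQ contemplates can be quantified by aggregating assumed losses across the insureds and coverages a single event is deemed to touch. The sketch below is hypothetical: the insureds, coverages, limits and loss factors are invented for illustration only.

```python
# Illustrative deterministic casualty catastrophe scenario: a single
# event (e.g. a harmful product) is assumed to impact multiple insureds
# and coverages, per A.M. Best's definition. All names, limits and
# assumed loss factors below are hypothetical.

scenario_exposures = [
    # (insured, coverage, exposed_limit, assumed_loss_factor)
    ("ChemCo",    "products_liability", 25e6, 0.60),
    ("ChemCo",    "excess_liability",   50e6, 0.30),
    ("DistribCo", "products_liability", 10e6, 0.40),
]

# Deterministic scenario loss: sum of limit x assumed loss factor
# across every exposed (insured, coverage) pair.
scenario_loss = sum(limit * factor
                    for _, _, limit, factor in scenario_exposures)
print(f"Modeled scenario loss: {scenario_loss:,.0f}")
```

The loss factors are the judgment-laden step: in practice they would be set per scenario by underwriting and claims expertise, which is why A.M. Best expects each carrier to define its own scenarios rather than apply a standard event set.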
The availability of essential insured-level data on emerging and casualty catastrophe risks remains an important challenge that many carriers continue to work to improve. Property catastrophe models, first developed during the 1980s, contemplate highly granular, sophisticated geo-coded data that is readily available today and is interfaced with very specific and robust building construction characteristics and historical event sets. Casualty catastrophe modeling similarly requires exposure data related to the particular industries covered by the insureds within a portfolio. The models beginning to emerge in this area differ in the data they require, the approach they take and the specific scenario set(s) on which their development is focused. Some take a highly granular, data-intensive, bottom-up approach, whereas others contemplate a more general top-down approach to the exposure data required. Some are loss experience-based and consider an integrated historical event set, while others are much more exposure-based.
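The bottom-up versus top-down distinction can be made concrete with a small sketch. Assuming a hypothetical pharmaceutical sub-portfolio, the bottom-up view aggregates insured-level limits directly, while the top-down view scales total portfolio limits by an assumed industry-mix factor; all data below is invented for illustration.

```python
# Contrasting bottom-up and top-down estimates of casualty exposure to
# a single industry segment. Insureds, limits and the mix factor are
# hypothetical assumptions for illustration.

# Bottom-up: requires granular insured-level data, aggregated directly.
insured_limits = {"PharmaA": 20e6, "PharmaB": 35e6, "PharmaC": 15e6}
bottom_up_exposure = sum(insured_limits.values())

# Top-down: requires only portfolio totals plus an assumed industry mix.
total_portfolio_limits = 500e6
pharma_share_assumption = 0.15  # assumed share of limits in pharma
top_down_exposure = total_portfolio_limits * pharma_share_assumption

print(f"Bottom-up: {bottom_up_exposure:,.0f}; "
      f"top-down: {top_down_exposure:,.0f}")
```

The trade-off mirrors the text: the bottom-up figure is only as good as the insured-level data behind it, while the top-down figure is only as good as the mix assumption.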
Exposure-based models (such as Praedicat’s CoMeta™) depend on generally accepted scientific and mass tort data. They also operate under the fundamental assumption that past losses and patterns may not necessarily be indicative of, or directly applicable to, future emerging threats. As a result, they tend to focus predominantly on products-based liability scenarios and their latent bodily injury impact.