The obvious response to the issues emerging risks pose is to ensure that reserves and the capital position are more than robust enough for any eventuality, however remote, and then to release them when the risks fail to materialize. There are, however, many arguments against this as a practical strategy:
The chart below attempts to illustrate the solvency calculation issue. Suppose the best estimate is 20 and the modeled 1-in-200-year ultimate loss is 100. If all else stays the same and, for simplicity, the yield curve stays flat, the sum of the 1-year solvency capital requirements (SCRs) approximates the difference between 100 and 20 (i.e., 80). Because of discounting, however, when in time the change in own funds is recognized matters. The black line represents a linear recognition pattern, so the 1-year SCRs are all equal, with increments of 10. The blue line represents binary fast recognition: the first year's SCR is 80 and the remaining years' SCRs are zero, meaning the deterioration is recognized quickly. The red line again shows binary recognition, but with a slow pattern, as the movement occurs only toward the end of the liabilities' life. The two curves in light blue and light red represent less severe versions of the binary forms.
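The recognition patterns described above can be sketched numerically. This is a minimal illustration using the figures from the text (best estimate 20, ultimate 100); the 8-year horizon, the 3% flat discount rate, and the pattern names are assumptions for illustration, not figures from the chart.

```python
# Assumed figures: best estimate 20, 1-in-200 ultimate loss 100, and an
# assumed 8-year recognition horizon with a flat 3% yield curve.

BEST_ESTIMATE = 20
ULTIMATE = 100
YEARS = 8
GAP = ULTIMATE - BEST_ESTIMATE  # 80

patterns = {
    "linear":      [GAP / YEARS] * YEARS,        # 10 recognized each year
    "binary_fast": [GAP] + [0.0] * (YEARS - 1),  # all 80 in year 1
    "binary_slow": [0.0] * (YEARS - 1) + [GAP],  # all 80 in the final year
}

def discounted_total(scrs, rate=0.03):
    # Present value of the 1-year SCR increments at the assumed flat rate.
    return sum(scr / (1 + rate) ** (t + 1) for t, scr in enumerate(scrs))

for name, scrs in patterns.items():
    print(f"{name}: undiscounted sum = {sum(scrs):.0f}, "
          f"discounted = {discounted_total(scrs):.1f}")
```

Every pattern sums to the same 80 undiscounted, but the discounted totals differ: fast recognition costs the most in present-value terms and slow recognition the least, which is the point the chart makes.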
As discussed in the Executive Summary of this report, the term "crystallization of risk" refers to the timescale over which we realize that the risk is manifesting itself and how this view changes until the ultimate quantum is understood and all liabilities are discharged. The "Reserving Risks" section in last year's report, Ahead of the Curve: Understanding Emerging Risks, looked at how information emerges in the presence of reserving cycles. The profit or loss in any particular financial year comprises not only the profit or loss from the current accident year but also any recognized changes in reserves for prior years.
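The calendar-year decomposition in the last sentence can be stated as a one-line calculation. All figures below are hypothetical, and the sign convention (positive prior-year change = reserve strengthening, a charge to profit) is an assumption for illustration.

```python
# Hypothetical figures: calendar-year result = current accident-year
# profit minus recognized prior-year reserve strengthening.

current_ay_profit = 12.0          # underwriting profit on the current accident year
prior_year_reserve_changes = {    # positive = strengthening, negative = release
    2021: 3.0,
    2022: -1.5,
}

calendar_year_result = current_ay_profit - sum(prior_year_reserve_changes.values())
print(calendar_year_result)  # 12.0 - (3.0 - 1.5) = 10.5
```

A calendar year can therefore look profitable or unprofitable for reasons that have nothing to do with the business written in that year.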
Reserving and Capital Setting: Sizing the Problem, Part III: Quantifying Emerging Risks; Expert Judgement
Data quality and availability should also be examined in depth. Because the risks are new, the data needed to power a model may not be captured correctly, which introduces further uncertainty and may even preclude the use of a model altogether.
Once the risks have been identified and ranked, the next step is to quantify the likely impact on the financial results of the firm. The first and most obvious question is what quantification techniques are available for each risk on the list. This will depend on the availability of relevant data and commercially produced models.
Loss reserves are arguably one of the most difficult risks to estimate and monitor. In fact, inadequate pricing and deficient loss reserves have been the leading cause of property/casualty company impairments. According to A.M. Best, from 1969 to 2009 they triggered approximately 40 percent of all impairments, four times more than those emanating from natural catastrophes (1). There are many uncertainties in managing long-tailed, heavily legislated lines of business that can be triggered by emerging risks. Unforeseen inflation and unanticipated legislative changes over a 10- to 30-year period present many demands. In order to prepare for emerging risk scenarios, future trends and related uncertainties need to be explicitly identified, contemplated and estimated.
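The sensitivity of a long-tailed reserve to an explicit inflation assumption can be illustrated with a simple simulation. This is a sketch, not a reserving method from the report: the flat 20-year payment pattern and the normal distribution for the inflation rate are assumptions chosen only to show how an explicit trend assumption widens the range of outcomes.

```python
import random

# Assumed flat payment pattern: 100 per year over a 20-year run-off.
base_payments = [100.0] * 20

def reserve(payments, inflation):
    # Undiscounted reserve: inflate each future payment to its payment date.
    return sum(p * (1 + inflation) ** (t + 1) for t, p in enumerate(payments))

# Central estimate at an assumed 2% annual inflation trend.
central = reserve(base_payments, 0.02)

# Simulate uncertain inflation (assumed N(2%, 1%)) to size the spread.
random.seed(0)
scenarios = sorted(reserve(base_payments, random.gauss(0.02, 0.01))
                   for _ in range(10_000))

print(f"central: {central:.0f}, ~99.5th percentile: {scenarios[9949]:.0f}")
```

Even a one-point change in the assumed inflation rate moves a 20-year reserve materially, which is why the text insists that such trends be identified and estimated explicitly rather than left implicit in the data.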
In addition to internal risk management, models are typically used in risk transfer negotiations. Both traditional and alternative risk markets require extensive analysis of portfolios when considering risk transfer. Sharing a portfolio's standardized model output is critical to communicating the loss potential of a particular portfolio, from which risk capital can be unlocked to support the risk financing needs of a reinsurance buyer. Technology is also critical when partnering governments with the private sector. Whether the partner is a developed or an emerging economy, these tools bring together the risk knowledge and historical data of the public sector with the risk management techniques of the insurance industry. The result is an enhanced understanding of risk that provides stability and attracts partners.
Public sector-related data can be expansive, containing census data, property risk characteristics, historical loss information, risk rating matrices and scientific tracking of natural hazard events. Because such data can be unwieldy, outside resources that improve data transparency can be valuable in packaging it in a form useful for risk decision making. Public sector resources devoted to building tools that measure risks perceived as "uninsurable" can unlock private sector funding.
Mark Murray, Senior Vice President
Technology and innovation continue to change the world around us, creating both opportunities and new challenges for the (re)insurance industry. Advances in risk quantification, such as predictive analytics and capital modeling, are changing the way we underwrite, price and manage risk. Similarly, technology is allowing A.M. Best (Best's) to advance the analytics of risk supporting its assessment of balance sheet strength. Taking advantage of stochastic modeling technology, the evaluation of risk within Best's capital model is undergoing a substantial overhaul to broaden the lens used to analyze risk relative to capital. The technology allows efficient production of multiple capital metrics adjusted for a range of risk levels, rather than risk represented by a single data point, providing deeper insights into balance sheet strength, risk profile and risk appetite. The benefit of this overhaul will be a rating that provides greater differentiation among companies, a more informed dialogue around capital versus risk and a more precise measure of "excess" or "deficient" capital. This new lens on capital will significantly influence the way (re)insurers view, measure, communicate and possibly even manage risk.
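The "range of risk levels rather than one data point" idea can be sketched with empirical value-at-risk computed at several confidence levels from a simulated loss distribution. The lognormal loss model and the chosen confidence levels below are illustrative assumptions, not A.M. Best's actual methodology.

```python
import random

# Simulate a net loss distribution (assumed lognormal, purely illustrative).
random.seed(42)
losses = sorted(random.lognormvariate(0, 1) for _ in range(100_000))

def var(level):
    # Empirical value-at-risk: the loss at the given confidence level.
    return losses[int(level * len(losses)) - 1]

# Capital measured at a range of risk levels, not a single point.
for level in (0.90, 0.95, 0.99, 0.995):
    print(f"capital at VaR {level:.1%}: {var(level):.2f}")
```

Reporting the whole curve, rather than one percentile, is what lets the comparison of capital against risk appetite become a dialogue instead of a pass/fail threshold.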