At 5.13am on 18th April 1906, a devastating earthquake hit San Francisco. It measured 7.9 on the Richter scale (the equivalent of a 1-in-200-year event), and the fire that followed brought the city to its knees. The then still-small insurance industry was decimated by the event's impact.
The lessons of this event should be applied to the cyber insurance space if it is not to suffer a similar fate.
A panel session at the recent NetDiligence conference in Santa Monica posed a provocative question to the insurance industry: “Cyber Cat: Are we ready?” The short answer is “Yes, we can be”.
CyberCube’s Oliver Brew provided a short history lesson on the 1906 San Francisco earthquake, which happened uncomfortably close to the conference venue. He outlined the impacts on the relatively immature insurance market of the time:
One insurance issue which echoes through the decades into today’s cyber insurance market is that each participating insurance company applied its own distinct policy terms and conditions. The inconsistency among clauses designed to limit fire liability resulting from an earthquake or building collapse proved to be particularly problematic in adjusting the losses on shared policies. There are distinct parallels in the way that data is defined today in insurance policies and specific exclusionary language relating to cyber triggers and types of business interruption losses.
The cyber insurance market has yet to experience a major catastrophic loss, but with lessons learned from the property insurance market, it can prepare for an event considered to be a case of not “if” but “when”.
In this context, it is not surprising that when asked the question “In a systemic cyber event, where is the cyber industry most likely going to have problems?” in an audience poll, the response was overwhelmingly “financial stability”. The insurance industry is key to helping the business environment survive in this new reality by taking on informed and managed cyber risk.
Hurricane Andrew highlighted the consequences of relying on limited historical attritional loss data and ushered in a new, increasingly sophisticated methodology for modeling extreme weather events to improve capital adequacy and stability.
Analytical models based on data science and improved technology allowed for a new long-term view of potential catastrophe risk losses. Cyber risk management is now poised to take advantage of similar modeling technology. Matt Prevost, US product manager at Chubb, defined a systemic cyber risk as a concentration of interconnected technology risks across sections of the economy: losses for multiple policyholders can arise from the same event, with the potential for significant exposure outside historical attritional experience.
The question “has there been a cyber catastrophe event?” was answered emphatically by Scott Stransky of AIR Worldwide: simply put, yes. He pointed to the NotPetya cyber-attacks, estimated to have caused over $3bn in economic losses, a figure that does not consider the implications of the rapid expansion of the Internet of Things into non-affirmative cyber exposures.

AnnaMaria Landaverde of Munich Re discussed the challenges of variable data accuracy within management information for reinsurance contracts, which makes it hard even to identify the exposures appropriately. Munich Re has invested significantly in this area and has publicly declared its intent to continue growing in this class of business. As Stefan Golling, CUO for Munich Re, said during the Monte Carlo Rendez Vous reinsurance conference, “if we shy away from the challenges of cyber, we risk that the insurance industry becomes irrelevant”.
While the role of modeling is the source of some debate, the panel acknowledged that a framework for looking at cyber exposures in different ways with a data-driven approach must be the foundation for future growth in this market. As Scott Stransky put it, quoting the statistician George Box, “all models are wrong, but some are useful”. When comparing potential systemic cyber exposures with other balance sheet risks, the challenge is that there is no “typical” catastrophe loss in the world of cyber risk, and the profile of the exposure is changing so fast it is hard to keep up.
Jeremy Platt, Managing Director at Guy Carpenter, noted that reinsurers hold a wide variety of attitudes towards cyber insurance as a class of business, acknowledging the opportunity for growth while also expressing (at times) significant caution about the potential for systemic exposure. Unlike individual underwriting analytics, where the impact of effective risk selection can be a few points of loss ratio improvement, cyber catastrophe risk that is not managed appropriately can mean the closure of a book of business.
While many of these questions will continue to be debated by the industry, the panel consensus was that models provide a framework to inform them relative to the risk appetite of companies and the industry against a rapidly growing threat. CyberCube’s Oliver Brew stressed the importance of probabilistic return-period assessment as a natural evolution from deterministic disaster scenarios, which have historically focused only on the severity of an event and not also its frequency. Platt agreed and highlighted insurance regulators’ increased interest in the management of systemic cyber exposure.
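The shift from single severity scenarios to a frequency-and-severity view can be sketched in a few lines of Python. This is a toy illustration with made-up event probabilities and loss distributions, not any actual catastrophe model: simulate many years of losses, then read off the loss exceeded at a given annual probability.

```python
import random

# A toy Monte Carlo sketch of probabilistic return-period assessment.
# All parameters are hypothetical illustrations, not a real cyber model.
random.seed(7)

EVENT_PROB = 0.05   # assumed annual probability of a systemic cyber event
N_YEARS = 100_000   # number of simulated years

def annual_loss():
    """One simulated year's catastrophe loss in $m (zero in event-free years)."""
    if random.random() < EVENT_PROB:
        # Heavy-tailed severity: lognormal with arbitrary illustrative parameters
        return random.lognormvariate(3.0, 1.2)
    return 0.0

# Sort simulated years by loss so exceedance percentiles can be read off directly
losses = sorted(annual_loss() for _ in range(N_YEARS))

def return_period_loss(sorted_losses, rp):
    """Loss exceeded with annual probability 1/rp (the '1-in-rp-year' loss)."""
    return sorted_losses[int(len(sorted_losses) * (1 - 1 / rp))]

for rp in (20, 100, 200):
    print(f"1-in-{rp} year loss: ${return_period_loss(losses, rp):,.1f}m")
```

The same exceedance-probability mechanic underpins the 1-in-200-year framing familiar from regulatory capital discussions; real models differ in how the frequency and severity assumptions are estimated, not in this basic logic.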
In cyber risk, history is not a predictor of the future, but we have the tools to learn from the past and enable informed decision-making about capital management and balance sheet protection, fundamental issues for the continued growth of the insurance industry.