The answer to this question: improve understanding of risk aggregation and manage the growth of cyber books throughout the insurance cycle. These were the dominant themes at the NetDiligence Cyber Risk Summit conference in Philadelphia last week. Risk quantification has become an embedded and critical process in the management of natural catastrophe risk, so it is only natural that the insurance industry and regulatory agencies endeavor to replicate that success in lines of business exposed to cyber. But how? “Inside the firewall” company-specific data is limited, and managing against future cyber-attack scenarios seems daunting.
Data does exist.
How the market assesses and models that data, to deliver both quantification of risk and transparency into the underpinnings of those models, matters. On the NetDiligence panel “Quantification of Cyber Risk”, my colleague Matt Silley (qualified actuary at CyberCube) identified key areas where cyber risk modeling tools deliver not only key risk metrics but also efficiency in the underwriting process. According to Matt, the latter “allow for building a sense of the security posture of segments of the market” via the use of both behind-the-firewall and outside-the-firewall data.
Which leads us to…..data quality matters.
The panel recognized that the quality of both contract and claims data is a key component of understanding model uncertainty and validation. The use of artificial intelligence and machine learning to decipher claims and contracts is a newer approach that is also being successfully implemented at CyberCube.
Finally…..unique approaches to modeled quantification of risk exist.
As we know, cyber risk is unique in that the technology and cyber vulnerability landscape is constantly evolving. Data and expertise do exist, as do modeling methodologies that acknowledge the inherent uncertainty in that risk. Access to a broad spectrum of 100+ key cyber security and risk experts is a critical component of developing a model that allows for evidence-based management of risk. The panel also identified a structured approach to scenario development, such as the military-derived “Kill Chain” framework used by CyberCube to address the probability and quantification of each component of an attack, as an effective methodology.
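To illustrate the intuition behind a kill-chain style decomposition, here is a minimal sketch in Python. It assumes an attack succeeds only if every stage of the chain succeeds, so the overall probability is the product of per-stage probabilities. The stage names and numbers below are hypothetical illustrations, not CyberCube's actual model or parameters.

```python
# Hypothetical kill-chain decomposition: each stage of an attack is
# assigned a probability of success. Stage names and values are
# illustrative only.
KILL_CHAIN = {
    "reconnaissance": 0.90,
    "weaponization": 0.80,
    "delivery": 0.60,
    "exploitation": 0.40,
    "installation": 0.50,
    "command_and_control": 0.70,
    "actions_on_objectives": 0.60,
}

def attack_success_probability(stages: dict) -> float:
    """Assume all stages must succeed independently for the attack to land."""
    p = 1.0
    for prob in stages.values():
        p *= prob
    return p

# Multiplying the chain together yields a much smaller end-to-end
# probability than any single stage suggests.
print(f"{attack_success_probability(KILL_CHAIN):.4f}")  # → 0.0363
```

The value of decomposing the attack this way is that each stage can be estimated (and stress-tested) separately, which is what makes component-level quantification tractable.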
To paraphrase Matt, all of these factors “allow for the ability to manage the possible with a credible range of results”.