Adopting signals into the cyber underwriting process can allow for better risk selection, as discussed in our previous blog on fitting signals into your workflow. However, signals are difficult to integrate when they are paired with the wrong data. A data and analytics solution built for cyber underwriters, one that flexibly aligns with their view of risk, helps them underwrite risks effectively, price policies accurately, and maintain a profitable book of business.
CyberCube, a risk analytics provider dedicated to serving the cyber insurance market, applies rigorous research and validation before customers interact with our signals, data, and analytics. From the moment a signal is created, it is tailored to the specific risk factor it measures and scoped to what is most relevant for that factor. Let’s dive deeper into how we develop our signals and how insurers can adopt them for effective cyber risk underwriting.
The data behind the signals
No matter which cyber risk analytics provider you use, it’s essential that you understand their data approach, and that they understand the underwriting challenges in the insurance market. Cyber risk is a particularly complex kind of risk, meaning the data and analytics that underwriters use must be appropriate, relevant, and accurate. CyberCube’s data and analytics ecosystem contains five broad classes of intelligence, each contributing to the creation and delivery of insurability analytics solutions:
- Internal, or Behind-the-Firewall
- External, or perimeter security and dark web
- Historical cyber incident and loss
- Firmographic industry
- Digital supply chain
Datasets in each of these categories help to shed light on specific aspects of an organization and its cyber risk exposure and posture.
Turning raw data into cyber risk signals
Having the right categories of data is only the first step toward gaining access to the right signals. Mountains of raw data can overwhelm underwriters, so that data must be refined appropriately. Data that is processed, attributed to a company, and ready to use saves underwriters time and resources, allowing them to focus on profitable decision-making.
The data CyberCube uses is thoroughly collated and processed in order to glean meaningful information from it. Using a combination of artificial intelligence and machine learning techniques — as well as human experts — CyberCube can convert raw data into signals that can provide underwriters with actionable insights to inform risk decision making.
The monthly view of signals
The resolution of signals is a hotly debated topic within the cyber insurance market. Resolution refers to the granularity at which data is measured and subsequently presented, whether on an hourly, daily, or monthly basis.
While there are pros and cons to different signal resolutions, CyberCube intentionally implements a monthly signal view: one that captures and presents a month’s worth of observations built from more granular hourly or daily measurements. We believe the monthly view is right for cyber underwriting for a number of reasons:
1. It fits the underwriting workflow, which does not involve constantly reviewing updates and changes for a specific company. High-resolution daily feeds are better suited to cyber technical professionals who monitor and log problem areas that might need hands-on remediation. Underwriters, by contrast, can access relevant signals when they need them in a consolidated view. This view smooths out the noise of a daily feed and surfaces the bigger-picture details underwriters base decisions on, rather than the minute details that are not fit for an underwriting purpose. (There is a use for a point-in-time sense check, outlined below in ‘Application of on-demand signals’.)
2. It standardizes the view into a single, consistent, normalized snapshot that suits multiple categories of signals. The underlying data used to build our signals ranges in resolution from hourly to daily to monthly. Certain signals are highly volatile and change daily, such as new CVEs and network port functionality. Others, such as deployed technologies, change far less often and tend to be steadier over time, as organizations sign longer contracts with their providers and do not necessarily make changes overnight. Seeing all factors on one page streamlines the workflow and compares apples to apples.
3. It reduces noise in the feed and flow of data you receive. An observation recorded on a given day may have been detectable for only the few minutes it was exposed; if a deficiency was found, the company could have intentionally opened the port for testing and quickly closed it. CyberCube uses several approaches to deal with noisy signals in our monthly signal logic.
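The consolidation described above can be illustrated with a minimal sketch: collapsing a daily feed of per-signal observations into one monthly snapshot that records whether a signal was ever seen and on how many distinct days. The signal names and record shape are hypothetical, and this is not CyberCube’s production logic.

```python
from collections import defaultdict
from datetime import date

def monthly_snapshot(daily_feed):
    """Collapse a daily feed of (day, signal, observed) records into a
    per-signal monthly view: was the signal ever seen this month, and on
    how many distinct days? Illustrative sketch only."""
    summary = defaultdict(lambda: {"observed": False, "days_seen": 0})
    for day, signal, observed in daily_feed:
        summary[signal]  # register the signal even if it was never observed
        if observed:
            summary[signal]["observed"] = True
            summary[signal]["days_seen"] += 1
    return {name: dict(view) for name, view in summary.items()}

# Hypothetical month of daily observations for two signals
feed = [
    (date(2024, 5, 1), "rdp_open_port", True),
    (date(2024, 5, 2), "rdp_open_port", True),
    (date(2024, 5, 2), "expired_cert", False),
    (date(2024, 5, 17), "expired_cert", True),
]
snapshot = monthly_snapshot(feed)
# One consolidated row per signal: persistence (days_seen) replaces the
# raw day-by-day churn an underwriter would otherwise have to read through.
```

A fluke one-day observation and a deficiency seen all month both appear in the snapshot, but the `days_seen` count distinguishes them, which is what makes the monthly view less noisy than the raw feed.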
Application of on-demand signals
More recently, the cyber insurance market has taken an interest in finer-resolution, on-demand insights: just-in-time views of what is discoverable on a company’s network at the most recent point in time.
This approach makes sense in a ‘trust, but verify’ evaluation process: an insurance company asks a company (the insured) whether it has implemented patches and fixes for found vulnerabilities and weaknesses, then verifies part or all of the findings, including whether remediation efforts followed the initial identification. With this approach, insurance companies can gauge an insured’s vulnerability management and remediation controls, and hopefully clear up glaring deficiencies in the company’s cyber security posture.
However, this approach has limitations you should be aware of. Results from an on-demand or point-in-time evaluation are not necessarily statistically significant; in other words, the findings that were surfaced and remediated are not necessarily indicative of an organization’s wider cyber security hygiene. Implementing a resilient cyber security program takes time and effort and does not happen overnight: tools need to be trained, programs and rules need to be tested, and both incidents and alleged ‘anomalies’ are likely along the way. Insurers need to be certain that the single-risk analytics solution they rely on uses an appropriate methodology, produces relevant outcomes, and contextualizes what is being observed.
CyberCube’s benchmarking approach
We show both a binary observed/not-observed flag and a percentile benchmark that contextualizes how the signal stacks up against relevant comparison groups (whether a peer group or the broadest bucket, the market of all observed organizations). To calculate these percentiles, we consider the unique nature of the observations on a signal-by-signal basis and formulate an individual scoring and ranking methodology for each signal, ensuring the comparison is fit for purpose and normalizes the view.
For instance, for some signals we normalize the observations by IP count. Dividing the distinct number of observations by IP count helps compare apples to apples: a company with an outsized number of IPs will naturally show more observations of a deficiency on a raw-count basis. Normalization also reveals when a company with a smaller footprint has a high rate of observations relative to its size.
This approach is not appropriate for every signal type, however. For negative observation signals, such as RDP open ports, we would not want to normalize the observation, because each additional observation is increasingly problematic for that organization.
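The two benchmarking modes described above, normalized-by-IP-count and raw-count, can be sketched with a simple percentile rank against a peer group. The function names, peer data, and scoring are hypothetical illustrations of the approach, not CyberCube’s actual methodology.

```python
def percentile_rank(value, peer_values):
    """Share of peers at or below this value, expressed 0-100."""
    return 100.0 * sum(1 for v in peer_values if v <= value) / len(peer_values)

def benchmark_signal(raw_count, ip_count, peers, normalize_by_ip):
    """Benchmark one company's signal against a peer group.

    peers: list of (raw_count, ip_count) tuples for comparable companies.
    Hypothetical sketch of the benchmarking idea described above."""
    if normalize_by_ip:
        # Rate per IP: fair comparison for companies with large footprints
        value = raw_count / ip_count
        peer_values = [c / n for c, n in peers]
    else:
        # e.g. RDP open ports: every additional observation is worse,
        # so compare raw counts directly
        value = raw_count
        peer_values = [c for c, _ in peers]
    return {
        "observed": raw_count > 0,
        "percentile": percentile_rank(value, peer_values),
    }

# Hypothetical peer group: (observations, IP count) per peer company
peers = [(10, 100), (2, 10), (50, 1000)]      # rates 0.10, 0.20, 0.05
normalized = benchmark_signal(5, 20, peers, normalize_by_ip=True)
raw = benchmark_signal(5, 20, peers, normalize_by_ip=False)
```

With normalization, the company’s rate of 0.25 per IP exceeds every peer’s rate (100th percentile) even though its raw count of 5 is below two of the three peers, which is exactly the small-footprint effect the normalization is meant to expose.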
Use of a time relevance weighted decay function
For certain signals, especially those with a necessarily high resolution, building a monthly signal enables us to incorporate more sophisticated algorithms that account for:
1. When the signal was observed
2. How persistent the observation was
3. How critical the observation was
For instance, we can weight observations from the first day of the month differently from those on the last day, and observations seen consistently for several days in a row differently from one-off fluke observations that are quickly remediated or addressed.
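A time-relevance weighted decay of this kind can be sketched as an exponential recency weight applied per observed day, with persistence accumulating across days. The half-life, record shape, and criticality scale are assumptions for illustration, not CyberCube’s actual algorithm.

```python
import math

def weighted_monthly_score(observations, days_in_month=30, half_life_days=10):
    """Score a month's observations with recency-weighted decay.

    observations: list of (day_of_month, criticality) pairs, one per day on
    which the deficiency was seen. Observations near month end weigh more
    (exponential decay with the given half-life), and persistence adds up
    across days. Hypothetical sketch of a time-relevance decay function."""
    score = 0.0
    for day, criticality in observations:
        age = days_in_month - day              # days before month end
        recency_weight = 0.5 ** (age / half_life_days)
        score += criticality * recency_weight
    return score

# A deficiency persisting through month end vs. a one-off on day 1
persistent = weighted_monthly_score([(28, 1.0), (29, 1.0), (30, 1.0)])
fluke = weighted_monthly_score([(1, 1.0)])
```

The persistent end-of-month deficiency accumulates a far higher score than the quickly remediated day-one fluke, reflecting the three weighting criteria in the list above: when it was seen, how long it persisted, and how critical it was.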
Research into signals
Cyber security concepts can be extremely complicated, so much so that even experienced cyber security professionals need to study the implications and nuances of a specific field.
CyberCube’s team of cyber risk modeling experts devote their time to understanding these complicated nuances, monitoring new developments, and building signals and scenarios that distill these technical concepts into digestible measurements that can be applied in an insurance loss and mitigation context.
One such way to understand these signals is to start by mapping them to various cyber frameworks (such as NIST, MITRE, ISO, etc.). Doing this helps to contextualize the signal and understand how much ground it is covering with its measurement. In doing so, we are able to improve and refine the signal and its logic to ensure it is capturing something meaningful.
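As a concrete illustration of framework mapping, a signal can be tied to the framework controls and adversary techniques it evidences. The NIST CSF categories and MITRE ATT&CK technique IDs below are real, but the signal names and the specific pairings are hypothetical examples, not CyberCube’s actual mapping.

```python
# Hypothetical mapping from externally observable signals to cyber
# frameworks. Framework identifiers are real; the pairings are illustrative.
SIGNAL_FRAMEWORK_MAP = {
    "rdp_open_port": {
        "nist_csf": "PR.AC - Identity Management and Access Control",
        "mitre_attack": "T1021.001 - Remote Services: Remote Desktop Protocol",
    },
    "unpatched_cve": {
        "nist_csf": "ID.RA - Risk Assessment",
        "mitre_attack": "T1190 - Exploit Public-Facing Application",
    },
}

def framework_coverage(signal):
    """Return the framework references a signal maps to, if any."""
    return SIGNAL_FRAMEWORK_MAP.get(signal, {})
```

Laying the mapping out this way makes it easy to audit how much ground each signal covers and to spot framework areas no signal currently measures.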
Another technique we employ is running statistical validation studies of our signals. Running these studies not only enables us to understand the correlation these signals have to insurable cyber incidents, but also shows how the signals evolve over time and identifies when a signal needs further refinement or updating to reflect the current threat landscape.
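One common metric in this kind of validation study is lift: the incident rate among companies flagged by a signal, divided by the overall incident rate. The sketch below is a simplified illustration of the idea, not CyberCube’s validation methodology; a real study would also test statistical significance and control for confounders.

```python
def signal_lift(records):
    """Estimate a signal's lift against observed incidents.

    records: list of (flagged, had_incident) boolean pairs, one per company.
    Lift = incident rate among flagged companies / overall incident rate.
    Lift well above 1.0 suggests the signal separates risk; near 1.0 it
    adds little. Simplified validation sketch."""
    overall_rate = sum(1 for _, hit in records if hit) / len(records)
    flagged_hits = [hit for flag, hit in records if flag]
    if not flagged_hits or overall_rate == 0:
        return None  # signal never fired, or no incidents to compare against
    flagged_rate = sum(flagged_hits) / len(flagged_hits)
    return flagged_rate / overall_rate

# Hypothetical portfolio: 4 flagged companies (2 incidents),
# 6 unflagged companies (1 incident)
records = (
    [(True, True), (True, True), (True, False), (True, False)]
    + [(False, False)] * 5
    + [(False, True)]
)
lift = signal_lift(records)
```

Here the flagged group’s incident rate (0.5) is well above the portfolio base rate (0.3), giving a lift of roughly 1.67, the kind of separation a validation study looks for before a signal is trusted in underwriting.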
A holistic approach to signals
Insurers must use cyber risk analytics that take a holistic approach to signals and round out an underwriter’s assessment and view of risk. Underwriters need a cyber data partner who understands the cyber threat landscape, the cyber insurance industry, and the challenges underwriters face in making effective underwriting decisions.
Not only should an analytics partner contextualize the right cyber security data in relation to the cyber insurance market, but they should also justify the data framework they implement, including their signal development process. This gives underwriters a full view of an organization’s risk profile so they can make informed choices that grow their bottom line.
The signals underwriters use should also have statistical significance, which Account Manager, CyberCube’s single-risk analytics platform, provides to show an organization’s cyber exposure and cyber security posture. Download our free report to learn more: Security Indicators: Generating Lift with Critical Factors.