This article was co-authored by Alejandro Sauter and Trishala Neeraj.
In late August, Palo Alto Networks analyzed publicly available datasets and reported that “80% of exploits publish faster than CVEs”. In other words, threat actors are often able to abuse vulnerabilities before they are analyzed, scored, and given a record in NIST's National Vulnerability Database. Patching quickly has always been important, but when exploits reach attackers before defenders can even identify vulnerabilities in a standardized way, the pressure to patch quickly and reduce exposure time only grows.
On October 20th we learned that Google had released a patch for Chromium-based browsers (Chrome, Edge, Opera, Vivaldi, Yandex, and others) addressing a zero-day vulnerability that had been discovered being exploited in the wild. The vulnerability, now tracked as CVE-2020-15999, allowed attackers to run malicious code within the browser. Google also disclosed it had found a second zero-day, now tracked as CVE-2020-17087 and affecting Windows 7 through 10, which allowed attackers to escape the browser’s sandbox and thus run that malicious code on Windows itself. On October 30th, after giving Microsoft one week to release a patch, Google fully disclosed its findings on both vulnerabilities alongside proof-of-concept exploit code. Microsoft then released a patch for the Windows vulnerability on the November 10th “Patch Tuesday”. Since attackers needed the Chromium vulnerability first, followed by the Windows vulnerability, being proactive about updating browsers as soon as possible was key to reducing exposure time.
To answer the question of how exposed organizations were, and whether they patched, we used CyberCube’s data lake, specifically the data behind the “Web Traffic Signals” in our Account Manager product. This data consists of web traffic from devices around the world, from which we can derive, among other things, the browser and OS used by end users. Before we go into the footprint analysis, a few assumptions and details:
After applying the aforementioned filters, we analyzed a set of 18,797 organizations with observed Internet traffic. From there, we divided the analysis into a few subsections, each answering a different question:
How many organizations were vulnerable in the “pre-patch” period?
This question seeks to establish the “initial state”, as we recognize that not all organizations run the vulnerable browsers and OS. For this, we computed, for each organization, the proportion of observations involving both vulnerable products over the organization’s entire observed traffic (where a higher proportion is worse).
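As an illustration, here is a minimal sketch of how such a per-organization proportion could be computed. The DataFrame, column names, and product lists below are hypothetical stand-ins, not the actual Web Traffic Signals schema.

```python
import pandas as pd

# Hypothetical web-traffic observations: one row per (organization, browser, OS)
# combination with a count of observations. Names and values are illustrative only.
traffic = pd.DataFrame({
    "org_id":       ["org_a", "org_a", "org_b", "org_b"],
    "browser":      ["Chrome", "Firefox", "Edge (Chromium)", "Safari"],
    "os":           ["Windows 10", "Windows 10", "Windows 7", "macOS"],
    "observations": [800, 200, 300, 700],
})

CHROMIUM_BROWSERS = {"Chrome", "Edge (Chromium)", "Opera", "Vivaldi", "Yandex"}
AFFECTED_WINDOWS = {"Windows 7", "Windows 8", "Windows 8.1", "Windows 10"}

# An observation counts as "vulnerable" only when BOTH products are in play:
# a Chromium-based browser AND an affected Windows version.
traffic["vulnerable"] = (
    traffic["browser"].isin(CHROMIUM_BROWSERS) & traffic["os"].isin(AFFECTED_WINDOWS)
)

# Proportion of vulnerable observations over each organization's total traffic
# (a higher proportion is worse).
total = traffic.groupby("org_id")["observations"].sum()
vulnerable = traffic[traffic["vulnerable"]].groupby("org_id")["observations"].sum()
proportion = (vulnerable.reindex(total.index, fill_value=0) / total).rename("vulnerable_proportion")

print(proportion)  # org_a: 0.8, org_b: 0.3
```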
Although a majority of organizations had a low vulnerable footprint, we observed a sizeable share (17.2%) that relied heavily on these two (at the time vulnerable) technologies.
How many organizations were vulnerable in the “post-patch” period?
This question looks into who remained vulnerable (or not) between the Chromium patch (released October 20th) and the end of the month (the 31st). For this, we generated the same proportion of observations involving both vulnerable products over each organization’s entire traffic.
In this “post-patch” period, we actually observed quite a similar number of organizations in each of the proportion buckets. However, this alone does not tell the whole story of whether organizations patched.
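For illustration, here is a minimal sketch of how per-organization proportions might be grouped into buckets and compared across the two periods; the proportions and bucket edges are made up for the example and are not the actual analysis.

```python
import pandas as pd

# Hypothetical per-organization vulnerable proportions for the two periods,
# e.g. the output of the earlier per-organization computation.
pre = pd.Series([0.05, 0.12, 0.45, 0.80, 0.33], name="pre_patch")
post = pd.Series([0.04, 0.15, 0.50, 0.75, 0.30], name="post_patch")

# Group proportions into coarse buckets and compare the two distributions.
bins = [0, 0.25, 0.50, 0.75, 1.00]
labels = ["0-25%", "25-50%", "50-75%", "75-100%"]

buckets = pd.DataFrame({
    "pre_patch":  pd.cut(pre, bins=bins, labels=labels, include_lowest=True).value_counts().sort_index(),
    "post_patch": pd.cut(post, bins=bins, labels=labels, include_lowest=True).value_counts().sort_index(),
})
print(buckets)
```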
How many organizations improved their proportion of vulnerable traffic observed?
After looking at the similarity between the “pre-patch” and “post-patch” numbers, we discovered that these similar outcomes were the consequence of some organizations improving their proportion of observed vulnerable traffic while others actually worsened it.
The changes in the proportion of observed vulnerable traffic can be further explained by two drill-downs, illustrated with a short code sketch below:
How many organizations improved or worsened their proportion of observed vulnerable traffic by more than 15%?
This tells us that, within the halves of organizations improving and worsening, a majority (almost 80%) did not experience big changes, whereas a small subset experienced large swings in their proportions. This was partly due to similarly large swings in the overall traffic detected from these organizations in the respective time periods, which led to the second drill-down:
How many organizations improved or worsened their proportion of observed vulnerable traffic AND didn’t experience changes of more than 15% to their overall traffic?
The second drill-down once again reinforced the idea that the majority of organizations didn’t experience large changes to their proportion of vulnerable traffic and that only a small subset did. Nonetheless, even after filtering out organizations with large changes in overall traffic, we still observed close to a 50/50 split between those who improved and those who worsened.
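To make the drill-downs concrete, here is a minimal sketch of the logic. It assumes the 15% thresholds apply as percentage-point swings in the proportion and as relative changes in total traffic; the column names and numbers are hypothetical, not the actual analysis.

```python
import pandas as pd

# Hypothetical per-organization summary of the two periods (illustrative values).
orgs = pd.DataFrame({
    "org_id":          ["org_a", "org_b", "org_c"],
    "pre_proportion":  [0.80, 0.30, 0.50],
    "post_proportion": [0.60, 0.35, 0.48],
    "pre_traffic":     [1000, 2000, 5000],
    "post_traffic":    [1050, 3500, 5100],
})

THRESHOLD = 0.15  # "large" change, applied to both drill-downs

# Drill-down 1: direction and size of the change in vulnerable proportion.
orgs["delta"] = orgs["post_proportion"] - orgs["pre_proportion"]
orgs["direction"] = orgs["delta"].map(lambda d: "improved" if d < 0 else "worsened")
orgs["large_swing"] = orgs["delta"].abs() > THRESHOLD

# Drill-down 2: exclude organizations whose overall traffic itself changed by
# more than 15%, since that alone can move the proportion.
orgs["traffic_change"] = (orgs["post_traffic"] - orgs["pre_traffic"]).abs() / orgs["pre_traffic"]
stable = orgs[orgs["traffic_change"] <= THRESHOLD]

print(orgs[["org_id", "direction", "large_swing"]])
print(stable["direction"].value_counts())
```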
The consistency of this split was an interesting finding for us. In the case of improvements, patching may well be happening, but improvements could also reflect older devices being phased out, since our signal relies on the combination of browser and OS. In the case of worsening, a number of explanations exist, including upgrading from an extremely old OS (e.g. from Windows Vista to Windows 7, which moves a device into the affected population), an increase in web traffic from vulnerable devices (e.g. more browsing), an increase in the number of vulnerable devices on a network (e.g. a team or department suddenly returning to work), or other unexplored factors.
Fun Facts
The original “in the wild” attacks may have been performed by the same group that did the research to find these zero-days, which would likely imply high-cost, targeted operations (i.e. an APT). However, now that proof-of-concept code has been released, the attack is accessible to a wider pool of attackers (e.g. cybercriminals). Depending on how quickly users patch, they may or may not be subject to attacks from this wider pool (even today!). The silver lining is that patching Chrome, for which a patch is available and easy to apply, would negate this particular kill chain.
It is also worth noting that a similar situation occurred in March 2019, when two zero-days targeting Chrome and Windows were also discovered being used in conjunction. As the most popular OS and browser combination, these technologies are a prime target, and attackers will likely continue to research vulnerabilities in both, so we should remain mindful of the potential aggregation these two technologies represent when used together.