Analyzing the footprint of the recently discovered Chrome and Windows zero-days

This article was co-authored by Alejandro Sauter and Trishala Neeraj.

In late August, Palo Alto Networks stated, based on its analysis of publicly available datasets, that “80% of exploits publish faster than CVEs”. In other words, threat actors are able to abuse vulnerabilities before they are analyzed and assigned a score (and record number) by NIST. Patching quickly has always been important, but if exploits are available to attackers before defenders can identify vulnerabilities in a standardized way, the pressure to patch promptly in order to reduce exposure time only increases.

On October 20th, we learned that Google had released a patch for Chromium-based browsers (Chrome, Edge, Opera, Vivaldi, Yandex, and others). The patch addressed a zero-day vulnerability that had been discovered while being exploited in the wild. The vulnerability, now tracked as CVE-2020-15999, allowed attackers to run malicious code within the browser. Google also disclosed it had found a second zero-day, now tracked as CVE-2020-17087 and affecting Windows 7 through 10, which allowed attackers to escape the browser’s secure container and thus run the malicious code on Windows itself. On October 30th, after giving Microsoft one week to release a patch, Google fully disclosed its findings on both vulnerabilities alongside proof-of-concept exploit code. Microsoft then released a patch for the Windows vulnerability on the November 10th “Patch Tuesday”. Since attackers needed the Chromium vulnerability first, followed by the Windows vulnerability, proactively updating browsers as soon as possible was key to reducing exposure time.

Who Was Affected?

To answer this question, we used CyberCube’s data lake, specifically the data behind the “Web Traffic Signals” in our Account Manager product. This data consists of web traffic from devices around the world, from which we can derive, among other things, the browser and OS used by end users. Before we go into the footprint analysis, a few assumptions and details (a sketch of the filtering logic follows the list):

  • Only major browsers based on the affected Chromium version were searched for
    • Chrome up to 86.0.4240.111
    • Edge up to 86.0.622.51
    • Vivaldi up to 3.4 (2066.86)
    • Opera up to 72.0.3815.186
    • Yandex up to 20.9
  • All versions of Windows 7 through 10 (desktop only) were searched for
  • We analyzed the month of October 2020, as the patch was released that month and we know attackers had been exploiting the vulnerabilities before then
    • Assumed the “pre-patch” period is Oct 01 - 19
    • Assumed the “post-patch” period is Oct 20 - 31
    • We discarded observations where traffic does not exist in both periods
    • We discarded organizations where < 100 observations exist for the month
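To make these filters concrete, below is a minimal sketch of how they could be applied, assuming a hypothetical observations DataFrame with organization, timestamp, browser, browser_version, and os columns; this is an illustration, not the actual schema or code behind our data lake.

```python
import pandas as pd

# Hypothetical export of October 2020 "Web Traffic Signals" observations:
# one row per observed request, with organization, timestamp, browser,
# browser_version and os columns (illustrative schema, not the real one).
observations = pd.read_parquet("web_traffic_signals_oct_2020.parquet")
observations["timestamp"] = pd.to_datetime(observations["timestamp"])

# Assumed "pre-patch" (Oct 01-19) and "post-patch" (Oct 20-31) periods.
observations["period"] = "pre"
observations.loc[observations["timestamp"].dt.day >= 20, "period"] = "post"

# Keep organizations with traffic in both periods...
periods_per_org = observations.groupby("organization")["period"].nunique()
orgs_in_both = periods_per_org[periods_per_org == 2].index

# ...and with at least 100 observations for the month.
monthly_counts = observations.groupby("organization").size()
orgs_with_enough = monthly_counts[monthly_counts >= 100].index

filtered = observations[
    observations["organization"].isin(orgs_in_both)
    & observations["organization"].isin(orgs_with_enough)
]
```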

Footprint Analysis

After applying the filters above, we analyzed a set of 18,797 organizations routing Internet traffic. From there, we divided the analysis into a few subsections, each answering a different question:

How many organizations were vulnerable in the “pre-patch” period? 

This question seeks to clarify the “initial state”, as we recognize that not all organizations are running the vulnerable browsers and OS. For each organization, we computed the proportion of its traffic coming from observations using both vulnerable products, where a higher proportion is worse (a sketch of this calculation follows the results below).

  • 68.5% of organizations (12,879) had a proportion of <50%, of which 225 had a proportion of 0% 
  • 14.4% of organizations (2,698) had a proportion between 50-75%
  • 11.6% of organizations (2,182) had a proportion between 75-95%
  • 3.4% of organizations (639) had a proportion between 95-99%
  • 2.2% of organizations (404) had a proportion of >99%

Although a majority of organizations had a smaller vulnerable footprint, we observed a sizeable share (17.2%) relying heavily on these two then-vulnerable technologies.
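As an illustration of how the per-organization proportion and the buckets above could be computed, here is a minimal sketch. It builds on the hypothetical "filtered" DataFrame from the earlier snippet and assumes boolean "vulnerable_browser" and "vulnerable_os" columns flagging observations whose browser version and OS fall in the affected ranges; it is not the actual implementation.

```python
import pandas as pd

# Assumed boolean columns marking whether the observed browser version and OS
# fall in the affected ranges (the version-matching logic is omitted here).
filtered = filtered.assign(
    is_vulnerable=filtered["vulnerable_browser"] & filtered["vulnerable_os"]
)

# Proportion of each organization's pre-patch traffic that is vulnerable.
pre = filtered[filtered["period"] == "pre"]
proportion = pre.groupby("organization")["is_vulnerable"].mean()

# Bucket organizations the same way the results above are reported.
buckets = pd.cut(
    proportion,
    bins=[-0.001, 0.50, 0.75, 0.95, 0.99, 1.0],
    labels=["<50%", "50-75%", "75-95%", "95-99%", ">99%"],
)
print(buckets.value_counts(normalize=True).mul(100).round(1))
```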

How many organizations were vulnerable in the “post-patch” period?

This question looks into who remained vulnerable (or not) between the Chromium patch (released October 20th) and the end of the month (31st). For this, we generated a proportion of observations using both vulnerable products over the entire traffic of the organization.

  • 68.6% of organizations (12,898) had a proportion of <50%, of which 281 had a proportion of 0%
  • 14.4% of organizations (2,707) had a proportion between 50-75%
  • 11.8% of organizations (2,209) had a proportion between 75-95%
  • 2.9% of organizations (552) had a proportion between 95-99%
  • 2.4% of organizations (446) had a proportion of >99%

In this “post-patch” period, we observed quite a similar number of organizations in each proportion bucket. However, this alone does not tell the entire story about whether organizations patched.

How many organizations improved their proportion of vulnerable traffic observed?

After looking at the similarity between the “pre-patch” and “post-patch” numbers, we discovered that these similar outcomes were the consequence of some organizations improving while others worsened their proportion of vulnerable traffic observed.

  • 48.2% of organizations (9,054) improved their proportion of vulnerable traffic observed in the “post-patch” period compared to the “pre-patch” period
  • 50.4% of organizations (9,472) worsened their proportion of vulnerable traffic observed in the “post-patch” period compared to the “pre-patch” period
  • 1.4% of organizations (271) experienced a 0% change 

The changes in proportion of vulnerable traffic observed can be further explained by two drill-downs:

How many organizations improved or worsened their proportion of vulnerable traffic observed by more than 15%?
    1. 78.7% of organizations (14,800) experienced a change of less than +/-15%
    2. 10.8% of organizations (2,024) worsened by more than 15%
    3. 10.5% of organizations (1,974) improved by more than 15%

This tells us that within both halves (improving and worsening), a majority (almost 80%) did not experience big changes, whereas a small subset experienced large swings in their proportions. This was partly due to similarly large swings in the overall traffic detected from these organizations in the respective time periods, which led to the second drill-down:

How many organizations improved or worsened their proportion of vulnerable traffic observed AND didn’t experience changes of more than 15% to their overall traffic?
    1. Important because large changes in overall traffic make it harder to identify true improvement or worsening by distorting the overall footprint of an organization
    2. Of our original 18,797 organizations, 70.3% of organizations (13,211) had a change of over +/-15% in overall traffic between the two time periods
    3. Of the remaining 5,586 organizations:
      1. 48.4% of organizations (2,702) experienced improvement, but only 6.1% (340) improved by >15%
      2. 50.9% of organizations (2,843) experienced worsening, but only 6.3% (351) worsened by >15%
      3. 0.7% of organizations (41) experienced no change

The second drill-down once again reinforced the idea that the majority of organizations didn’t experience large changes to their proportion of vulnerable traffic and only a small subset did. Nonetheless, even after filtering out organizations with large changes in overall traffic, we still observed close to a 50/50 split between those who improved and those who worsened (a sketch of this comparison appears below).
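For completeness, here is a minimal sketch of how the change in vulnerable-traffic proportion and the overall-traffic filter behind both drill-downs could be computed. It builds on the hypothetical DataFrames from the earlier snippets, treats the 15% threshold on the proportion as 15 percentage points, and is not the actual implementation.

```python
# Vulnerable-traffic proportion and total observations per organization and period.
per_org = (
    filtered.groupby(["organization", "period"])["is_vulnerable"]
    .agg(proportion="mean", total="size")
    .unstack("period")
)

# Change in proportion (negative = improved) and relative change in overall traffic.
delta_proportion = per_org[("proportion", "post")] - per_org[("proportion", "pre")]
traffic_change = (
    per_org[("total", "post")] - per_org[("total", "pre")]
) / per_org[("total", "pre")]

# First drill-down: who moved by more than +/-15 points (assumed to mean
# 15 percentage points of the vulnerable-traffic proportion)?
large_worsening = (delta_proportion > 0.15).sum()
large_improvement = (delta_proportion < -0.15).sum()

# Second drill-down: restrict to organizations whose overall traffic changed by
# no more than +/-15% between the periods, then repeat the comparison.
stable = delta_proportion[traffic_change.abs() <= 0.15]
improved = (stable < 0).sum()
worsened = (stable > 0).sum()
unchanged = (stable == 0).sum()
```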

The consistency of this split was an interesting finding for us. In the case of improvements, we could assume that patching is happening, but the change could also be due to older devices being phased out, since we relied on a combination of both browser and OS. In the case of worsening, a number of explanations exist, including upgrading from a much older OS (e.g. from Windows Vista to Windows 7), an increase in web traffic from vulnerable devices (e.g. more browsing), an increase in the number of vulnerable devices on a network (e.g. a team or department suddenly returning to work), or other unexplored explanations.

Fun Facts

  • Only a single organization went from being 100% vulnerable to 100% patched
  • The biggest change in traffic between the two time periods was 437,000%
  • The largest number of observations from a single organization in a month was 875,016,341

Closing Thoughts

The original “in the wild” attacks may have been performed by the same group that did the research to find these zero-days, which would likely imply high-cost, targeted operations (i.e. an APT). However, now that proof-of-concept code has been released, the attack is accessible to a wider pool of attackers (e.g. cybercriminals). Depending on how quickly users patch, they may or may not be subject to attacks from this wider pool (even today!). The silver lining is that patching Chrome (for which a patch is available and easy to apply) breaks this particular kill chain.
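As a simple illustration of the version cut-off used in this analysis, the hypothetical check below flags an observed Chrome version that falls within the affected range from our assumptions (up to 86.0.4240.111); the function name and parsing approach are for illustration only.

```python
def chrome_version_affected(version: str, last_affected: str = "86.0.4240.111") -> bool:
    """Return True if the observed Chrome version falls within the affected range."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(version) <= as_tuple(last_affected)

# Example: an organization still running an older build remains exposed to
# CVE-2020-15999 until the browser is updated.
print(chrome_version_affected("86.0.4240.75"))   # True  -> within the affected range
print(chrome_version_affected("86.0.4240.183"))  # False -> updated past the affected range
```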

It is also worth noting that a similar situation happened in March 2019, when two zero-days targeting Chrome and Windows were also discovered and used in conjunction. As the most popular OS and browser combination, these technologies are a prime target, and we will likely continue to see attackers research vulnerabilities in both. We should therefore remain mindful of the potential aggregation these two technologies represent when used together.
