We have recently opened up contributions to this blog to start-ups accelerated by our Shake’Up project. Hackuity rethinks vulnerability management with a platform that collects, standardizes and orchestrates automated and manual security assessment practices and enriches them with Cyber Threat Intelligence data sources, technical context and business impact. Hackuity enables you to leverage your existing vulnerability detection arsenal, prioritize the most important vulnerabilities, save time on low-value tasks, reduce remediation costs, gain a comprehensive and continuous view of the company’s security posture, and meet compliance obligations.
What are we talking about?
ISO 27005 defines a vulnerability as “a weakness of an asset or group of assets that can be exploited by one or more cyber threats where an asset is anything that has value to the organization, its business operations and their continuity, including information resources that support the organization’s mission”. For the SANS Institute, vulnerability management is “the process in which vulnerabilities in IT are identified and the risks of these vulnerabilities are evaluated. This evaluation leads to correcting the vulnerabilities and removing the risk or a formal risk acceptance by the management of an organization”. Over time, vulnerability management has become a fundamental practice in cybersecurity, and virtually all industry professionals would now agree that it is an essential process for minimizing the company’s attack surface.
Gartner’s Vulnerability Management Guidance Framework (source: https://blogs.gartner.com/augusto-barros/2019/10/25/new-vulnerability-management-guidance-framework/)
Nowadays, vulnerability management is integrated into all the major security frameworks, standards, sector regulations, guides and good security practices (ISO, PCI-DSS, GDPR, Basel agreements, French LPM, NIS, etc.) and is even a regulatory requirement in some contexts. Every “good” corporate security policy includes a substantial chapter on the topic. Many would consider it a necessary evil.
Vulnerabilities: the state of the threat
However, in 2019, according to a study conducted by the Ponemon Institute[1], “60% of security incidents were [still] the consequence of exploiting a vulnerability that is known but not yet corrected by companies”. To illustrate the current extent of the phenomenon, let’s consider ransomware, the main cyber threat of 2020 and probably of 2021. Although ransomware is generally spread through user-initiated actions, such as clicking on a malicious link in a spam e-mail or visiting a compromised website, a large proportion of ransomware also exploits software vulnerabilities. If we look at the top five most virulent ransomware families of 2020 as ranked by Intel 471[2], we can see that their kill chains all exploit known vulnerabilities (CVEs).
| Ransomware name | First known occurrence | Known exploited CVE | CVE publication date | Patch / workaround | CVSS v2.0 score |
|---|---|---|---|---|---|
| Maze (aka ChaCha) | 05-2019 | CVE-2018-15982 | 18/01/2019 | 12-2018 | 10.0 |
| | | CVE-2018-4878 | 06/02/2018 | 02-2018 | 7.5 |
| | | CVE-2019-11510 | 08/05/2019 | 04-2019 | 7.5 |
| | | CVE-2018-8174 | 05/09/2018 | 08-2018 | 7.6 |
| | | CVE-2019-19781 | 27/12/2019 | 12-2019 | 7.5 |
| REvil (aka Sodinokibi) | 04-2019 | CVE-2018-8453 | 10/10/2018 | 10-2018 | 7.2 |
| | | CVE-2019-11510 | 08/05/2019 | 05-2019 | 7.5 |
| | | CVE-2019-2725 | 26/04/2019 | 04-2019 | 7.5 |
| Netwalker | 09-2019 | CVE-2015-1701 | 21/04/2015 | 05-2015 | 7.2 |
| | | CVE-2017-0213 | 12/05/2017 | 05-2017 | 1.9 |
| | | CVE-2020-0796 | 12/03/2020 | 03-2020 | 7.5 |
| | | CVE-2019-1458 | 10/12/2019 | 12-2019 | 7.2 |
| Ryuk | 08-2018 | CVE-2013-2618 | 05/06/2014 | *-2014 | 4.3 |
| | | CVE-2017-6884 | 06/04/2017 | 04-2017 | 9.0 |
| | | CVE-2018-8389 | 15/08/2018 | 08-2018 | 7.6 |
| | | CVE-2018-12808 | 29/08/2018 | 08-2018 | 7.5 |
| | | CVE-2020-1472 | 17/08/2020 | 08-2020 | 9.3 |
| DopplePaymer | 04-2019 | CVE-2019-1978 | 05/11/2019 | *-2019 | 5.0 |
| | | CVE-2019-19781 | 27/12/2019 | 01-2020 | 7.5 |
Source: Hackuity & National Vulnerability Database (https://nvd.nist.gov/)
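As a side note, the publication dates and CVSS scores shown in the table above can be retrieved programmatically. The sketch below, in Python with the `requests` library, shows one way to query the public NVD CVE API (version 2.0) for a given CVE. The field names reflect our reading of that API’s JSON schema, and a real script would also need to handle the API’s rate limits; treat it as a starting point rather than production code.

```python
import requests

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def fetch_cve(cve_id: str) -> dict:
    """Fetch a single CVE record from the NVD CVE API 2.0."""
    resp = requests.get(NVD_API, params={"cveId": cve_id}, timeout=30)
    resp.raise_for_status()
    vulns = resp.json().get("vulnerabilities", [])
    if not vulns:
        raise ValueError(f"{cve_id} not found in the NVD")
    return vulns[0]["cve"]

def summarize(cve_id: str) -> dict:
    """Extract the fields used in the table above: publication date and CVSS base score."""
    cve = fetch_cve(cve_id)
    metrics = cve.get("metrics", {})
    # Older CVEs carry CVSS v2 metrics; more recent ones may only expose v3.x.
    candidates = (
        metrics.get("cvssMetricV2")
        or metrics.get("cvssMetricV31")
        or metrics.get("cvssMetricV30")
        or []
    )
    score = candidates[0]["cvssData"]["baseScore"] if candidates else None
    return {"cve": cve_id, "published": cve["published"], "baseScore": score}

if __name__ == "__main__":
    # Without an API key the NVD enforces strict rate limits, so keep queries sparse.
    for cve_id in ["CVE-2019-19781", "CVE-2020-1472"]:
        print(summarize(cve_id))
```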
It is worth noting that the vulnerabilities in the table above had often been referenced in the NVD by the time the ransomware first appeared, sometimes for several years, and that patches or workarounds were already available in most cases. A recent Check Point[3] study confirms that the oldest vulnerabilities remain the most exploited: in mid-2020, more than 80% of the identified cyberattacks used a vulnerability published before 2017, and more than 20% of these attacks even exploited a vulnerability that had been known for more than 7 years.
This highlights the importance – even today – of rapidly installing security patches as a defense mechanism to minimize cyber risks. It is therefore not surprising that vulnerability management – one of the oldest practices in cybersecurity – remains, according to Wavestone[4], one of the major CISO challenges for 2021. Does this mean that we should try to correct all the vulnerabilities? Let’s go back in time.
“Vulnerability Assessment” vs. “Vulnerability Management”
When they first appeared on the market at the end of the 1990s, vulnerability management solutions worked much like antivirus software: the objective was to detect as many potential threats as possible. They were more commonly referred to as “vulnerability scanners”.
At the time, the volume of vulnerabilities was relatively low compared to today: in 2000, the NVD identified about 1,000 new vulnerabilities over the year, against more than 18,000 in 2020.
A comprehensive, manual treatment of vulnerabilities was still possible at that time. Scanners provided a list of vulnerabilities, IT teams analyzed their relevance in the business context, and a report was sent to business managers. Once the report was approved, administrators would fix the vulnerabilities and re-test to ensure that patches were properly implemented.
Source: National Vulnerability Database (https://nvd.nist.gov/)
Over the next two decades, the number of discovered vulnerabilities first increased steadily, then started to skyrocket in 2017, a trend that continues today. In 2020, a record of more than 18,000 new vulnerabilities was published by the NIST. But no, code quality is not worse than ever! There are several reasons behind the growing number of vulnerabilities being disclosed:
- Innovation and the accelerated digitization of business lead to an increase in published hardware and software products. In 2010, the NIST recorded 22,188 new entries in its CPE repository, including 1,332 new products and 406 publishers. In 2020, 324,810 new entries (+1,460%), 35,794 new products (+2,690%) and 6,060 new publishers (+1,490%) appeared in the repository.
- Demand for faster time-to-market is driving vendors to shorten development cycles to release and sell products faster, even if it means saving on resources needed for quality assurance and security testing.
- Cybercrime has become a lucrative business. A growing number of vulnerability discoveries are now attributed to cybercriminals seeking new tools to support their attacks.
- At the same time, the number of experts and independent organizations involved in the research and disclosure of vulnerabilities is increasing. The democratization and industrialization of bug bounty programs have played a significant part in this.
- And finally, with rare exceptions such as GDPR, in the absence of adequate legislation and regulations to protect consumer rights in the event of software vulnerabilities, the industry has no incentive to invest in safer products or to take responsibility for the damage caused.
However, the problem is not only the growing number of vulnerabilities identified in the NVD or other repositories. With the advent of ultra-mobility, home office, cloud computing, social media and IoT, but also the convergence between IT and OT, information systems have continued to become more complex, to expand, to open up and to multiply their suppliers, creating just as many potential new entry points for cybercriminals.
At the same time, companies are deploying and operating a vulnerability detection arsenal that keeps growing and has become more mature, even commoditized, in recent years:
- Penetration tests & red teams
- Vulnerability scanners: across the entire external and/or internal IT estate
- Vulnerability watch
- SAST, DAST & SCA: often integrated directly into development pipelines
- Bug bounty campaigns
All these detection practices are complementary and generally stacked in a best-of-breed approach to evaluate specific parts of the IS or SDLC. Unfortunately, it is often only once this arsenal is in place that the problems become obvious (non-exhaustive list):
- The heterogeneity of the deliverables’ formats (pentest reports in PDF or Excel files, scan results in the tool’s own console, vulnerabilities on the bug bounty platform, etc.) often forces the company to adopt a siloed vulnerability management approach. The same goes for vulnerability scores, which end up being a patchwork of CVSS in its multiple versions, proprietary scales and a clever mix of the two (a minimal normalization sketch is given after this list).
- This results in the inability to prioritize remediation efforts globally, due to a fragmented and heterogeneous view of the vulnerability backlog.
- Volumes of data have become far too large to be processed manually: it is not uncommon for a company that performs authenticated scans on its estate to see the number of vulnerabilities exceed several million entries in the scanner’s console.
- Coordinating remediation actions is difficult: identifying the asset owner and the person responsible for an action, exchanging e-mails, monitoring progress, reporting in Excel, etc.
- The teams in charge of remediation grow frustrated, as they lack factual reporting showing the impact of their remediation efforts on the company’s overall security posture.
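To make the consolidation problem concrete, here is a minimal sketch (in Python) of what normalizing findings from heterogeneous sources into a single schema could look like. The `Finding` class and the input field names are hypothetical and chosen only for illustration; a real platform also has to handle deduplication, asset correlation and the harmonization of scoring scales.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Finding:
    """A single, source-agnostic vulnerability finding (hypothetical schema)."""
    asset: str                  # hostname, IP, repository, URL, ...
    title: str
    source: str                 # "scanner", "pentest", "bug-bounty", "sast", ...
    cve_id: Optional[str]       # not every finding maps to a CVE
    cvss: Optional[float]       # harmonized severity, when one is available
    detected_on: date

def from_scanner_row(row: dict) -> Finding:
    """Convert one row from a (hypothetical) scanner CSV export."""
    return Finding(
        asset=row["host"],
        title=row["plugin_name"],
        source="scanner",
        cve_id=row.get("cve") or None,
        cvss=float(row["cvss_base_score"]) if row.get("cvss_base_score") else None,
        detected_on=date.fromisoformat(row["scan_date"]),
    )

def from_pentest_entry(entry: dict) -> Finding:
    """Convert one entry transcribed from a pentest report (hypothetical format)."""
    return Finding(
        asset=entry["target"],
        title=entry["finding"],
        source="pentest",
        cve_id=entry.get("cve"),
        cvss=entry.get("severity_cvss"),
        detected_on=date.fromisoformat(entry["report_date"]),
    )
```

Once every source lands in the same structure, deduplicating, prioritizing and reporting across the whole arsenal becomes a data problem rather than a copy-and-paste exercise.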
Facing these problems, companies have no choice but to implement processes that are often costly because they rely on manual actions, to develop ad hoc tooling, or to assemble bits and pieces of solutions gleaned here and there. The lack of automation is all the more absurd since it generally ties up rare and expensive cybersecurity experts on low-value tasks such as compiling data in Excel, endlessly searching for the right stakeholder or tracking e-mail threads.
In its study “Cost and consequences of gaps in vulnerability management responses” (2019), the Ponemon Institute estimates that companies with more than 10,000 employees spent an average of more than 21,000 hours (or nearly 12 FTEs) in 2019 on the prevention, detection and treatment of vulnerabilities. This represents a total of more than $1M, for very disappointing value for money.
The “patching paradox”
In theory, the best way to stay protected is to keep each system up to date by correcting each new vulnerability as soon as it is identified. In reality, this task has become impossible: the volume of vulnerabilities is too large, human and financial resources are too limited, legacy systems persist, fixes are not always available in time, and operational constraints slow down patch deployment.
Ultimately, no matter how large or small an organization may be, it will never have enough human or financial resources to address all of its vulnerabilities. In fact, the mistaken belief that dedicating more people to fixing vulnerabilities equals better security is known in the industry as the “patching paradox”.
To reduce the pressure to increase staff at a time when qualified security experts are in short supply, and to prevent vulnerability management from becoming a frantic and hopeless race to fix ever more vulnerabilities, organizations today need to determine which of their vulnerabilities should be addressed first.
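As a purely illustrative example, and not Hackuity’s actual algorithm, the sketch below captures the spirit of risk-based prioritization: instead of patching everything, a CVSS base score is weighted by two contextual signals, known exploitation and asset criticality, to decide what to fix first. The weights, field names and sample values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    cvss: float             # CVSS base score, 0.0 to 10.0
    exploit_known: bool     # e.g. flagged by a Cyber Threat Intelligence feed
    asset_criticality: int  # 1 (low) to 3 (business-critical), set by the asset owner

def risk_score(v: Vulnerability) -> float:
    """Toy prioritization: severity amplified by exploitability and business context."""
    exploit_factor = 1.5 if v.exploit_known else 1.0
    return v.cvss * exploit_factor * v.asset_criticality

backlog = [
    Vulnerability("CVE-2020-1472", cvss=9.3, exploit_known=True, asset_criticality=3),
    Vulnerability("CVE-2017-0213", cvss=1.9, exploit_known=True, asset_criticality=1),
    Vulnerability("CVE-2018-12808", cvss=7.5, exploit_known=False, asset_criticality=2),
]

# Fix the highest-risk items first instead of trying to patch everything at once.
for v in sorted(backlog, key=risk_score, reverse=True):
    print(f"{v.cve_id}: risk = {risk_score(v):.1f}")
```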
Having covered in this first article the state of the threat and the current issues related to vulnerability management, we will look in a second article at the new approaches to consider in order to manage vulnerabilities better.
[1] Ponemon Institute – Cost and consequences of gaps in vulnerability management responses – 2019
[2] https://intel471.com/blog/ransomware-as-a-service-2020-ryuk-maze-revil-egregor-doppelpaymer/
[3] https://www.checkpoint.com/downloads/resources/cyber-attack-trends-report-mid-year-2020.pdf
[4] https://www.wavestone.com/fr/insight/radar-rssi-quelles-priorites-2021/