Leveraging a data fabric for security to enable holistic cyber risk management
Jun 25, 2024
One of the critical components of a data fabric is robust data ingestion. Done right, it provides correlated insights across all in-house and third-party data sources, giving CISOs an unprecedented overview of their environment in real time.
Guest contribution by Raanan Raz, Co-Founder and CEO of Avalor, a Zscaler Company
CISOs are often tasked with the formidable challenge of quantifying risk for boards and senior leadership. This task becomes even more complex when dealing with data scattered across tens or even hundreds of tools. Enterprises suffer from security data sprawl, rendering the data nearly unusable, for a number of reasons:
- A proliferation of distributed and siloed data repositories on premises and in the cloud
- Multiple formats and sources for both structured and unstructured data
- Increasing data privacy regulations that make protecting data painstaking and time-consuming
- Outdated approaches to data management that are not keeping up with digital transformation and innovation
With each individual cyber tool focused on its own domain, organizations lack a single source of truth (SSOT) about both the entities involved (assets, users, etc.) and an overall summary of risk. An SSOT, architected to rationalize and correlate millions of separate data points, gives organizations a consolidated view of their risk profile, supplies the context needed to prioritize response and remediation actions, and, ultimately, enables accurate reporting to the board on risk posture.
At the recent Zscaler Zenith Live ‘24 Conference, we explored how Avalor Data Fabric for Security™ and the Avalor Unified Vulnerability Management module built on top of that data fabric are revolutionizing security operations. The data fabric provides a single, aggregated, deduplicated, and harmonized data set that CISOs can tap to prioritize risk contextualized to the business, enable push-button reporting and dashboards, and automate remediation workflows. In fact, Gartner estimates that data fabric deployments will reduce data management efforts by up to 70% and accelerate time to value.
During the session, Deepen Desai, Zscaler’s Chief Security Officer, and I delved into the intricacies of this technology and demonstrated its transformative potential.
The genesis of data fabric
Our journey began with a fundamental question: How can we make sense of the overwhelming amount of data generated by modern security tools? The answer lies in our trusted data fabric, designed to unify, deduplicate, normalize, and contextualize data from over 150 connectors, including Zscaler and third-party sources.
Traditional data lakes often fall short, as they tend to be repositories for unstructured data with little to no actionable insights. In contrast, our data fabric actively correlates and contextualizes information, transforming it into a coherent and actionable format. This approach not only provides a single source of truth but also empowers security teams to ask precise questions and receive accurate answers.
Building the foundation: Data ingestion and entity resolution
One of the critical components of our data fabric is its robust data ingestion capability. We’ve developed a technology called AnySource™, which allows us to bring in data from any system, be it threat intelligence, application scanners, or even internal Microsoft Excel files. This data is then processed through our powerful entity resolution engine. In addition to AnySource, we’ve also created dozens of pre-built connectors to popular data sources.
Entity resolution is the first step to enabling effective use of data. It ensures that we can accurately identify and correlate assets, vulnerabilities, and risks across multiple sources. For instance, if we see an asset listed in both Tenable and CrowdStrike, our system recognizes it as the same entity and provides a unified view of its security posture. The same would be true of duplicative vulnerability data from different security tools.
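To make the idea concrete, here is a minimal, hypothetical sketch of entity resolution: records from two tool exports are grouped into one entity when they share an identifying key. The field names (`hostname`, `ip`, `source`) and the matching rule are illustrative assumptions, not Avalor’s actual schema or algorithm.

```python
# Minimal entity-resolution sketch (illustrative only): group asset records
# that share a hostname or IP into one unified entity.
def resolve_assets(records):
    entities = []  # each entity: {"keys": set, "sources": set, "records": list}
    for rec in records:
        # Collect whatever identifiers this record carries (skip missing ones)
        keys = {k for k in (rec.get("hostname"), rec.get("ip")) if k}
        match = next((e for e in entities if e["keys"] & keys), None)
        if match is None:
            match = {"keys": set(), "sources": set(), "records": []}
            entities.append(match)
        match["keys"] |= keys
        match["sources"].add(rec["source"])
        match["records"].append(rec)
    return entities

# The same host reported by two tools under slightly different records:
records = [
    {"source": "tenable",     "hostname": "web-01", "ip": "10.0.0.5"},
    {"source": "crowdstrike", "hostname": "web-01", "ip": None},
    {"source": "tenable",     "hostname": "db-01",  "ip": "10.0.0.9"},
]
entities = resolve_assets(records)
print(len(entities))  # 2 unified entities: web-01 (seen by both tools) and db-01
```

A production engine would of course use fuzzier matching (normalized hostnames, MAC addresses, cloud instance IDs), but the principle is the same: many raw records, one entity.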
Analyzing data: Making sense of all the info
Once the data fabric has resolved all the entities, it can then provide correlated insights across all the various data points. For example, if a vulnerability scanner reports on vulnerabilities across a set of endpoints, but the endpoint detection and response platform is missing some of those endpoints, the data fabric will identify those gaps in coverage. Another example could be a delta between the users identified in an Identity and Access Management tool such as Okta and the users listed in the organization’s email security platform.
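At its core, each of these coverage checks is a set difference over resolved entities. The sketch below uses invented inventories to illustrate both examples from the paragraph above; the host and user names are placeholders, not real tool output.

```python
# Sketch of a coverage-gap check (illustrative): hosts known to a
# vulnerability scanner but absent from EDR telemetry.
scanner_hosts = {"web-01", "web-02", "db-01", "db-02"}   # e.g., from a scanner export
edr_hosts     = {"web-01", "db-01", "db-02"}             # e.g., from an EDR inventory

missing_edr = scanner_hosts - edr_hosts  # scanned but not protected
print(sorted(missing_edr))  # ['web-02']

# The same delta works for identities, e.g., Okta users vs. email-security users:
okta_users  = {"alice", "bob", "carol"}
email_users = {"alice", "bob"}
print(sorted(okta_users - email_users))  # ['carol']
```

The hard part is not the set arithmetic but getting both sides keyed to the same resolved entities, which is why entity resolution comes first.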
Operationalizing data: Moving from insights to action
The goal, of course, is to act on all the analytical findings the fabric creates. Operationalizing these findings includes activities such as creating reports and dashboards, measuring custom key performance indicators, triggering data hygiene steps such as updating CMDB, and automating remediation workflows via ticketing systems such as ServiceNow, Jira, and others.
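As a rough illustration of that last step, the sketch below turns correlated findings into remediation ticket payloads. The payload shape, field names, and severity cutoff are assumptions for the example; a real integration would post to the ServiceNow or Jira REST APIs.

```python
# Illustrative sketch: emit one remediation-ticket payload per finding at or
# above a severity cutoff. Payload fields are generic, not a real API schema.
def findings_to_tickets(findings, min_severity=7.0):
    tickets = []
    for f in findings:
        if f["severity"] >= min_severity:
            tickets.append({
                "title": f"Remediate {f['cve']} on {f['asset']}",
                "priority": "high" if f["severity"] >= 9.0 else "medium",
                "description": f"Severity {f['severity']} finding from {f['source']}",
            })
    return tickets

findings = [
    {"cve": "CVE-2024-0001", "asset": "web-01", "severity": 9.8, "source": "tenable"},
    {"cve": "CVE-2024-0002", "asset": "db-01",  "severity": 4.3, "source": "wiz"},
]
tickets = findings_to_tickets(findings)
print(len(tickets), tickets[0]["priority"])  # 1 high
```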
The data fabric provides the foundational capabilities of data handling – ingestion, analysis, and actions. Modules will take advantage of the fabric capabilities to deliver key services.
Unified vulnerability management
A highlight of our session was the demonstration of our Unified Vulnerability Management (UVM) application, built on top of the data fabric. UVM provides correlated insights to prioritize risk, comprehensive reports and dashboards, and automated workflows for remediation.
UVM applies the data fabric’s ability to deduplicate entities to deduplicate vulnerability findings. Multiple tools may discover the same vulnerability: a scan from Tenable, a finding from Wiz, and a report from an application scanner, for example. Thanks to the power of the data fabric, the UVM module correlates these findings into a single vulnerability and generates a single remediation task, significantly reducing the workload for security teams.
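A simplified sketch of that deduplication step: findings are keyed by (CVE, asset) so that reports of the same vulnerability from multiple tools collapse into one remediation item. Keying on (CVE, asset) alone is a simplifying assumption for illustration.

```python
# Sketch of finding-level deduplication (illustrative): collapse reports of
# the same CVE on the same asset from multiple tools into one task.
from collections import defaultdict

def dedupe_findings(findings):
    merged = defaultdict(set)  # (cve, asset) -> set of reporting tools
    for f in findings:
        merged[(f["cve"], f["asset"])].add(f["source"])
    return [
        {"cve": cve, "asset": asset, "sources": sorted(srcs)}
        for (cve, asset), srcs in merged.items()
    ]

findings = [
    {"cve": "CVE-2021-44228", "asset": "web-01", "source": "tenable"},
    {"cve": "CVE-2021-44228", "asset": "web-01", "source": "wiz"},
    {"cve": "CVE-2021-44228", "asset": "web-01", "source": "app-scanner"},
    {"cve": "CVE-2023-1234",  "asset": "db-01",  "source": "tenable"},
]
tasks = dedupe_findings(findings)
print(len(tasks))  # 4 raw findings collapse to 2 remediation tasks
```

Keeping the list of contributing sources on each task preserves the audit trail back to the original tools.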
Another key feature of UVM is its real-time reporting capabilities. Security teams can access up-to-date information on the status of vulnerabilities, remediation efforts, and overall risk posture. This enables them to make informed decisions quickly and effectively, reducing the window of exposure to potential threats.
We will also be able to apply similar processes to findings or detections within the Zscaler Zero Trust Exchange and its integrated security solutions. After the data generated by the platform and integrations is ingested by the data fabric, the resulting insights can drive policy creation or changes in the Zero Trust Exchange.
Real-world applications and benefits
Correlating data from multiple sources and integrating it into a unified dashboard delivers several benefits:
- Consolidation gives CISOs a comprehensive understanding of their organization’s risk landscape. A holistic view of risk enables organizations to prioritize remediation efforts based on real-time risk scores, the company’s own risk factors and mitigating controls, and realistic resource allocation.
- The enriched data set enabled by the data fabric provides real-time visibility into the environment and continuously updated data. That dynamic quality is vital for maintaining compliance with data privacy regulations and governance frameworks.
- The data fabric is scalable. As organizations grow and their needs evolve, it scales to accommodate the increased information volume and complexity, with the flexibility needed to add any kind of data type desired.
Future direction and enhancements
Looking ahead, we will continue to enhance the capabilities of our data fabric. We will focus on areas such as improving data unification processes, expanding the range of supported connectors, and introducing advanced search functionalities. We’re also focused on making our platform more user-friendly with features like no-code automation and sandboxing for testing changes in a controlled environment.
As we integrate more applications from the Zscaler ecosystem into our data fabric, the possibilities are endless. Imagine having a single pane of glass where you can see the impact of every security decision in real time, across all your tools and data sources. This comprehensive visibility is crucial for effective risk management and strategic decision-making.
In addition to these enhancements, we are exploring additional areas where we can tap artificial intelligence (AI) and machine learning (ML). These technologies have the potential to further enhance our ability to analyze and interpret data, providing deeper insights and more accurate risk assessments. By leveraging AI and ML, we can automate complex tasks, identify patterns and anomalies, and predict potential threats with greater precision.
Final thoughts
In the realm of cybersecurity, the ability to quantify and manage risk effectively is paramount. Zscaler’s work in Risk360, which quantifies risk identified across the environment, will also be enriched by tapping into the data fabric. Having third-party data sources, along with first-party Zscaler data, inform risk quantification will expand the accuracy and breadth of financial risk analysis for our customers.
The Avalor data fabric offers a revolutionary approach, providing CISOs with the tools they need to make informed decisions and safeguard their organizations. As we continue to innovate and expand our platform, we invite you to join us on this journey towards a more secure and resilient digital future.
Our commitment to innovation and excellence drives us to continuously improve our data fabric, ensuring it meets the evolving needs of our clients. We believe that by leveraging the power of a data fabric, organizations can achieve greater security, efficiency, and resilience in the face of ever-changing cyber threats.