Embedding an RDF graph as a semantic layer within ServiceNow’s massive installed base could push graph technology to a whole new level of maturity.
The Legacy POC Problem
Enterprise knowledge graph projects fail regularly. They fail for predictable reasons. Install takes forever. Data access becomes a political nightmare. The demo works beautifully on sample data but falls apart at scale. Or worse, it works fine but doesn’t solve a problem anyone was asking to solve.
The graph industry has been promising enterprise transformation for years, but deployments too often remain pilot projects that never escape the lab. There’s always another integration challenge, another data quality issue, another vendor claiming their approach is the “real” solution.
But ServiceNow’s acquisition of data.world in May might finally change this dynamic by embedding graph technology into infrastructure that thousands of enterprises already depend on.
The Data Gap Problem ServiceNow Is Actually Solving
ServiceNow understands something fundamental about enterprise data management that many graph vendors miss: there’s a structural responsibility gap between IT departments (who store and transport data) and business users (who make decisions with that data).
IT management commonly takes no responsibility for real company data beyond basic storage and transport. Organizing, transforming, and cleaning up data? Those are problems for a small team of data engineers or data stewards, not IT.
ServiceNow’s Workflow Data Fabric directly addresses this gap by creating what amounts to a data fabric solution, one that “weaves together” disparate data from various sources and organizes it for business use through semantic intelligence rather than traditional ETL.
What ServiceNow Actually Bought
ServiceNow acquired data.world to deliver new data catalog and data governance solutions that allow customers to enrich data with meaning, context, and relationships, all while enabling AI agents and workflows to operate. This isn’t a standalone graph database; it’s semantic intelligence woven directly into ServiceNow’s Workflow Data Fabric.
As ServiceNow’s Gaurav Rewari put it: “The path that goes to that agentic AI heaven often goes through data hell.” The company’s solution isn’t to avoid that hell, but to build semantic layers tough enough to survive it. In fact, data.world is just the latest in a string of acquisitions since 2021 that clearly show the platform’s early moves toward data analytics and, now, agentic AI.
The Workflow Data Fabric connects over 100 enterprise systems through “zero copy” connectors (virtualization through query rewriting) to platforms like Databricks, Snowflake, Oracle, and AWS. The ServiceNow Knowledge Graph transforms raw data into contextual insights, connecting a company’s people, processes, operations, and systems so AI agents can work autonomously across the enterprise.
Crucially, this approach sidesteps the traditional graph deployment challenges I’ve discussed around virtualization versus materialization. ServiceNow isn’t asking enterprises to choose between storing data in graph format or virtualizing it; they’re creating a semantic layer that works with data wherever it lives.
ServiceNow believes that three pillars are necessary to achieve agentic automation goals:
- a performant database
- data access across silos
- data management and governance
If that sounds like the solution stack for every other graph-based platform these days, that’s because data management is a hot topic. Informatica’s CDO 2025 Survey found that 43% of chief data officers cite data quality issues as a barrier to AI adoption. ServiceNow is showing the market that a graph linking data sources together for agents is a key component of data management, not just data analytics.
Why Virtualization Matters Here
The data.world acquisition makes particular sense when you consider the practical realities of graph virtualization. As I’ve noted previously, virtualization is essentially query rewriting: the original graph query is translated into the query language of the underlying source database, which supplies the data on demand so it can be processed as if it were stored in RDF.
At its best, virtualization promises to provide all the benefits of graph data without requiring data engineers to move data into graph storage. You can side-step building new data pipelines and avoid additional storage costs. But virtualization works best under certain conditions:
- When you’re not processing massive amounts of frequently updating transactional data
- When the underlying databases can handle the complex queries efficiently
- When you need complex pattern matching more than real-time updates
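To make the query-rewriting idea concrete, here is a minimal Python sketch of virtualization. The `MAPPING` table, predicate names, and column names are all hypothetical; real systems use standards like R2RML for such mappings, but the principle is the same: a graph pattern is translated into SQL against the source database at query time.

```python
# Toy sketch of graph virtualization as query rewriting.
# The mapping, predicates, and table/column names are hypothetical.

# Map each graph predicate to where its subject and object live relationally.
MAPPING = {
    "employee:worksFor": ("employees", "emp_id", "dept_id"),
    "employee:hasName": ("employees", "emp_id", "name"),
}

def rewrite_triple_pattern(predicate: str) -> str:
    """Rewrite one triple pattern (?s <predicate> ?o) into SQL
    against the underlying relational store."""
    table, subject_col, object_col = MAPPING[predicate]
    return f"SELECT {subject_col} AS s, {object_col} AS o FROM {table}"

sql = rewrite_triple_pattern("employee:worksFor")
print(sql)  # SELECT emp_id AS s, dept_id AS o FROM employees
```

Because the rewrite happens at runtime, no data is copied into graph storage; the graph layer’s performance is bounded by how well the source database executes the generated SQL, which is exactly why the conditions above matter.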
ServiceNow’s approach appears designed around these constraints. The Workflow Data Fabric handles real-time operational data while the semantic layer provides pattern matching and relationship discovery on top of existing enterprise systems. They’re not trying to replace transactional databases with graphs–they’re adding semantic intelligence where it makes sense.
Why Previous Graph Efforts Failed–And Why This Might Not
Enterprise graph projects fail because they start as technology evaluations rather than business solutions. They get bogged down in integration complexity, scale poorly, or don’t differentiate themselves enough from more familiar technologies to justify the cost of adoption.
But there’s a deeper issue: aggregation is not harmonization. Adding an aggregation layer is the go-to move for IT departments responding to business demands for faster data access. But aggregated data is rarely normalized or harmonized into a common schema because doing so requires intensive technical effort and agreement among many stakeholders.
Even if a standard is agreed upon and published, it’s promptly ignored by business users working with tools like Tableau, Airtable, or spreadsheets. Down in the data boiler rooms, application owners point out that their platforms control the data schema and they can’t change it.
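The difference is easy to see in code. In this hypothetical sketch (all field names and values invented for illustration), simply pooling rows from two systems preserves their incompatible schemas, while harmonization maps each row into one common schema with canonical values:

```python
# Two source systems describe the same customer with different schemas and labels.
crm_rows = [{"cust": "Acme", "segment": "Enterprise"}]
erp_rows = [{"customer_name": "ACME Corp.", "tier": "ENT"}]

# Aggregation: pooling rows leaves the schema and label conflicts in place.
aggregated = crm_rows + erp_rows

# Harmonization: map every variant label onto one canonical value...
CANONICAL_SEGMENT = {"Enterprise": "enterprise", "ENT": "enterprise"}

def harmonize(row: dict) -> dict:
    """Project a source row into the agreed common schema."""
    name = row.get("cust") or row.get("customer_name")
    raw_segment = row.get("segment") or row.get("tier")
    # ...and normalize the entity name so both systems yield the same key.
    customer = name.rstrip(".").upper().removesuffix(" CORP").strip()
    return {"customer": customer, "segment": CANONICAL_SEGMENT[raw_segment]}

harmonized = [harmonize(r) for r in aggregated]
# Both rows now collapse to {"customer": "ACME", "segment": "enterprise"}
```

The aggregation step is cheap; the harmonization step is where the stakeholder agreement and technical effort actually go.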
ServiceNow is approaching this differently. They’re not asking enterprises to adopt graph technology–they’re building graph capabilities into software enterprises already use for critical workflows. The semantic layer has to work because workflow automation depends on it. And unlike traditional graph deployments, this isn’t about moving data or changing existing systems–it’s about adding intelligent interpretation on top of what’s already there.
The Technical Reality Check
Here’s what makes this interesting: data.world claims to deliver AI responses that are “4.2x more accurate” than traditional approaches. But the real test isn’t accuracy in isolation–it’s whether semantic intelligence can handle the chaos of real enterprise environments.
Consider the practical challenges I’ve outlined around graph virtualization. When virtualizing, you’re employing two data layers at runtime, not one, and performance depends on both. If your knowledge graph issues queries that translate into patterns your SQL database handles poorly, you’re still asking that database to do something it’s not good at.
ServiceNow’s Workflow Data Fabric appears designed to handle this complexity. It’s described as “a unified semantic and integration layer that turns data into instant action by connecting structured, unstructured, and streaming data into ServiceNow to power your AI agents, workflows, and analytics with real-time, secure data from any source.”
That’s ambitious. It’s also exactly what enterprises need if they’re going to make AI work with their actual data, not cleaned-up demo datasets.
Why The Timing Matters
The knowledge graph market is estimated at $1.06 billion in 2024, projected to reach $6.93 billion by 2030. Gartner predicts that graph technologies will be used in 80% of data and analytics innovations by 2025, up from 10% in 2021.
More importantly, enterprises are finally hitting the wall with their current approaches. According to experts, 97 percent of workers are non-technical and unable to work with raw data, but they still need access to key data for business decisions. Organizations could reduce time spent on data projects by 70 to 80 percent when they arrive at common definitions and understanding of their data.
As I’ve observed from the field, enterprises are finding that after big transitions to cloud, acquisitions of SaaS applications, and deployments of data warehouses and lakes, they STILL cannot get information to decision-makers fast enough or with enough context to be actionable.
The Structural Advantage
ServiceNow’s bet on semantic intelligence is central to their AI platform strategy. If data.world’s knowledge graph approach can’t handle enterprise scale and complexity, it doesn’t just hurt one product–it undermines ServiceNow’s entire vision for agentic AI.
That pressure might be exactly what the graph industry needs. No more pilot projects that work perfectly in controlled environments but fail in production. This is graph technology being deployed at massive scale across diverse enterprises that need it to work reliably from day one.
Unlike traditional graph deployments that require enterprises to reorganize their data architecture, ServiceNow’s approach lets businesses set up rules for how data about concepts can be collected regardless of differences in classification, spelling, or data type in the underlying systems. These rules are set in a knowledge graph that focuses on connections between data points instead of individual cells in a table.
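A minimal sketch of what such a rule might look like, with all identifiers invented for illustration: variant spellings and classifications map onto a single canonical concept node, and records become edges pointing at that node rather than cells scattered across tables.

```python
# Hypothetical harmonization rules: variant source values -> canonical concept id.
RULES = {
    "N.Y.": "concept:new_york",
    "New York": "concept:new_york",
    "NYC": "concept:new_york",
}

def to_edges(records):
    """Emit (subject, predicate, object) edges instead of table cells,
    routing every variant spelling to its canonical concept node."""
    edges = []
    for record_id, city in records:
        canonical = RULES.get(city, f"concept:unmapped:{city}")
        edges.append((record_id, "locatedIn", canonical))
    return edges

edges = to_edges([("order:1", "N.Y."), ("ticket:7", "NYC")])
# Both records now connect to the same concept node, concept:new_york,
# even though the underlying systems spell the city differently.
```

The payoff is that a query for everything located in New York traverses the shared concept node instead of enumerating every spelling in every source system.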
The Bottom Line
ServiceNow’s data.world acquisition might finally force graph technology to grow up–not through better marketing or more features, but through the brutal necessity of working at enterprise scale in production environments.
This isn’t another attempt to solve the aggregation problem with more storage. It’s an attempt to solve the harmonization problem with semantic intelligence. And unlike previous graph efforts that started as technology evaluations, this starts with an existing business platform that enterprises already trust.
If semantic layers can handle ServiceNow’s customer base–thousands of large enterprises with messy, complex data environments–then graph technology will have proven it belongs in mission-critical enterprise infrastructure. If not, we’ll have learned something important about the gap between graph technology’s promise and its practical limits.
The pilot phase is over. Time to see if graph technology can cover the data gap.
