AI adoption is soaring, yet many businesses still run on architectures that split data between operational systems (OLTP) and analytics systems (OLAP). The separation was a product of legacy infrastructure, which made it impractical to run day-to-day applications and analytical workloads on the same platform. Today, however, that divide is a source of operational friction, waste and delay across teams, and ultimately of missed opportunities.
The split created a disconnect: developers concentrated on keeping applications running, while analysts were left working with data that was often outdated or incomplete. Modern cloud architecture has alleviated some of the technical barriers, but the divide persists, sustained by legacy software, vendor lock-in and outdated working practices. Now is the time to rethink this model and move towards a unified data stack fit for a world of AI agents and applications.
Addressing the legacy bottleneck
Once data lands in a transactional system, it becomes difficult and costly to move. Proprietary storage formats and tightly coupled architectures trap data inside operational systems and block integration with modern data and AI workflows. The result? Businesses end up working around infrastructure that no longer meets their needs.
Modern AI agents and applications require fast, reliable access to live data. When operational data is stuck in legacy environments, automation, personalisation and real-time decision-making become much harder to deliver. Beyond slowing development, this limits responsiveness, scalability and the ability to extract timely insight from rapidly growing volumes of data.
An increasing number of businesses are now, understandably, seeking alternatives that remove these barriers and offer a unified, responsive foundation to power modern data-driven systems.
Bridging the divide between operations and analytics
The original OLTP/OLAP split was reasonable in an era of limited computing capability, when running analytics alongside operational workloads simply wasn’t possible. With cloud-native storage such as open table formats, businesses no longer need separate pipelines to make operational data available for analytics. Yet many organisations still rely on outdated architectures in which operational data must be extracted, transformed and loaded before it can be analysed, resulting in delays, duplication and overhead.
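To make the contrast concrete, here is a minimal sketch of the open-table-format pattern, assuming the `deltalake` and `pandas` Python packages; the table path and schema are invented for illustration. The application appends records directly to an open table that any engine speaking the format can read in place, with no extract-and-load step in between.

```python
# Minimal sketch: operational writes land directly in an open table
# format that analytics can read in place. The path and schema are
# illustrative; in practice this could be object storage (S3, ADLS, GCS).
import pandas as pd
from deltalake import DeltaTable, write_deltalake

TABLE = "/tmp/demo/orders"  # illustrative local path

# The application appends new orders as they happen...
new_orders = pd.DataFrame(
    {"order_id": [1001, 1002], "amount": [42.50, 19.99], "status": ["paid", "paid"]}
)
write_deltalake(TABLE, new_orders, mode="append")

# ...and an analyst reads the same files directly, no pipeline required.
df = DeltaTable(TABLE).to_pandas()
print(df.groupby("status")["amount"].sum())
```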
The consequences should not be overlooked. Analysts base decisions on outdated information. Developers spend time maintaining fragile pipelines rather than building new capabilities. Innovation slows and opportunity costs mount.
In response, businesses are increasingly shifting to unified data architectures, where operational and analytical workloads share a single data foundation, with engines optimised for each task. This reduces complexity, improves efficiency and enables faster iteration, all of which are essential in the AI era.
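One way to picture “engines optimised for each task” on one foundation: the operational path appends rows as in the earlier sketch, while an analytical engine such as DuckDB queries the very same table. A hedged sketch, assuming the `duckdb` and `deltalake` packages and reusing the illustrative table from above.

```python
# Sketch: two engines, one copy of the data. DuckDB queries the same
# Delta table the application writes to; no copy, no pipeline.
import duckdb
from deltalake import DeltaTable

# Expose the table's files as a PyArrow dataset.
orders = DeltaTable("/tmp/demo/orders").to_pyarrow_dataset()

# DuckDB's replacement scan picks up the `orders` variable, so the
# analytical query runs directly over the operational table's files.
result = duckdb.sql(
    "SELECT status, SUM(amount) AS revenue FROM orders GROUP BY status"
)
print(result)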
Laying the data foundations for intelligent agents
AI agents are driving a profound shift in application development, performing complex, multi-step tasks by reasoning over proprietary data and interacting with other components in real time. Able to coordinate decisions and actions across an entire data ecosystem, agents represent an evolution beyond basic automation towards becoming the backbone of organisational operations.
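Stripped to its essentials, an agent is a loop that observes live data, decides and acts. The sketch below is a toy illustration of that shape; every name, rule and record in it is invented, and the hard-coded policy stands in for a model’s reasoning.

```python
# Toy sense-decide-act loop over live operational data. Everything
# here is illustrative; the decision rule stands in for an LLM.
def read_live_orders():
    # In practice this would query the operational store; stubbed here.
    return [{"order_id": 1001, "status": "paid", "hours_unshipped": 50}]

def decide(order):
    # Stand-in for model reasoning: flag orders stuck too long.
    return "escalate" if order["hours_unshipped"] > 48 else "wait"

def act(order, decision):
    if decision == "escalate":
        print(f"notify fulfilment team about order {order['order_id']}")

for order in read_live_orders():
    act(order, decide(order))
```

The value of the pattern depends entirely on the “observe” step reading data that is actually current, which is why stale, pipeline-fed copies undermine agents.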
For this shift to succeed, infrastructure must evolve. AI agents need low-latency access to live data, seamless integration across systems and modern development workflows. A new concept known as a lakebase addresses these needs: it combines the reliability of an operational database with the openness of a data lake in a single place, so teams can run transactions and analytics without juggling separate systems. It provides fast access to data, scales easily through separated storage and compute, and supports modern development habits such as instant branching and versioning. Designed for today’s AI-driven workloads, a lakebase lets both developers and AI agents build, test and ship applications without the constraints of old OLTP setups.
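Branching is the easiest of these habits to picture. The toy below models copy-on-write branching in plain Python; it is an illustration of the idea, not a real lakebase API. A branch is created instantly because nothing is copied, and writes stay isolated to the branch while reads fall through to the parent.

```python
# Toy model of copy-on-write branching, the idea behind "instant
# branching" in a lakebase-style system. Not a real database API.
class Store:
    """A key-value store whose branches share data until written."""

    def __init__(self, parent=None):
        self._data = {}
        self._parent = parent

    def get(self, key):
        # Read through to the parent branch unless this branch overwrote the key.
        if key in self._data:
            return self._data[key]
        if self._parent is not None:
            return self._parent.get(key)
        raise KeyError(key)

    def put(self, key, value):
        # Writes land only in this branch (copy-on-write).
        self._data[key] = value

    def branch(self):
        # Branching is instant: no data copied, only a parent pointer kept.
        return Store(parent=self)


prod = Store()
prod.put("orders:1001", {"status": "paid"})

dev = prod.branch()                              # instant, zero-copy
dev.put("orders:1001", {"status": "refunded"})   # test a change safely

print(prod.get("orders:1001"))  # {'status': 'paid'}: production untouched
print(dev.get("orders:1001"))   # {'status': 'refunded'}: visible only on the branch
```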
Forming the next generation of data platforms
It’s quickly becoming clear that a unified data stack will underpin modern systems. As AI becomes integrated into every aspect of a business, infrastructure that removes silos and unites operational and analytical systems will be essential, freeing teams to innovate and grow without yesterday’s limitations.
With their rigid, complex architectures, legacy OLTP systems can’t keep up with what modern, AI-driven businesses demand. Unified, open platforms that support both transactional operations and real-time intelligence without compromise are crucial for AI-native applications.
This shift won’t happen overnight, but organisations that start now to reduce fragmentation, adopt open standards and build for agent-driven systems will succeed in the era of AI.
Dael Williamson
Dael Williamson is EMEA CTO at Databricks. He advises UK and EU start-ups and brings decades of experience in technology strategy, enterprise architecture, and AI-driven business transformation.