IBM’s Confluent Acquisition Is About Event-Driven AI

IBM is no stranger to AI, given its long history with its Watson AI project and countless other efforts. But today’s AI hits different.

On Monday, IBM announced that it had begun the process of acquiring streaming data platform Confluent for US $11 billion, or $31 per share, chiefly for what it could bring enterprises in terms of supporting generative AI (GenAI).

Today, enterprise data “is spread across public and private clouds, data centers and countless technology providers,” said IBM Chairman, President and CEO Arvind Krishna in an investor call Monday. “With the acquisition of Confluent, IBM will provide the smart data platform for enterprise IT, purpose-built for AI.”

Such a platform will “connect, process and govern data for applications and AI agents,” according to Big Blue’s strategic vision.

Less than two months ago in New Orleans, Confluent held its annual Current user conference, where it shed some light on how AI operations have changed over time, and what may lie ahead.

What Does Confluent Offer Enterprises?

With its stewardship of the open source Apache Kafka project, Confluent provides a leading open source enterprise data streaming platform, one that can process and route data in real time, as it arrives, from multiple sources.

Confluent’s portfolio includes:

- Confluent Cloud: A fully managed deployment of Confluent’s data streaming platform, based on Apache Kafka.
- Confluent Platform: The self-managed deployment of Confluent’s data streaming platform.
- WarpStream: A newer, usually lower-cost Bring-Your-Own-Cloud (BYOC) deployment model, in which the customer provides the storage (usually cloud) and Confluent manages the deployment.
- Confluent Private Cloud: A fully managed service that can be run in a self-managed, private environment.

In each of these deployment models, Confluent makes it easier for enterprises to master Kafka through additional services and tooling that simplify operations, enhance security, ensure data quality and accelerate application development.

The company has also invested considerable effort in a Kafka-adjacent open source data processing technology, Apache Flink.

Confluent, based in Mountain View, Calif., has approximately 6,500 clients, 40% of which are in the Fortune 500, the company has estimated. Anthropic, Amazon Web Services, Google Cloud Platform, Microsoft and Snowflake all count themselves as Confluent customers.

IBM’s Strategic Plans for Confluent

Real-time contextual processing will be invaluable as enterprises roll out their GenAI projects, IBM reasons.

IDC has estimated that more than one billion new logical applications will emerge by 2028, and many of these new apps will produce data that will need to be analyzed, preferably in real time, to reap maximum value.

IBM told its investors that the Confluent purchase would accelerate IBM’s growth. Confluent has estimated that its own potential market, for real-time data processing, doubled from $50 billion to $100 billion in 2025.

IBM would fold the Confluent stack into its own Data and Automation portfolio. The company did a similar deal earlier this year, acquiring DataStax for its scalable NoSQL database (and associated AI tooling).

[IBM investor slide: How IBM sees Confluent, a data streaming “pioneer” with robust enterprise connectivity.]

In addition to real-time processing, Confluent also offers an advantage in enterprise data management, IBM envisions.

One of the issues that organizations face is the complexity of their data ecosystems, noted Confluent CEO and Co-Founder Jay Kreps in a press conference at Current. A streaming platform is one way to bring all of these sources of data into the same place.

“If you talk to customers, they’re actually often very frustrated with the complexity that has come out of the cloud provider offerings, where it’s like they have 500 products all from the same [company], but those 500 things don’t work together,” he said.
“Where we’ve seen success is taking this problem of working with real-time data, and bringing together as many pieces of the puzzle as possible.”

The Rise of Event-Driven AI

The purchase is about IBM enhancing its “technical acumen” around AI, said Subbu Iyer, CEO of real-time AI database company Aerospike, in a statement.

“AI models are hungry for more fresh, relevant, real-time data. Getting the right real-time data and putting it to work fast is the new competitive advantage,” Iyer said.

“Every AI problem is essentially a data problem.”
— Sean Falconer, Confluent

In one talk at Current, Sean Falconer, Confluent’s senior director of AI product, explained how AI has changed over time and how this has changed the operational requirements.

Models were once stagnant things, purpose-built to predict actions within one relatively static domain. GenAI overthrew that model, however. This approach uses a general foundation model, but one that is continuously augmented by fresh information, via prompts and other contextual data.

“It has different meanings in terms of how we think about data requirements,” Falconer said.

Many problems that enterprise customers want to tackle today through AI cannot be defined through old-school purpose-built models. An airline chatbot, for instance, may pull data in from flight scheduling systems, weather systems, customer data, ticketing systems and untold other systems.
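The airline-chatbot scenario can be sketched in a few lines of Python. This is an illustrative, in-memory stand-in — plain lists play the role of Kafka topics, and `build_context` and its field names are hypothetical — not Confluent’s implementation:

```python
# Sketch (assumed names, not Confluent's API): merging the freshest event
# from several real-time sources into one prompt context for a chatbot.
# In production, each list below would be a Kafka topic read via a client.

def latest(events):
    """Return the most recent event from a time-ordered event list."""
    return events[-1] if events else {}

def build_context(flight_topic, weather_topic, customer_topic):
    """Combine the latest event from each source into a single context."""
    return {
        "flight": latest(flight_topic),
        "weather": latest(weather_topic),
        "customer": latest(customer_topic),
    }

# Simulated events arriving on three streams.
flights = [{"flight": "UA101", "status": "delayed"}]
weather = [{"city": "ORD", "conditions": "snow"}]
customers = [{"id": 42, "tier": "gold"}]

ctx = build_context(flights, weather, customers)
print(ctx["flight"]["status"])  # the chatbot sees the delay in real time
```

In a real deployment, each source would be a continuously updated stream, and this merged context would be injected into the model’s prompt on every request.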
It is all assembled on the fly.

This context engineering “can be a real challenge for businesses,” he said.

“Traditional software engineering [is] all about writing code and the data itself doesn’t change the business logic, but in the world of essentially probabilistic models, the data is the input that essentially ends up changing or manipulating the business logic,” Falconer explained.

Anomaly detection and product personalization pose similar real-time challenges.

How Kafka Enables Event-Driven Scalability

Kafka, an enabler of event-driven architecture, also addresses the scalability challenges organizations may one day face. With such a proliferation of agents soon to be upon corporate IT, any support platform would run most effectively in an event-driven or microservices architecture, such as Kafka’s, Falconer argued.

But what are agents, if not microservices in disguise?

“When it comes to the idea of these ‘event-driven agents,’ they really serve as the eyes and ears in your organization. You know, we talk about data streaming as being the central nervous system. The sensors of that world are all of the different places where this data originates in real time.”

The post IBM’s Confluent Acquisition Is About Event-Driven AI appeared first on The New Stack.