Confluent says data streaming will enable faster, better decisions
Organisations are generating more data, faster, than ever before. With some estimates suggesting the volume of data being generated is growing by more than 50% month on month, and storage now measured in petabytes, the ability to access that data for insights can be a significant competitive advantage.
It's no surprise that the rise of AI and the rapid pace of business change have resulted in boards seeing effective data management as a top-level strategic imperative.
Stephen Deasy, the CTO of Confluent, says companies are under pressure to capitalise on the data they have, particularly given the capabilities of AI. They want to understand how to use that data and identify real use cases that accelerate their business or deliver customer value.
"Data has always been important," says Stephen. "But there are many challenges. Knowing what data you have is a start, but understanding what is useful and valuable is not always easy. And then there's the challenge of streaming data from systems that were built for a different time when overnight extractions and next day, next week or end of month reporting was the norm."
Stephen says we're moving into an age of software consuming software: autonomous agents that continuously listen for signals, consume data, make decisions and trigger actions. In that world, data must be streamed in a highly governed way to reduce the latency between data arriving and being acted on.
That's a big shift from the era of monolithic operational systems built to run the business. ERP, CRM and SaaS applications store data in isolated databases, with batch processes and ETL (Extract, Transform, Load) jobs moving that data into analytical systems such as data lakehouses and business intelligence platforms.
"In that world, it was okay to take hours and days for processing because a human would review and process the data after. Today, we want to make decisions in real time with AI agents sitting in the system. Use cases such as fraud detection, anomaly detection and customer personalisation are demanding real-time flow of data as they drive actions that will impact the customer experience. The speed at which that data needs to be fed in and operated on has been reduced to milliseconds from hours and days," says Stephen.
One of the key benefits of data streaming is that the governance and security of the data are applied at the source. Applications can stream data directly from the source without creating duplicate copies, each of which would need its own governance and security.
"The goal is to keep governance as far left as possible so that you're able to create the data products that you need. And that might sound like a grand idea that's hard to obtain, but it could be as simple as a Kafka topic with a schema and somebody that owns the governance of that schema. That now represents a data product that you can make available to your whole organisation. We've been investing a lot in the areas of Flink and Tableflow to enable streaming at the source so you can then take actions with the data," Stephen explains.
Confluent works with customers across almost every vertical, from banking and finance through to local government. By helping those organisations stream their data so they can make faster decisions and automate complex processes, it has been able to deliver significant value.
At Palmerston North City Council in New Zealand, Confluent helped the council take processes that required 25 manual steps to complete and automate 15 of those steps with AI agents, freeing people to work on the more complex problems of serving the city's residents. Confluent has also helped many banks around the world detect and stop fraud. In one case, it helped a financial institution shift from batch processing to real-time data streaming, cutting loan approval times by 40% and significantly lifting its Net Promoter Score.
Moving to a world of data streaming will lead organisations to rethink their approach to software development. Stephen says there is still a place for traditional batch processing in some analytics applications, but organisations may need to reconsider their approach elsewhere.
"Adapting to developing applications and processes for data streaming is not mutually exclusive from traditional approaches," he explains. "But you need to choose the right approach to develop an application that will deliver the outcomes you demand. Identify the use cases, such as fraud or anomaly detection, where a piece of real-time data could trigger an automated response so you're seeing real business wins straight away."
That approach enables organisations to build data products that are quickly available, properly governed, well-maintained and resilient. Once people see the benefits, they can leverage that experience and adapt without needing to start from scratch each time.
Secure, well-governed data streaming is imperative for organisations. When data is trapped in silos and only liberated by overnight extractions and batch processing, its full value is never realised. Data streaming enables faster, better decisions at the speed of today's world.