Data streaming platforms are becoming increasingly important in powering workflows for all types of organisations, including retailers that rely on real-time updates for pricing, inventory levels, shipment tracking, and more.

According to Confluent Australia and New Zealand Regional Director, Simon Laskaj, no industry is riper for change through data streaming than retail, as the cost of living puts pressure on budgets and the rising cost of goods shrinks cart sizes.

“If you want data in real-time, it’s about connecting it, moving it around and responding to it,” he told Retailbiz in a recent interview.

“When it comes to data streaming, retailers often start with real-time inventory, whether it’s replenishment of fruit and vegetables for a supermarket or stock management for a small business.

“Most organisations, especially small businesses, manage stock manually, whether on paper, in an Excel spreadsheet or via a batch report from a sales system, on a weekly or monthly basis. The lightbulb moment is discovering the possibility of knowing where stock is at any time, but beyond that, having visibility over the entire supply chain, which has been a challenge for retailers for many decades.”

Despite increasing discussion around Artificial Intelligence (AI) over the last 18 months, Laskaj believes the ‘polish’ is starting to wear off and fatigue is kicking in. But with this fatigue comes a greater level of realism, centred on having trusted and truthful data.

“Retail businesses are using data streaming for real-time inventory data from sources such as point-of-sale systems, and it’s now becoming predictive, based on weather or seasonal events, for example. While predictive intelligence isn’t new, it’s difficult to solve because the data often sits in disparate sources or silos,” he said.

Confluent Data in Motion Tour event 2024.

“If you can start to predict the state of your business and your inventory, leveraging both past data and models, and for small businesses, having inference pre-built models to leverage, I believe that’s the game-changer – industrialising predictive models.”
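As a rough illustration of the kind of pre-built predictive model Laskaj describes, the sketch below forecasts next-day demand for a single product from past sales plus a simple event uplift. The function, figures and the uplift factor are hypothetical, not any vendor’s actual model.

```python
from statistics import mean

def forecast_demand(daily_sales, window=7, uplift=1.0):
    """Naive next-day demand forecast: a moving average of recent
    sales, scaled by an event uplift factor (e.g. a forecast
    heatwave or a seasonal promotion)."""
    recent = daily_sales[-window:]
    return mean(recent) * uplift

# Past week of unit sales for one product, with a 20% uplift
# expected for hot weather (illustrative numbers only).
sales = [120, 135, 128, 140, 150, 160, 155]
print(round(forecast_demand(sales, uplift=1.2)))  # → 169
```

Even a sketch this simple shows the appeal for a small business: the model is pre-built, and the retailer supplies only its own sales history and an expected event.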

This unlocks the ability to further personalise the purchase journey to deliver an optimal customer experience and ultimately drive enhanced satisfaction and adoption.

“Lack of personalisation is evident across many businesses, yet it’s low-hanging fruit. Personalisation delivers tangible value to end customers, as well as to internal stakeholders such as architects and engineers who aren’t customer-facing,” Laskaj said.

“Businesses often think about personalisation as a text message, notification or loyalty program that can deliver value. But when it comes to enhancing customer loyalty, it often draws on multiple different inputs, including weather data or internet logs as customers browse your website. Delivering personalised information in real-time as customers browse helps with basket conversion and brand loyalty.”

Forrester recently named Confluent a ‘Leader’ in The Forrester Wave: Streaming Data Platforms, Q4 2023 and The Forrester Wave: Cloud Data Pipelines, Q4 2023 reports. Beyond the leadership placement, Laskaj said the reports recognise data streaming platforms as a category in their own right, a fundamental signal to the market that data streaming is a critical part of business, with almost one million active users.

“I joined Confluent because I wanted to be part of something I felt was an emerging technology, yet to be fully shaped or widely adopted. But it’s no longer emerging, it’s arrived.

“When we talk about data streaming, it still largely speaks to a technical audience, so I’m personally excited about the notion of Tableflow – which unifies operational and analytical estates – because it resonates with a business audience. If you can instantly attach to analytical applications – say, a database storing data from the last 20 years – you can unlock a significant amount of value.

“Being able to access siloed data separate from traditional data streaming is going to be a game-changer. If you can stream or move the data, with a level of security and governance, and add more value to why it’s moving, there are few organisations, vendors or applications that can do that,” Laskaj said.

Kmart case study

According to Laskaj, Kmart is one of the best examples of how a business has leveraged data streaming.

Kmart presented at the Confluent Data in Motion Tour in both 2023 and 2024 to discuss its architecture, design options and business outcomes.

“This year, they’re already talking about their learnings. They’ve designed the platform to drive the highest level of adoption with an optimal level of governance and security to manage who can access the data and put controls around it,” Laskaj said.

“Over the last 12 months, Kmart has discovered what data streaming is delivering to their business and how they present it back to the business. For example, Kmart can now deliver instant digital receipts through their point of sale, and this means instant data from in-store transactions, which feeds into real-time inventory information – what’s being purchased and how it’s being purchased.”
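The flow Laskaj describes, where a digital receipt generated at the point of sale immediately updates real-time inventory, can be sketched as a small event handler. The event shape, store ID and SKUs below are illustrative assumptions, not Kmart’s actual schema.

```python
import json

# A hypothetical in-store digital receipt event; all field names
# and values are illustrative, not Kmart's actual data model.
receipt_event = {
    "store_id": "NSW-042",
    "channel": "in_store",
    "lines": [
        {"sku": "KM-1001", "qty": 2},
        {"sku": "KM-2045", "qty": 1},
    ],
}

stock_on_hand = {"KM-1001": 18, "KM-2045": 7}

def apply_receipt(event, stock):
    """Decrement stock-on-hand for each receipt line as it is
    consumed, keeping the inventory view current in real time."""
    for line in event["lines"]:
        stock[line["sku"]] -= line["qty"]
    return stock

print(json.dumps(apply_receipt(receipt_event, stock_on_hand)))
```

The point of the pattern is that the same receipt event can feed several consumers at once: the inventory view above, but also replenishment, analytics and the customer’s own digital receipt.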

In 2018, Kmart was self-hosting the Confluent platform on its database infrastructure. Two distinct capabilities were built across 2019 and 2020 – the Sophia Data Platform and the Chanakya Data Science Platform. In 2021, Kmart migrated to Confluent Cloud in pursuit of a SaaS and cloud-first principle.

Kmart Principal Architect for Enterprise Technology, Duane Gomes said, “At that point in time, we matured Hamilton (the brand we use for offering our platforms) as a capability and a set of enterprise engineering platforms. We adopted and took over the Confluent platform from the data team and badged it our Hamilton event streaming platform predicated on Confluent Cloud. In 2023, the migration of Sophia and Chanakya to Hamilton was in full swing.

“Some of the things we are challenged with and working through include real-time stock on hand to drive order fulfilment, real-time inventory to support marketing campaigns, real-time search analytics to drive recommendations, real-time pricing to reflect markdowns, real-time stock location to facilitate replenishment, real-time container shipment tracking, real-time sales figures and real-time customer interactions. The analytics are important, so we know what’s happening within our systems in real-time.”

Kmart has business events occurring throughout the day, every day of the year, which means a lot of data is changing and notifications are constantly being sent between applications.

“Our bricks-and-mortar business has transformed into a digital business, which is why being available in real-time is critical to serving customers all over Australia. When it comes to real-time interactions, customers are searching for what they want, responding to marketing campaigns, placing sales orders, getting recommendations and asking questions via chat – all of these generate events,” Gomes said.

“This led us to why we’re doing what we’re doing, realising that our applications in their journey of modernisation had decided to move to event driven architecture in some cases and API-driven architecture in other cases. To facilitate that and allow applications and data to work together, event streaming became a capability we wanted our engineering teams to leverage.

“Given a lot of data is flowing through these systems, we need data pipelines to get the data from the systems of record where the data is born right to the data lake where it will be put together. To service our customers, we need to process, analyse, report and predict using data.”
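The pipeline Gomes describes, moving data from the systems of record where it is born through to the data lake where it is put together, can be sketched as a chain of streaming transforms. Everything here – event shapes, prices and the in-memory “lake” – is an illustrative stand-in, not Kmart’s actual pipeline.

```python
# Minimal sketch of a streaming data pipeline: events born in a
# system of record flow through filter and enrichment steps before
# landing in a data lake sink. All names are illustrative.

def source_events():
    """Pretend change feed from a system of record."""
    yield {"type": "sale", "sku": "KM-1001", "qty": 2}
    yield {"type": "heartbeat"}
    yield {"type": "sale", "sku": "KM-2045", "qty": 1}

def only_sales(events):
    """Filter step: drop everything that isn't a sale."""
    return (e for e in events if e["type"] == "sale")

def enrich(events, prices):
    """Enrichment step: attach a dollar value to each sale."""
    for e in events:
        yield {**e, "value": prices[e["sku"]] * e["qty"]}

lake = []  # stand-in for the data lake sink
prices = {"KM-1001": 9.0, "KM-2045": 25.0}
for event in enrich(only_sales(source_events()), prices):
    lake.append(event)

print(len(lake), sum(e["value"] for e in lake))  # → 2 43.0
```

Because each step consumes and produces a stream, the same shape scales from this toy generator chain up to the processing, analysis and prediction workloads the quote describes.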

Kmart Solution Architect for Enterprise Technology, Sachin Walia explained that Kmart has three technology business units – Enterprise Technology, Engineering Platforms and Integration Services – and under Enterprise Technology, a team engineers platforms to build capabilities.

“On top of that, under engineering platforms, we offer cloud hosting, data centre, developer technology and integration services platforms. Hamilton is the brand we use for offering our platforms and capabilities,” Walia said.

“With any platform we build, there are core capabilities we offer – scalability, performance, resilience, security, observability, access control, automation, self-service, and principles and patterns. For integration services, we have various capabilities, but our modern integration architecture has three core capabilities – API management, file transfers and event streaming.”

Hamilton’s event streaming platform logical architecture considers key drivers including modern integration, event-driven and real-time streaming. Through this platform, Kmart has decoupled producers and consumers and used the core capabilities of Confluent like Schema Registry and stream processing to provide the event streaming capability.
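A minimal sketch of the decoupling pattern described here: producers and consumers share only a topic and a registered event schema, loosely mimicking what Confluent’s Schema Registry enforces for Kmart. The queue, schema and field names are illustrative assumptions, not the real platform.

```python
from queue import Queue

# Illustrative "registered schema" for a stock-movement event;
# a real Schema Registry would hold versioned Avro/JSON/Protobuf
# schemas, not a Python dict.
STOCK_EVENT_SCHEMA = {"sku": str, "store_id": str, "delta": int}

topic = Queue()  # stands in for a Kafka topic

def validate(event, schema):
    """Check the event has exactly the registered fields and types."""
    return set(event) == set(schema) and all(
        isinstance(event[k], t) for k, t in schema.items()
    )

def produce(event):
    """Producer knows only the topic and schema, not the consumers."""
    if not validate(event, STOCK_EVENT_SCHEMA):
        raise ValueError("event rejected: does not match registered schema")
    topic.put(event)

def consume():
    """Consumer likewise knows only the topic and schema."""
    return topic.get()

produce({"sku": "KM-1001", "store_id": "NSW-042", "delta": -2})
print(consume()["delta"])  # → -2
```

The governance benefit is the rejection path: a producer that drifts from the registered schema fails at publish time, before malformed events ever reach downstream consumers.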

“The physical architecture for event streaming has three layers – the Confluent SaaS platform, which provides the required scalability, performance and resilience; other SaaS platforms such as Palo Alto, Sumo Logic, GitHub, Alation and Azure; and what the engineering platform team has engineered,” Walia said.

Kmart emphasised how working closely with Confluent enabled successful operationalisation and developer community support. Kmart and Confluent hold quarterly customer success reviews to discuss mutual roadmaps and success metrics; fortnightly office hours covering updates, engineering team enablement, platform capacity and performance, and support tickets; and on-demand tech talk sessions for knowledge sharing and enablement.

“Kmart has over 200 million events per day, so we have configured the Hamilton platform with Confluent Cloud across production, non-production and development environments.”