Kafka Streams applied to the OpenGaming aggregation platform

Written by Mark Andrews

Stream Processing is a powerful and modern data processing paradigm
which encourages the continuous flow and processing of data. This blog describes how it was applied to part of Light & Wonder’s (L&W) iGaming content aggregation platform to improve scalability, testability and spin times.

What is Stream Processing?

Under the Stream Processing paradigm, data is described as in motion. This idea is best contrasted against a relational database where data sits at rest. At its simplest, Stream Processing seeks to replace periodic batch processing of historic data with real-time processing of a continuous flow of current data. At its most complex, it supports the real-time interlacing of data from multiple sources to drive intelligent data insights.

A Stream Processing architecture consists of:

  • One or more producers of data.
    • Data is often best conceptualised as an event, for example: a successful casino wager or a player depositing funds
  • A platform which supports the streaming of data
  • One or more applications which consume and process data from a stream – this is where Kafka Streams comes into play!

Kafka versus Kafka Streams

Before we wade into Kafka Streams, it is worth mentioning an important prerequisite: a data streaming platform named Kafka.

Kafka is a popular open-source event streaming platform which is already in use within L&W iGaming. It is a reliable and scalable platform which can support a high throughput of data across multiple streams. Within Kafka, an individual stream of data is modelled as a topic.
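To make the topic idea concrete, here is a minimal sketch of a producer writing a wager event to a topic. The topic name, broker address and JSON payload are all illustrative assumptions, not the actual OGS schema:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class WagerProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed local broker; in production this would point at the Kafka cluster
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by game round ID means all events for a round land on the same partition,
            // preserving their relative order
            producer.send(new ProducerRecord<>("ogs-wagers", "round-42",
                    "{\"playerId\":\"p1\",\"amount\":2.50,\"status\":\"SUCCESS\"}"));
        }
    }
}
```

The choice of message key matters: Kafka only guarantees ordering within a partition, so keying by game round ID keeps each round's events in sequence.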

Kafka Streams is a client (Java) library for processing and analysing data stored in Kafka. It is dedicated to building event stream processing applications which integrate with Kafka. As the Kafka documentation notes, it builds upon important Stream Processing concepts such as properly distinguishing between event time and processing time, windowing support, and simple yet efficient management and real-time querying of state. Its aims are to:

  • Simplify the creation of real-time data processing applications via its powerful and precise API.
  • Support real-time joining and aggregation of multiple data streams on a per event basis.
  • Provide tight integration with Kafka, offering distributed, scalable and fault-tolerant data processing capabilities.
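As a taste of that API, the sketch below builds a trivial Kafka Streams topology that reads events from one topic, filters for successful ones, and forwards them to another. The application ID, topic names and the string-matching filter are assumptions for illustration only:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

public class SuccessfulWagersApp {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Consume every wager event from an input topic (topic names are illustrative)
        KStream<String, String> wagers = builder.stream("ogs-wagers");

        // Keep only events flagged as successful and forward them downstream
        wagers.filter((key, value) -> value.contains("\"status\":\"SUCCESS\""))
              .to("ogs-successful-wagers");

        Topology topology = builder.build();

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "successful-wagers-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        new KafkaStreams(topology, props).start();
    }
}
```

The DSL is declarative: we describe what should happen to each event, and the library handles partition assignment, fault tolerance and scaling across instances.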

Kafka Streams – A Real World Example

The core of the Light & Wonder aggregation platform (Open Gaming System, aka OGS) is a prime candidate for this shift in paradigm. Start treating wagers and results as events, envisage each of these events being persisted into Kafka rather than a database, and suddenly we have our transactional OGS data as an event stream ready for processing!

Additional processing, joining, and aggregating of these events can now be completely decoupled from the event itself. This opens a world of exciting new possibilities, as well as allowing us to streamline existing applications, an example of which I describe below.

Currently, to process a wager or result in OGS, the company depends on a database to carry out most of the initial online processing of the data. The logic executed to process and persist this data is loaded with complexity as it also aggregates wagers and results into game round summaries. This complexity is expensive and must be executed before the wager or result is deemed successful in the OGS system.

An approach incorporating Kafka Streams would see the initial online processing of a wager simplified into an event being produced into a Kafka topic. In this scenario, the aggregation of wagers and results into game round summaries would be delegated to a Kafka Streams application. This allows for offline, yet still real-time, aggregation of wagers and results into game round summaries.
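The delegated aggregation could look something like the following sketch, which rolls wager amounts up into a running total per game round. This is a simplified assumption of the real logic: topic names are hypothetical, events are keyed by game round ID, and amounts are modelled as plain longs (e.g. minor currency units) rather than the richer OGS event schema, which would need custom serdes:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;

public class GameRoundSummaryTopology {
    public static StreamsBuilder build() {
        StreamsBuilder builder = new StreamsBuilder();

        // Wager amounts keyed by game round ID (both assumptions for this sketch)
        KStream<String, Long> wagers = builder.stream("ogs-wagers",
                Consumed.with(Serdes.String(), Serdes.Long()));

        // Group by game round and reduce into a running total; the KTable is
        // backed by a local state store, giving fault-tolerant aggregation
        KTable<String, Long> roundTotals = wagers
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Long()))
                .reduce(Long::sum, Materialized.with(Serdes.String(), Serdes.Long()));

        // Every update to a round's total is emitted downstream as a summary event
        roundTotals.toStream()
                   .to("ogs-game-round-summaries", Produced.with(Serdes.String(), Serdes.Long()));

        return builder;
    }
}
```

Because the aggregation now lives in its own application, it can be scaled, tested and evolved independently of the online wager path, which only has to produce an event.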

The benefits of this are:

  • Quicker initial online processing of wagers and results.
  • Reduced load and complexity at the database level.
  • Greater flexibility and improved maintainability of the game round summary aggregation logic.
  • Future-proofing the availability of real-time wager and result data for new use cases.

Watch this space for future news, as L&W continues to develop and improve its event streaming and processing capabilities.

The opinions expressed in this blog post are strictly those of the author in his personal capacity. They do not purport to reflect the opinions or views of Light & Wonder or of its employees.