There has never been a more interesting time in the world of data: it is now seen as the 'new oil' of the 21st-century digital economy. Like oil, data has immense potential value and offers huge rewards to those who learn how to extract and refine it. Unlike oil, however, the volume and variety of data continue to explode. Indeed, by some estimates, in the last 10 minutes alone we created more data than we did from prehistoric times up until 2003.

Ultimately, businesses that deliver usable analytical solutions outperform their peers financially by 20%.

For many case management professionals, data analytics has traditionally centred on the data warehouse: a single data store servicing the information needs of the organisation. The key goals of this approach were to create a single repository for historical data, improve data quality, reduce information silos and provide insight into historical trends through descriptive reports and dashboards.

Although these are worthy goals, this approach has some major downsides. The most critical issues concern latency, usability and the overall complexity of building the warehouse itself. Typically, this is a long, drawn-out and expensive process, taking on average six months to deploy the first actionable artefacts.

The brave new world

In this brave new world we need data, and a lot of it. Why? Simply put, traditional 'analysis' focused on historical, summarised data, which made it extremely difficult to analyse trends across a broader spectrum of data and, as a result, to derive valuable conclusions.

Related to this is the rapid paradigm shift from traditional data warehouse solutions to in-memory architectures and on-demand cloud platforms such as Amazon Web Services (AWS) and Microsoft Azure. This new world not only offers rapid deployment but also brings a toolbox of highly interactive user interfaces, predictive analytics and machine learning to the fingertips of users, regardless of technical know-how.

When we set about designing sharedo, we also set about designing how our clients could best capitalise on their case management data. Rather than treating case management and the enterprise data warehouse as two separate entities, as is traditional, we set out to design a holistic solution. At the heart of this is our complex event processing engine. This is, in effect, our data plumbing: it ensures that every context change on every case is understood, and that each change can potentially be analysed for cause and effect. It is this foundation that enables predictive models to be put in place; the sort of predictive models that we believe can help our clients outperform their competitors.
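To make the idea concrete, the "data plumbing" described above can be sketched as a publish/subscribe event bus that records every case context change in an append-only stream for later cause-and-effect analysis. This is a minimal illustrative sketch only; the class and event names (`CaseEventBus`, `phase.changed`) are assumptions for the example, not sharedo's actual API.

```python
from collections import defaultdict
from datetime import datetime, timezone

class CaseEventBus:
    """Toy event-processing sketch: every context change is captured
    in an append-only stream and fanned out to subscribers.
    Names here are illustrative assumptions, not a real product API."""

    def __init__(self):
        self._handlers = defaultdict(list)
        self.stream = []  # append-only log of every context change

    def on(self, event_type, handler):
        """Subscribe a handler (dashboard, model, alert) to an event type."""
        self._handlers[event_type].append(handler)

    def emit(self, event_type, case_id, **context):
        """Record a case context change and notify subscribers."""
        event = {
            "type": event_type,
            "case_id": case_id,
            "at": datetime.now(timezone.utc).isoformat(),
            **context,
        }
        self.stream.append(event)  # retained for cause-and-effect analysis
        for handler in self._handlers[event_type]:
            handler(event)

bus = CaseEventBus()
alerts = []
# A downstream consumer subscribes to phase changes on any case.
bus.on("phase.changed", lambda e: alerts.append(e["case_id"]))

bus.emit("phase.changed", "CASE-001", phase="litigation")
bus.emit("document.added", "CASE-001", doc="witness-statement.pdf")
```

Because the stream retains every event, not just the latest state, downstream analytics can replay a case's full history when looking for cause and effect.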

It is at this juncture that business intelligence ceases to be a technology task practised by the few and becomes seamlessly interwoven into everyday applications, back in the control of the users who understand the data the most.

Breaking down these barriers is enabling a new breed of data exploration to evolve, known as predictive analytics, which is concerned with the prediction of future probabilities and trends. The central element is the predictive model, which can be trained on your data, learning from the complete experience of your organisation within hours. As a result, claims professionals can now benefit from real-time data-driven decision making (DDD), where intuition is supported by empirical evidence, to achieve successful, repeatable outcomes.
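As a minimal sketch of what "training a predictive model over your data" means in practice, the example below fits a tiny logistic-regression classifier to a toy set of historical cases and scores new cases with a probability of a successful outcome. The features (`days_open`, `prior_claims`) and the dataset are invented for illustration; real models would use far richer case data and an established ML library.

```python
import math

# Hypothetical toy dataset: each case is (days_open, prior_claims)
# with a binary outcome (1 = successful, repeatable result).
cases = [
    ((2.0, 0.0), 1), ((3.0, 1.0), 1), ((1.0, 0.0), 1),
    ((9.0, 4.0), 0), ((8.0, 3.0), 0), ((10.0, 5.0), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=0.1):
    """Fit a tiny logistic-regression model with gradient descent."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y  # gradient of the log-loss w.r.t. the logit
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(model, x):
    """Probability of a successful outcome for a new case."""
    w, b = model
    return sigmoid(w[0] * x[0] + w[1] * x[1] + b)

model = train(cases)
short_case = predict(model, (2.0, 0.0))   # short, low-history case
long_case = predict(model, (9.0, 4.0))    # long, claim-heavy case
```

Once trained, the model turns intuition ("long-running cases with many prior claims tend to go badly") into an empirical score that can sit alongside a professional's judgement in real time.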