Chris Weston (Principal, European Client Advisory)
Marc Dowd (Principal, European Client Advisory)

As most organizations begin to prepare for a new operational reality where issues such as resiliency, agility, cost awareness and governance will be key driving factors, there is a renewed focus on the use of data to drive decision making and process improvement. In principle, data is a valuable asset; however, for many organizations, collecting large amounts of data has turned it into a liability, and organizations often struggle to balance the business benefits and technical challenges associated with their Big Data and Analytics initiatives.

For the past two years, IDC has played a key role in an EU-funded project to develop a set of business and technical metrics/KPIs to help organizations benchmark both the business outcomes and technical elements of their Big Data investments. The overall goal of the project, DataBench (https://www.databench.eu/), is to provide European organizations with a framework and set of tools to integrate business and technical benchmarking approaches for Big Data technology investments.

As part of this, the team has surveyed over 700 European organizations across various industries and sizes on their Big Data investments, developed over 700 qualitative use cases across these industries, examined the progress of over 40 large-scale big-data technology projects, and carried out an in-depth analysis of specific use-cases with organizations such as Whirlpool, Intel, CaixaBank and the European Space Agency.

In general, those who have invested in Big Data technologies have achieved significant improvements in key business metrics such as revenue, costs and profits. While performance varies across industries and use cases, on average we are seeing a 4%–8% improvement in these metrics. When viewed in terms of more qualitative metrics (e.g. customer satisfaction, service quality, business model innovation), we see many organizations achieving improvements in the range of 10%–24%, which is a major achievement.

Technical Benchmarking Should Not Be Ignored

However, there is also a clear warning: failure to adequately address the technical elements of the initiative early on can seriously reduce its overall effectiveness. Most organizations believe that technical benchmarking of their Big Data initiatives requires very specialized skills and is inherently expensive, so few have carried out extensive or accurate benchmarking. In this context, cloud solutions for Big Data tend to be preferred as they are deemed to provide easier access to a wider set of technologies. That said, these organizations acknowledge that the variety and complexity of available Big Data solutions bring with them several key risks:

  • The risk of choosing a technology that, over time, proves to be non-scalable either technologically or economically
  • The risk of Cloud vendor lock-in and the risks associated with migration to another platform
  • The risk of Cloud services being more expensive than expected, particularly when scaled-up, and technology expenses outweighing business benefits

Real-Life Cases Underline The Complexity

During the course of the project, a number of real-life implementations were examined to gain better insight into the challenges of balancing business outcomes with technical requirements. Two of these cases, from the retail sector, provide a clear indication of just how difficult this balancing act can be:

Case 1 Intelligent Fulfillment: A retail player implemented a complex AI and Machine Learning system (sales prediction) to optimize assortment selection and automated fulfillment at an individual shop level. The system was piloted in one shop and delivered a 5% increase in margin, equivalent to about €5 million annually. Despite this, the project is currently on hold due to the economics of scaling up the initiative. The system ran on Apache Spark in a commercial cloud environment, but when scaled across hundreds of product categories, shops and system runs, the total cost of achieving the optimized assortment outweighed the business benefits.
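The scaling economics in this case can be illustrated with a simple back-of-the-envelope check. Apart from the roughly €5 million pilot benefit quoted above, every figure below (per-run cost, shop and category counts, run frequency) is a hypothetical assumption: the point is that even a modest per-run cost, multiplied across shops, categories and repeated runs, can overtake the benefit.

```python
# Back-of-the-envelope viability check for scaling the pilot described
# above. The EUR 5M pilot benefit comes from the article; every other
# figure here is a hypothetical assumption for illustration only.

def annual_compute_cost(cost_per_run_eur, runs_per_week, shops, categories):
    """Yearly cloud bill if every shop/category pair is re-optimized."""
    return cost_per_run_eur * runs_per_week * 52 * shops * categories

def is_viable(annual_benefit_eur, annual_cost_eur):
    """Scaling only makes sense while benefits still exceed compute costs."""
    return annual_benefit_eur > annual_cost_eur

# Illustrative scale-up: 300 shops x 200 categories, one run per week,
# at an assumed EUR 2 per optimization run.
cost = annual_compute_cost(2.0, 1, shops=300, categories=200)  # 6,240,000.0
print(f"annual compute cost: EUR {cost:,.0f}")
```

At these assumed numbers the compute bill alone exceeds €6 million a year, which is why the per-run cost of the optimization, not the model quality, becomes the deciding factor at scale.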

Case 2 Online Recommendation System: Another retail player developed a system to provide personalized recommendations at an individual customer level. The system was designed to increase margin through targeted upsell and cross-sell recommendations. When trialled, the system was seen to generate a 3%-4% improvement in margin. However, across all product lines, the data tables required to support the cross-sell recommendations became too large to be deployed on an on-premise hardware appliance. To address this, the retailer simplified the targeting to the client segment level rather than individual client level – but these less personalized recommendations were seen to generate 90% fewer clicks by customers on the products being recommended, resulting in lower upsell and cross-sell success.
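Why the individual-level tables outgrew the appliance can be seen from rough sizing arithmetic: the score table grows with the number of entities times the number of products. The counts below are hypothetical assumptions, not the retailer's actual volumes.

```python
# Rough sizing of a cross-sell score table: one row per (entity, product)
# pair. All counts below are hypothetical assumptions for illustration;
# the article does not disclose the retailer's actual volumes.

def table_rows(entities: int, products: int) -> int:
    """Number of score rows when every entity/product pair gets a score."""
    return entities * products

customers, segments, products = 10_000_000, 200, 50_000

individual_rows = table_rows(customers, products)  # 500,000,000,000
segment_rows = table_rows(segments, products)      # 10,000,000

print(f"individual-level is {individual_rows // segment_rows:,}x larger")
```

Collapsing from customers to segments shrinks the table by the ratio of customers to segments, which is exactly the trade the retailer made, at the cost of far less personalized (and far less clicked) recommendations.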

To underline how effective technical benchmarking can make a significant difference to the overall success of an initiative, for each of these use cases we compared the cost of the cheapest and the most expensive machine configuration that achieved the optimal result with a major cloud provider. In both cases, the cheapest option offered savings of over 70%. This level of saving could change the dynamics of the business case and, in some cases, make the difference between scaling up and shelving the project.
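This kind of configuration comparison can be sketched as follows. The configuration names, prices and runtimes are invented for illustration; the point is that the cost of a job is hourly price times runtime, so neither the cheapest-per-hour nor the fastest machine is necessarily the cheapest per job.

```python
# Hypothetical comparison of cloud machine configurations for one fixed
# batch job; names, prices and runtimes are illustrative assumptions,
# not DataBench benchmark results.

from dataclasses import dataclass

@dataclass
class Config:
    name: str
    price_per_hour: float  # EUR per hour for the whole cluster
    runtime_hours: float   # benchmarked wall-clock time for the job

    @property
    def job_cost(self) -> float:
        return self.price_per_hour * self.runtime_hours

candidates = [
    Config("4x medium", 1.20, 10.0),  # cheap per hour, slow: EUR 12.00/job
    Config("8x medium", 2.40, 4.0),   # scales well: EUR 9.60/job
    Config("4x huge", 12.00, 3.0),    # fastest, priciest: EUR 36.00/job
]

cheapest = min(candidates, key=lambda c: c.job_cost)
priciest = max(candidates, key=lambda c: c.job_cost)
savings = 1 - cheapest.job_cost / priciest.job_cost
print(f"pick {cheapest.name}: {savings:.0%} cheaper than {priciest.name}")
```

With these illustrative numbers the gap between the cheapest and most expensive option is over 70%, the same order of saving the project observed, and it only becomes visible by benchmarking actual runtimes rather than comparing hourly prices.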

Star Performers Think Differently

We have also identified over 30 organizations that we perceive as “Star Performers”: organizations that have achieved above-average results from their Big Data initiatives. What will likely be of most interest to CIOs and business managers is what sets a Star Performer apart from the rest, and what they do that others do less well or not at all. This is what we have found. Star Performers:

  • are more likely to be large organizations
  • are generally more mature in their Big Data usage and can often be seen as “early adopters”
  • set more ambitious business goals and have higher tolerance for risk
  • place more emphasis on achieving a wider range of business goals but are very focused on the goals that are chosen
  • do not perform any better than the rest in terms of using Big Data and analytics for cost reduction, but perform better in terms of revenue and margin improvements
  • are prepared to implement technically complex architectures for sharing data outside the company
  • are much less likely to implement enterprise-wide data lakes, and much more likely to use streaming real-time data in combination with contextual data

Interested? Try The Self-Assessment

So if you are considering new investments in Big Data and Analytics and are puzzling over some or all of these issues, we have some great insights to share. Better still, try the Self-Assessment Survey (https://www.databench.eu/self-assessment-survey/) and see how you stack up against your peers.
