The majority of Big Data projects fail to meet their targeted return on investment – and it might even be a substantial majority.
There: I said it. And I do Big Data projects for a living. I work for a company that considers Big Data and analytics to be essential to the success of every enterprise embarking on a digital transformation journey. But the fact remains that most Big Data projects don’t generate the kinds of ROI their architects envision and their C-suites desire.
There are loads of reasons why this is true. At the easy end of the spectrum are unrealistic targets: if you decide that Big Data is going to double sales or triple your valuation, you have almost certainly set your sights too high. At the more difficult end is the hope that Big Data can rescue an unclear or misguided enterprise strategy; even a fantastic Big Data implementation can’t make up for a bad one (though it might help).
However, I believe the biggest factor determining ROI on Big Data projects is cultural. It has to do with how people think of data within the enterprise, and that’s a result of this simple dynamic: We have moved from a world in which data was a relatively scarce resource to a world in which data is a massively abundant one.
Data scarcity
Think about it this way: In the old days (and the old days were not that long ago), most data was generated by an enterprise’s existing systems. Data was limited, both in type and volume; most of it took the form of structured database entries: customer names, addresses, account numbers, orders, service calls, and so on. A company’s IT team would capture and store that data, and business users would dream up the reports they wanted, send their requests to IT or the database manager, and wait. It might take days, it might take weeks, but eventually the reports would come back. Then they’d repeat the exercise.
In these old days, data was mostly inaccessible to the people who actually needed it most: managers and marketers trying to make the best, most informed decisions possible. This made data scarce. And for big, global companies, the inaccessibility was compounded by different divisions and different businesses collecting slightly different bits of information in slightly different formats.
Then something happened: data became abundant – at times overwhelmingly so. The cost of computing and storage went down (dramatically). The volume of data went up (also dramatically). In addition to all the traditional data like customer records and accounting information, enterprises began to collect massive volumes of unstructured data from clickstreams, social media, and third-party information vendors. In short order, data moved from relative scarcity to overwhelming abundance. This was the Big Data explosion. And it demanded new approaches to storing data (data centres, data warehouses, data lakes, and scalable cloud storage) and to analysing it (a fantastic array of analytical tools and approaches focused on speed and flexibility).
From a data user’s point of view, this is the best of times; from a data user’s point of view, this is the worst of times. The best because a potentially valuable resource is more valuable than ever before: Data at speed is a huge source of competitive advantage. The worst because a data lake is deep and it’s harder than ever to find what you need, or even to understand what you need to look for. Every data user knows, or suspects, that data assets are woefully underused.
To the enterprise, the state of data today is like a buffet in Las Vegas: you don’t know where to start – and you’re not just overwhelmed, you’re paralysed. It’s the tyranny of choice on steroids.
Changing the mindset
What should you do? You should change your mindset and acknowledge that the data revolution is about pervasiveness and access, not restrictiveness and control.
One of Brillio’s clients has taken this mindset change to heart. A global giant with enormous volumes of historical data, our client has a data lake that’s getting bigger all the time. This enterprise also had in place a typical division between the IT department and business units around the world. But as they realised the potential value of their data resources and the pressing need to be flexible and fast, they embraced the new pervasive mindset.
This led them to create a new organisational unit for data analytics. They’ve given executives from around the world the ability to self-serve their data needs. They’ve broken down the siloed approach that separated businesses and their data. Their enterprise data belongs to anyone and everyone who needs it (subject to compliance and regulatory constraints, of course). The progress is already apparent in terms of speed: It used to take 15 to 20 days for a user or team to get access to a specific dataset; now, they can get it in 10 minutes. This is a huge productivity advantage for our client, giving their managers more reasons to probe the data and see what it might deliver, more opportunities to focus on insights and actions.
In today’s data-rich, data-driven world, the new mandate is to get data out of the IT universe and embed it deeply throughout the enterprise from end to end. Understanding data and applying it to every part of an organisation’s structure becomes everybody’s business.
Put another way: Data isn’t fodder for a turf war or a tool of control, and it’s not the sole purview of the IT department. Data is a massively shared resource that fuels collaboration, catalyses insights, and generates opportunity. Your people should be applying their talents to understanding data’s enormous potential to reveal hidden patterns and unveil new possibilities. Data and related analytics provide long-term insights and trustworthy wisdom.
When you change the way you think about data, you change the way you use data – for the better.
Naresh Agarwal, Head, Brillio Data Practice