Big Data analytics is becoming increasingly important for both small and medium-sized enterprises (SMEs) and larger corporates. This trend is set to continue, with US research firm International Data Corporation (IDC) predicting that businesses globally will increase their spending on data analytics by 50% between 2015 and 2019, taking the total spend to US$187bn.
Despite this huge spending increase, however, the management and use of analytics has not improved in step. A recently released report from Mu Sigma on the state of data analytics in large organisations found that there are no hard and fast rules on who is responsible for analytics or how to approach it. Two in three business leaders agree that analytics benefits business growth, but in many cases it still sits below other functions, such as finance and corporate IT, in terms of overall priorities.
What’s clear is that businesses are making a good start with data analytics, but they are missing out on many opportunities. Confusion over who is in charge of data, how and why to collect it, and what to do with it once collected is holding many large enterprises back from making the most of it.
So what’s behind this? The following are a few root causes of the problems large enterprises face with data analytics:
Data or desired outcome: the chicken and egg problem
Three in four of the companies covered by the Mu Sigma survey collect data first, then work out what to do with it. Only one in four establish a problem first, then collect the data needed to help solve it. At a time when the rate of business change is faster than ever, too many organisations spend too much time on ‘what’ is happening in their business rather than on the critical questions of ‘why’ and ‘what next’. It would appear too many are guilty of collecting data for data’s sake, not because they’ve found a use for it.
Governance model: one size doesn’t always fit all
The analytics model used within large enterprises varies wildly, with a centralised model being the most common: 44% of businesses have a single unit that provides analytics for the entire company. This is seen most often where the chief information officer (CIO) owns analytics. Less common is a decentralised model, with each individual department responsible for its own analytics. The final, and least common, option is a federated one: a combination of the two. Despite being the least favoured model, it provides the flexibility to give individual units power over their own decisions, while retaining control at the business level.
Who’s in charge here?
There is also a diverse landscape when it comes to the person in charge of analytics, which can have a profound impact on business success. Only a quarter of the companies surveyed put a specialist, such as a chief data scientist, chief data officer or chief analytics officer, in charge. Without a specialist role, data analytics might fall into the camp of the CIO, chief financial officer (CFO) or chief marketing officer (CMO). Sometimes it is spread between these departments, only adding to the confusion. Furthermore, if proof were needed of the value of having a specialist in place, the research shows that successful firms – those that have exceeded stakeholder expectations – are three times more likely to have a chief analytics or data officer at the helm.
Elements of a top performer
In conclusion, of all the organisations that took part in the research, over half said they either still lack a clear roadmap of analytical business problems for the year ahead or have room for improvement. For those looking to address these shortcomings, the research identified four elements shared by the top performers:
1. A formal governance structure, with someone at the top accountable but individual departments also given the power to run analytics.
2. Measurement not just of the data but of the success of analytics, to make sure data isn’t being collected for data’s sake.
3. A clear analytical roadmap that reflects the connections between departments; they start with a problem first.
4. Insight communicated across the business – not just at board level – making it more actionable and consumable.
This might sound complicated, but it really comes down to open communication and, as is often the case, remembering that a large, interconnected enterprise may need a combination of approaches.
* GTNews will be publishing further articles on the topic of Big Data in October 2016.