Over the past 10 years, front-end systems have attracted the lion’s share of IT investment. Low-latency, high-speed automation has been the ‘big-money’ game. Trading has gone electronic, international, multi-asset and cross-venue. As returns from commoditised long-only investments decrease, firms are looking to more complex trading and investment strategies in the search for higher yields. At the same time regulatory change is firmly on the agenda, and transparency and risk management have become the watchwords of the financial markets.
Greater Data Demand
The amount of data needed by the average firm has exploded on every front. More venues, portfolios, customisation, indices and data-dependent asset classes have driven up volumes. Valuations are now needed daily or even intraday – not monthly or quarterly. Balance sheet information, sales reports, regional economic projections and staff track records are becoming as important as fundamental and technical data. Even if the big-name aggregators could provide all that, other new sources would still be essential to gain competitive edge.
Hedge funds have recognised this for years: they regard non-traditional information as a major asset. But what has really changed is the swell of operational complexity in processing these increased volumes. There is much greater demand for real-time understanding of valuations, exposures and risk.
Both investors and regulators want more transparency and proof that management has put adequate procedures, controls and risk checks in place – along with robust audit trails, operational oversight, and accurate and timely reporting. Regulatory arbitrage is out of the question; demonstrating a consistent approach, whether to pricing or risk management, is unavoidable.
The Problem of Volumes and Static Solutions
In short, firms have to get more data, do more with it, more often, and in a shorter timeframe. It is no longer something that can be avoided, ignored or delegated down the chain of command. A culture of data governance is needed to create robust processes around every aspect of data sourcing, selection and deployment, and to make sure they are adhered to. And just like corporate governance it needs to go all the way to the top of the organisation.
But here’s the first problem: fundamentally, investment systems were never built for this volume of data and complexity. Risk management, trade processing or accounting platforms weren’t built to cope with the daily information onslaught. After a decade of spending on the glamorous front end, the investment focus has to switch back-stage to data management solutions that are needed to power trading, risk, compliance, modelling and accounting platforms.
The second problem, which is by far the bigger of the two, is that the one-size-fits-all, static solution that many vendors prescribe will not solve all data-related problems across the enterprise. These are not solutions; they are another problem waiting to happen.
From One Size Fits All to Fit for Purpose
Any data management infrastructure has to be appropriate for the size of the firm and the type of operation. The solution that is right for a 40-person hedge fund is very different from the solution needed by a global custodian with thousands of customers and tens of thousands of employees. Most firms have multiple business units, product lines and investment strategies – all of which require different data sets used in different ways. Compliance and risk management will need different data sets from the trading desk. Operations want data on actual holdings, while analysts use it for modelling, stress-testing and ‘what if’ scenarios. Clearly there’s no single solution for these distinct requirements.
Of course it is critical that every department operates from the same accurate, verified and unified data source. You can’t have accounting and trading working with different numbers. Data is now a strategic concern with a large number of touch points across the business whose requirements are changing much more quickly than ever before. The idea that you can impose a monolithic data management structure, with a single data set and a single management tool, onto the modern business with all its complexities is manifestly flawed.
With too many objectives, such a system is over-reaching, over-ambitious and over-complicated. It takes too long to implement and only solves the problems faced at the beginning of the project – not those at the end. No wonder it has failed so often. With data volume on an irreversible upward trend, it’s only a matter of time before such solutions collapse under their own weight.
Finding the Right Balance
In the real world, different parts of the firm have different IT infrastructures and operational structures and are trying to solve different problems. Where one wants to expand the asset classes or the geographical reach of its products, another will need to address compliance issues or reporting challenges. Rather than trying to solve everything at once, it is more logical to take a prioritised approach: address immediate business problems, get the data right, deliver returns and move on to the next challenge.
That requires a degree of pragmatism that is often missing from the world of data management. Meeting different business requirements demands a dynamic, federated model that brings together key enterprise data in a centralised environment, while also enabling individual units to own their data rules, mandates and preferences. Instead of relying on centralised control alone, data is distributed and made available in a truly actionable and accessible form.
However, it is critical not to lose sight of the broader issues. You can’t have lightweight, siloed systems that will not stand the rigours of today’s volumes – and tomorrow’s. You need much more flexibility and scalability than that.
Keeping Sight of the Long-term
Vendors need to engage with their customers and understand their problems and their long-term strategy, but few do. These are not systems to be changed every five years; rather, they are fundamental to business infrastructure and its ongoing success. A down-and-dirty implementation of an out-of-the-box system might look like a quick and easy solution, but it isn’t – it’s just sticking plaster.
This is no time for inadequate solutions and temporary palliatives. There’s no question that volumes will keep growing, regulatory pressures will increase and operational complexities will multiply. It will continue to become harder to mine, manage and make data useful without significant investments in automation. However, accurate, accessible and actionable information will remain essential for gaining competitive advantage and higher returns in today’s trading and investment environment. This is why it is time to call out old-fashioned ideas about data management and expose them for the myths that they are.