Throughout 2009, the phrase ‘escalated to board-level’ arose in most market data conversations. This was inevitable: market data is renowned as the second-highest single-item cost for market participants, and budgets are coming under intense scrutiny. Market data is the lifeblood of the financial markets, and minimising latency in its delivery is a key differentiator – although this can be costly to achieve, particularly for smaller players.
In the current climate, cutting costs has never been a more pertinent issue. With volumes of market data skyrocketing – in part due to high-frequency trading and increased fragmentation of liquidity across venues – both sell-side and buy-side institutions need leaner and more efficient market data infrastructures to drive out redundant costs and meet continuously changing market requirements.
M&A – A Driving Force?
It was believed that the economic crisis would result in a flurry of M&A activity in 2008/9, as financial institutions looked to increase market share through the acquisition of their weakened counterparts. But the economists got this one wrong – deal volumes dropped, along with deal sizes, as uncertainty rose and government regulation took effect. However, the tide may well be turning on this trend; according to recent research by PwC1, analysts are predicting an increase in deal activity in Europe in 2010. Consolidation will inevitably prompt an evaluation of infrastructure, in order to rationalise or replace technology in the most effective manner.
Driving Down Cost – Evaluating Existing Infrastructure
All too often in the past, a financial institution’s strategy to upgrade its infrastructure has been based on the age of the current technology, and a belief in ‘spend now, save later’. But today, with a closer eye on cost, and a ‘do more with less’ ethos, infrastructure is being ‘trended’ to see just what the system is able to handle, and if an upgrade is really necessary to handle the predicted levels of data traffic.
Market data is typically charged on a per-head basis; yet, following the headcount reductions most institutions have made over the past 18 months, the upkeep of user profiles is often neglected. As a result, an institution may be over- or under-reporting data subscribers, paying incorrect fees, and be in breach of exchange and data vendor licensing agreements.
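The reconciliation exercise described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration – the user names, fee figure and function are invented for the example, not drawn from any vendor's actual reporting format – showing how stale entitlement profiles (billed but no longer active) and unreported users can be flagged by comparing a vendor entitlement list against the current staff directory.

```python
# Hypothetical sketch: reconciling per-head market data entitlements
# against the current staff directory. All names and fees are illustrative.

# Users currently employed and entitled to receive market data
ACTIVE_STAFF = {"asmith", "bjones", "cwhite"}

# Entitlement profiles as reported to the data vendor (user -> monthly fee)
ENTITLEMENTS = {
    "asmith": 120.0,
    "bjones": 120.0,
    "dgreen": 120.0,   # left the firm; profile never removed
}

def reconcile(staff, entitlements):
    """Return (stale profiles still being billed, active users with no profile)."""
    stale = {user: fee for user, fee in entitlements.items() if user not in staff}
    unreported = staff - entitlements.keys()
    return stale, unreported

stale, unreported = reconcile(ACTIVE_STAFF, ENTITLEMENTS)
overpaid = sum(stale.values())  # monthly fees paid for departed users
```

In this toy case the firm is overpaying for one departed user ("dgreen") while under-reporting another ("cwhite") – precisely the combination of excess fees and licensing-agreement breach the paragraph describes.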
When evaluating infrastructure, the key intelligence is in benchmarking against comparable peers. As this information is not readily available, it is usually only through the use of a third-party consultant that such analytics are provided. The market data cycle needs to be assessed in a real-life trading environment – not in an isolated, static simulation – to take a holistic view and evaluate the speed of the entire system, end-to-end.
Back to the Future – Managed Accounts
Since the late 1990s, banks and financial institutions have taken the approach of bringing as many operations in-house as possible, in an effort to retain control and in the false hope of minimising cost. However, the industry is currently undergoing an attitude shift, perhaps due to a generational change, with an increasing number of financial institutions adopting the outsourced model – enabling them to refocus on their core competencies, which have never included IT.
More advanced technology, with heightened security and connectivity, is the enabler for this shift; cost-cutting, however, is undoubtedly the catalyst. While the Tier 1 institutions, with the highest consumption of market data, can demand the best deals from suppliers, smaller institutions, such as Tier 2 banks and hedge funds, don’t have this power. Through outsourcing, these smaller players can realise economies of scale that would otherwise be unattainable.
Cost reduction, achieved through shared infrastructure and support services, is the predominant initial reason an institution chooses to outsource its market data; however, this is only one of many tangible benefits. Others include:
Speed to market: Start-ups, looking for rapid market entry (often with a narrow window of opportunity to capitalise on a differentiator) are keen adopters of the managed service model. Due to a lack of expertise in IT, legal and systems integration – and a lack of interest in becoming experts – these start-ups, most notably hedge funds, want to focus solely on generating alpha. A new infrastructure can be delivered and be operating in days, not months.
Scalability: Market data provision can be scaled according to the need – the ‘we grow with you’ model. This is also effective in reverse – as an organisation needs to reduce its market data consumption, this is more easily achieved with a managed service.
Future-proof: Continually upgrading to the newest and fastest technology, to chase ever-lower latency, is a process only the biggest institutions can commit to. Upgrades involve development, integration, testing and deployment, by experts. An independent market data-hosting provider can undertake these upgrades themselves, and its clients reap the benefits of using leading edge infrastructure.
Expertise: Handing the management of market data to seasoned experts is one of the key benefits gained through outsourcing. Teams of engineers and consultants are available on-demand, not on the payroll. Therefore, if a problem arises, or the scope of work changes, these experts, who have encountered most situations before, are on hand.
Resources: Market data management is a labour-intensive, ongoing process that, if carried out in-house, requires market data architects, engineers (application and network), support staff, administrators and application developers – a staffing burden a managed service removes.
While some financial institutions – with inherent control issues – will continue to shy away from outsourcing, the tide has unquestionably turned on attitudes to, and adoption of, managed services. As M&A activity in the financial services market picks up once again, we will see even more firms turning to outsourced experts for non-core functions, including the optimisation of market data delivery.
Prevention is better than cure: evaluating market data infrastructure to optimise usage is key for organisations of any size, from the biggest players to the smallest hedge funds. Even for those that do not need ultra-low-latency market data (such as custodians, whose use of market data is for closing valuations), it’s important to review the existing infrastructure – a health check can quickly assess whether it will continue to meet their needs even as volumes grow. Regardless of strategy, market data is, and will continue to be, the lifeblood of the financial markets.
1. PricewaterhouseCoopers: European financial services M&A insight, October 2009