Regulation: The Harbinger of New Technology

Many of these new rules have gone through long and complex drafting processes and are only now reaching the implementation phase. The final version of Dodd-Frank’s Volcker Rule, for example, was ratified in the US in December 2013, after protracted negotiations lasting four years. Central clearing of derivatives under the new rules came into effect in July 2013.

In Europe, meanwhile, the European Market Infrastructure Regulation (EMIR) will begin to take effect in 2014, with the first implementation deadlines falling in mid-February. Again in Europe, work continues on the second generation of the Markets in Financial Instruments Directive (MiFID II). In the insurance sector, the Solvency II capital adequacy directive was ratified by the European Parliament (EP) in late 2013, so insurers and their asset managers now have a final set of rules with which they must comply.

A commonality among these new rules is the increased need for reporting – to clients and regulators, and hence also to management. This requires companies to have a firm grasp of the data they use. It is hardly surprising that regulators internationally are taking a growing interest in the role of data and enterprise-wide data management. A well-managed, clean, central source of data is essential to complying with all three pillars of Solvency II, for example. Industry-wide initiatives are under way in tandem to remedy counterparty risk issues, such as legal entity identifiers (LEIs), which assign a consistent code to each entity involved in a financial transaction so that it can be identified quickly – critical if a counterparty collapses or runs into financial difficulty.
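To make the LEI mechanism concrete: under ISO 17442, an LEI is a 20-character alphanumeric code whose final two characters are check digits validated with the ISO 7064 MOD 97-10 scheme (the same check family used for IBANs). The sketch below is a minimal, illustrative validator in Python; the function name is ours, not part of any standard or library.

```python
import re

# An LEI is exactly 20 uppercase letters and digits.
_LEI_RE = re.compile(r"^[A-Z0-9]{20}$")

def is_valid_lei(lei: str) -> bool:
    """Return True if `lei` is a structurally valid ISO 17442 LEI.

    Validation maps each character to a number (digits stay as-is,
    letters become A=10 ... Z=35), concatenates the results, and
    checks that the whole value is congruent to 1 modulo 97
    (ISO 7064 MOD 97-10).
    """
    lei = lei.strip().upper()
    if not _LEI_RE.fullmatch(lei):
        return False
    # int(c, 36) yields 0-9 for digits and 10-35 for letters A-Z.
    numeric = "".join(str(int(c, 36)) for c in lei)
    return int(numeric) % 97 == 1
```

Note that this checks only the code’s structure; confirming that an LEI is actually registered to a live entity requires a lookup against the official LEI registry.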

Sophisticated and robust technology is essential for companies to manage risk effectively, handle data, and operate in current conditions. However, many financial services companies’ technology platforms are woefully outdated and unable to cope with the demands of modern investment markets, let alone the scope of new regulatory requirements such as EMIR and MiFID II and the data management and reporting demands they entail.

Weakness of Legacy Systems

Recent research has shown that as many as one in four investment management firms’ core business operations rely on systems that are infrequently updated, based on poorly documented or obscure technologies, and unable to provide a real-time consolidated view of the business – so-called legacy systems. These systems, many of which were developed decades ago, were typically designed to support only a limited set of department-specific functions and a narrow range of instruments, rarely including even simple derivatives. Over recent years in particular, however, investable asset classes have expanded hugely; emerging markets securities and derivatives, for example, have become mainstream.

Many technology systems predate such shifts and are unable to cope. As a result, many legacy systems have been supplemented with spreadsheets and similar tools, producing a complex patchwork of technologies that makes it very difficult to obtain a consistent, reliable understanding of risk – and without that understanding, risk cannot be mitigated.

The evolution of risk management requirements – driven by the need for companies to protect themselves against many forms of risk – thus compounds the need for up-to-date, automated technology within financial services companies.

This is not a side issue that can be buried in a subsidiary business function or the IT department. Legacy systems that cannot monitor risk properly can create internal issues that put the firm itself at risk, with devastating consequences. The investigation by Louis Freeh and his law firm (the Freeh report) into the collapse of MF Global in October 2011 – the largest Wall Street bankruptcy since the financial crisis – identified ‘antiquated’ and ‘fragmented’ systems as major contributors to the company’s problems.

So given both the business and the regulatory imperative to replace legacy systems, why the reluctance to change? The internal barriers most often cited are the complexity of decommissioning old systems that are interwoven into company infrastructure and the expected cost of change.

However, the perceived savings from relying on ageing technology rather than replacing it with a state-of-the-art platform can prove a false economy, hampering both regulatory compliance and longer-term business prospects as essential operational processes become outdated and costly workarounds are needed. The SimCorp Legacy study references a global survey of buy-side firms conducted earlier this year, which showed that more than half of respondents at legacy-system firms (56%) have had to increase their overall IT operations budgets. By contrast, 60% of respondents with state-of-the-art technology systems plan to maintain or decrease IT operations spend.

2014 will be a milestone year from a regulatory perspective, with the implementation of several key directives and rules that have been years in the drafting. Companies that attempt to comply using a patchwork of manual processes and outmoded systems will find it impossible not only to meet regulators’ demands, but even to compete in an environment where risk management sits at the heart of the business.
