The global financial crisis of 2008 highlighted a lack of understanding on the part of banks, investment managers and corporate treasury departments about the legal structures of their counterparties. In particular, the collapse of Lehman Brothers demonstrated that if legal entity information had been more readily available and trusted, the cascade of failures across the chain of financial institutions (FIs) could have been lessened, or even avoided.
This experience has resulted in a regulatory initiative, endorsed by the Group of Twenty (G20), known as the Legal Entity Identifier (LEI) initiative. The LEI, a unique identification system for parties to financial transactions, is intended to help regulators and market participants monitor and mitigate the build-up of systemic risk. It will bring business benefits too. Post-crisis, the lack of standardised identifiers for entities, coupled with the absence of an authoritative source of legal hierarchy structures, is driving up the cost of doing business at a time when institutions can ill afford further demands on their bottom line.
Implementing the LEI must start with accurate and compliant data, which is fast becoming the holy grail of the financial markets. The sheer volume and questionable quality of data that exists in the financial markets is causing both reputational and operational harm. Indeed, according to a study by the Bank for International Settlements (BIS), four of the top seven reasons for failed settlements are linked to flawed counterparty data.
Organisations currently use a variety of methods to gather, organise and manage entity identifier information across the huge number of de facto standards that exist today. This inevitably leads to duplicates, inconsistencies and erroneous mappings seeping into the data. Over time, records grow stale and the processing of actions becomes increasingly complex. This is exacerbated by the fact that entities – and their relationships to their own subsidiaries – may have undergone multiple changes in their legal structures that have not been reflected in the information currently held by corporates, leading to further data irregularities and heightened business risk. Together, these issues mean that data management has become costly and time-consuming and, too often, has failed to keep up with the new demands and challenges of the financial markets. Unwinding Lehman's trade positions and derivatives after the September 2008 crash, for instance, proved to be a nightmare.
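To illustrate how duplicates and inconsistencies of this kind might be surfaced, the sketch below groups counterparty records by a normalised form of their legal name. This is a hypothetical, simplified approach, not part of the LEI standard: the suffix list and matching rule are illustrative assumptions, and real entity-matching systems use far richer rules (addresses, registration numbers, fuzzy matching).

```python
import re
from collections import defaultdict

# Illustrative list of corporate suffixes to ignore when comparing names.
LEGAL_SUFFIXES = {"ltd", "limited", "inc", "plc", "llc", "gmbh", "sa", "ag"}

def normalise(name: str) -> str:
    """Lower-case, strip punctuation, and drop common legal-form suffixes."""
    tokens = re.sub(r"[^a-z0-9 ]", " ", name.lower()).split()
    return " ".join(t for t in tokens if t not in LEGAL_SUFFIXES)

def find_duplicates(records):
    """Group (record_id, legal_name) pairs whose normalised names collide."""
    groups = defaultdict(list)
    for rec_id, name in records:
        groups[normalise(name)].append(rec_id)
    return {name: ids for name, ids in groups.items() if len(ids) > 1}

records = [
    (101, "Acme Holdings Ltd."),
    (102, "ACME HOLDINGS LIMITED"),
    (103, "Example Bank PLC"),
]
print(find_duplicates(records))  # {'acme holdings': [101, 102]}
```

Even a crude pass like this tends to reveal the consolidated exposures the article goes on to describe: two internal records that look distinct may in fact be the same legal entity.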
Benefits of the Standardised LEI
The standardised LEI is now encouraging market participants to review their legal entity structures and is driving an emphasis on the need for clean, accurate data. For many corporates, preparing for the LEI will involve a data cleansing process that will undoubtedly surface fundamental issues that must be dealt with. For example, if duplications are found and exposures consolidated, this may reveal larger or smaller exposures than expected. A firm may need to trade its way into bringing the exposure to an acceptable level.
The anticipated result of the LEI initiative is that, eventually, all firms will require and register for an LEI. Those with clean data will spend less time and fewer resources on implementation, so organisations should focus now on examining the quality of their data in order to build a comprehensive view of their database, their product usage and their requirements. To ensure compliance with post-crisis regulatory measures that require an LEI or equivalent, such as the trade reporting requirement in the European Market Infrastructure Regulation (EMIR), corporates will need to know the LEI of the parent company of the counterparty, beneficiary or debtor.
As the global LEI becomes more widely used, organisations should see cost reductions and improved risk management, both at the firm level and across their data systems. These savings would come primarily from operational efficiencies, such as reducing the volume of transaction failures and lowering data reconciliation and regulatory reporting costs, while simultaneously providing better tools for enterprise risk analysis, the importance of which cannot be overstated.
Many institutions have already begun the process of analysing their current data architectures, determining where entity data is present, matching their internal records to LEIs, and identifying data quality issues within their data. Those who move most quickly to ensure the accuracy of their data will emerge as the winners.
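One concrete quality check institutions can run while matching internal records to LEIs is format validation. Under ISO 17442, an LEI is a 20-character alphanumeric code whose last two characters are check digits computed with the ISO 7064 MOD 97-10 scheme (letters mapped A=10 … Z=35, and the full numeric string must leave remainder 1 modulo 97). The sketch below is a minimal validator based on that published scheme; the function names and the sample 18-character prefix are illustrative.

```python
def _to_digits(s: str) -> int:
    # Map each character to its base-36 value: '0'-'9' -> 0-9, 'A'-'Z' -> 10-35,
    # then read the concatenation as one large integer.
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(prefix18: str) -> str:
    """Compute the two ISO 7064 MOD 97-10 check digits for an 18-char prefix."""
    remainder = _to_digits(prefix18.upper() + "00") % 97
    return f"{98 - remainder:02d}"

def is_valid_lei(lei: str) -> bool:
    """True if the code is 20 alphanumeric characters and passes MOD 97-10."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    return _to_digits(lei.upper()) % 97 == 1

# Illustrative prefix (not a registered LEI): derive its check digits,
# then confirm the assembled 20-character code validates.
prefix = "5493001EXAMPLE00AB"          # hypothetical 18-character prefix
lei = prefix + lei_check_digits(prefix)
print(lei, is_valid_lei(lei))          # prints the code and True
```

A check like this will not catch a wrong-but-well-formed identifier, but it cheaply filters out truncated, mistyped or mis-mapped codes before any reconciliation against the global LEI database.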