Countdown to FSCS Compliance

The Financial Services Authority (FSA) has set up a new agency, the Financial Services Compensation Scheme (FSCS), to pay out UK depositors urgently in the event that a deposit-taker becomes insolvent. The scheme pays retail depositors up to a maximum of £50,000 (PS09/11). To implement this policy, all UK deposit-takers must provide information to the FSCS in the form of a single customer view (SCV) file showing the aggregate deposits held by each customer.

From 31 December 2010, deposit-takers subject to the electronic SCV requirements must be able to provide the FSCS with an SCV file containing records of eligible claimants within 72 hours of request. Some types of account should be omitted from the SCV – for example, where the eligible claimant is the beneficiary of an account held on their behalf by another person, or where the account is not active. The SCV will facilitate compensation payments to eligible customers within seven days, and in no more than 20 working days, as required by the European Directive on Deposit Guarantee Schemes.


• 30 June 2010: a pre-implementation report is to be submitted to the FSA providing a status report on progress towards meeting the SCV deadline.
• 31 December 2010: SCV implementation deadline – deposit-takers must comply with the new SCV regulatory requirements.
• 31 January 2011: deadline to deliver an implementation report to the FSA covering topics such as an implementation summary, testing performance and the maintenance plan.
• 31 January 2011: verification phase – an SCV file is to be sent to the FSCS by this date.
• 31 July 2011: the FSCS to complete verification of the submitted SCV files.

In the event of an institutional failure, the following timeline will apply where a deposit-taker is required to provide an SCV to the FSCS from 1 January 2011.

• Day 1: request for SCV file from the FSCS.
• Days 1-3 (calendar days): deposit-taker prepares and submits SCV file to the FSCS.
• Days 3-7 (calendar days): any FSCS payout using SCV file.
• Days 8-20 (working days): FSCS payouts for eligible depositors not in SCV file (eg client accounts).

The impact of producing an SCV should not be underestimated. Banks and other financial institutions have been striving for such a file for the last 25 years or more, and only recent start-ups with few products on offer can claim to have anything approaching this view. A new SCV data regime is at the core of this legislation – 25 new data fields will be used to create an aggregate view of all deposits across businesses, products and complex customer relationships.
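To make the idea of an aggregate per-customer record concrete, the sketch below shows an SCV-style record as a Python data structure. The field names here are a small illustrative subset chosen for this example – the actual 25 fields are defined by the FSA/FSCS rules and are not reproduced here.

```python
from dataclasses import dataclass

# Illustrative subset of the kind of fields an SCV record aggregates.
# The real field list is specified by the FSA/FSCS, not by this sketch.
@dataclass
class SCVRecord:
    customer_id: str              # single identifier spanning brands and products
    title: str
    forename: str
    surname: str
    address: str
    postcode: str
    aggregate_balance_gbp: float  # total eligible deposits across all products

record = SCVRecord(
    customer_id="C-000123",
    title="Ms",
    forename="Jane",
    surname="Doe",
    address="1 High Street, Anytown",
    postcode="AB1 2CD",
    aggregate_balance_gbp=48250.00,
)
print(record)
```

The essential point is that one record carries one customer's identity details alongside a single aggregated balance, however many underlying accounts feed into it.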

The resulting files will not only be used to ensure fast and accurate payouts in the event of an institutional failure; they will also be used in conjunction with other legislation, e.g. PS09/16 and PS09/20, to examine a bank's liquidity and stress-testing capabilities. Ernst & Young1 estimates that the core cost of the SCV could be £431-539m. For the largest banks in the sector this equates to a multi-million pound project that has to be completed by the end of this calendar year.

Business impact

The impact in financial terms, then, is substantial. But where will the money go? What needs to happen in order to produce this SCV? For each deposit-taking product offered by the banking licensee – which could easily number in the hundreds for the larger banks with multiple brands, and in the tens for medium-sized banks – every account has to be evaluated and the required personal details and account balance extracted to an interim SCV file. Every file then has to go through a process of de-duplication and verification against the requirements, so that non-required records are excluded and a single record containing an aggregate balance per customer is produced. There is then a further process of combining all product files, de-duplicating and consolidating the results into a final version for submission to the FSCS – not a simple task.
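The de-duplication and aggregation steps described above can be sketched in a few lines. The per-product extracts, field layout and eligibility flag below are purely illustrative assumptions, not the FSA's specification:

```python
from collections import defaultdict

# Hypothetical per-product extracts: (customer_id, account_id, balance, eligible).
product_extracts = [
    [("C1", "SAV-1", 30000.0, True), ("C2", "SAV-2", 5000.0, True)],
    [("C1", "CUR-1", 25000.0, True),
     ("C3", "TRUST-1", 9000.0, False)],          # held on another's behalf: excluded
    [("C2", "SAV-2", 5000.0, True)],             # duplicate from an acquired brand's database
]

def build_scv(extracts):
    seen = set()
    totals = defaultdict(float)
    for extract in extracts:
        for customer_id, account_id, balance, eligible in extract:
            if not eligible:
                continue  # e.g. beneficiary-held or inactive accounts are omitted
            if (customer_id, account_id) in seen:
                continue  # de-duplicate accounts reported by more than one source
            seen.add((customer_id, account_id))
            totals[customer_id] += balance
    return dict(totals)

scv = build_scv(product_extracts)
print(scv)  # one aggregate balance per eligible customer
```

Running this yields one aggregate balance per customer (C1: 55,000; C2: 5,000), with the ineligible trust account and the duplicate record dropped – the same logic a bank must apply at far greater scale across every product file.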

The volumes of data and the processing involved make this a large-scale project on a very restricted timescale. Add the need to produce an audit trail of actions – both to demonstrate compliance and to feed into the FSA's pre-implementation and implementation reports – and the complexity and urgency increase further.

Heart of the Issue for IT

What is at first sight a fairly simple business requirement becomes a sizeable problem for the IT function, on top of all the other business-led requirements and organisational changes being worked on in 2010. Breaking it down into components, at the heart of the project are two key elements: successful data migration and data manipulation.

Across any financial institution there are multiple sources of data – databases from acquired brands, product databases, name and address databases; in fact, many silos of information. All records containing any kind of customer account balance field will need to be accessed, and the data migrated from the source database into the SCV file required by the FSA and the FSCS. Between initial extraction and final submission there is data migration, processing, cleansing, tagging, linking and storage. These all add cost and complexity to the project, while also potentially requiring access to those 'golden' time-slots when databases can be taken offline without interrupting access for customers and internal systems.
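The linking step – recognising that records in different silos describe the same customer – is often the hardest part. A minimal sketch, assuming a naive match on normalised surname and postcode (real linkage rules would be far richer, and the brand data here is invented for illustration):

```python
# Naive record linkage across two brand databases by a normalised key.
# Surname + postcode alone would never suffice in production; this only
# illustrates why normalisation and linking are needed at all.
def link_key(record):
    return (record["surname"].strip().lower(),
            record["postcode"].replace(" ", "").upper())

brand_a = [{"surname": "Doe ", "postcode": "ab1 2cd", "balance": 20000.0}]
brand_b = [{"surname": "DOE", "postcode": "AB1 2CD", "balance": 15000.0}]

linked = {}
for record in brand_a + brand_b:
    linked.setdefault(link_key(record), []).append(record)

# Both records resolve to the same key, so their deposits can be aggregated.
for key, records in linked.items():
    print(key, sum(r["balance"] for r in records))
```

Without the normalisation inside `link_key`, the trailing space and differing case would have produced two separate "customers" – exactly the kind of silent duplication an SCV project must eliminate.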

Why is data migration such an issue? Historically, the problem of data migration has largely been underestimated in terms of its complexity, and therefore its impact on the success of a project, and is not normally addressed until the project is well underway. And, much like testing, the window of opportunity to get it right shrinks as the early elements of a project over-run and the pressure increases to deliver a solution to the business on time and within budget. Each project tends to view its data migration needs as a one-off, and therefore addresses the requirements in an ad-hoc way, creating duplication and further silos of information. This approach has historically used a one-way 'big bang' process, which increases the risk of failure and over-runs, and leaves the migrated data completely unsynchronised with data held elsewhere in the organisation.

In fact the problems, if not the fears, surrounding data migration are perceived to be so great that many application renewal, migration and transformation project decisions get deferred because of concerns over the data migration aspects of such projects. A 2006 survey of UK-based financial services firms, for example, found that 72% of them considered data migration to be 'too risky'. In other words, and especially for mission-critical and 24/7 applications, the potential risks of time and cost overruns in data migration meant that many companies preferred to delay their application plans, ultimately to the detriment of the business.2

Traditional approaches to data migration – such as extract, transform and load (ETL) and scripts – have their roots in other industries and are not suitable for handling complex data migration scenarios.

They were not designed to manage and migrate data in today’s 24/7 application world, where applications are closely linked to business processes and data is used across multiple environments by many parts of the organisation. This leads to data migration projects that:

  • Disrupt business processes – resulting in poor service, revenue leakage and high costs.
  • Limit business agility and lock in inflexible IT migration strategies.
  • Cause delays and spiralling costs for transformation programmes.
  • Drive up OPEX and leave the CAPEX return on investment (ROI) in applications unrealised.

It is time to improve the way data migration is handled within the organisation and to look at how the processes and tools in use can be evolved and enhanced to increase the success of any data migration project. Naturally, this approach should also take into account how the risk of failure can be reduced to a minimum and how the business requirements can be fully supported.

There are three main risk factors associated with data migration:

  • Inadequate understanding of the existing data.
  • Data relationships breaking during the migration process.
  • A poorly managed handover at the completion of the project.

By taking the appropriate steps at each of these points, you take the risk out of the migration process.

Business can Reap the Benefits Too

One of the key activities for 2010 in the banking sector, as identified by Tower Group3 and other industry analysts, is the need to rebuild trust with customers. Clearly demonstrating the ability to meet the FSCS timescales signals to customers an institution's capability, expertise and commitment to providing a superior customer experience.

Much has been made of the SCV as the nirvana that financial organisations have been striving to produce for some time. Impending legislation will now drive this requirement faster, and data migration is again at the heart of achieving compliance.

One particular advantage of being able to repeat this data migration quickly – perhaps for subsets of customers in particular geographical areas – is that marketing and product development teams can gain a better understanding of customers and therefore develop closer relationships with them. For example, depositors mindful of the £50,000 compensation limit for each licence holder may spread their investments across institutions. If the threshold is raised in the future, a financial organisation could be the first to offer more services and products to those at or close to the limit, demonstrating understanding and delivering superior customer service.


The forthcoming FSCS requirements are another burden on top of an already large number of compliance-driven projects within financial services. Although on the surface the requirement seems fairly simple, the truth is that the complexity of the IT infrastructure, the plethora of information silos that need to be accessed and the lack of suitable data migration environments mean that this is yet another headache in the making.

However, by adopting a different approach to data migration, as described above, the project can be delivered in a way that both minimises the risk of failure and produces a demonstrably compliant, audited solution. A phased approach maximises confidence in the solution and delivers early results and benefits (including ROI) to the business.

Beyond this particular project, there is the opportunity to maximise the reusability of the solution to deliver further business and technical benefits: greater business agility when faced with migration requirements and challenges, closer alignment of IT with the business, and a higher ROI over a shorter timescale than the alternative approaches available in the marketplace.

1 Ernst & Young, Fast Payout Study, November 2008.

2 Bloor Research white paper, 'Business-centric Data Migration', 2009.

3 Tower Group, '2010 Top 10 Business Drivers, Strategic Responses, IT Initiatives in Brokerage and Wealth Management'.
