Securing Financial Data

Bank and treasury IT systems remain among the safest places to manage, store and transfer data. However, a number of data breaches revealed over the past year indicate that even these highly secure systems aren’t completely foolproof. Last year, for example, one bank lost Tesco customer data in the post. In another incident, this time in Spain, 35,000 sets of Santander customer details were mailed incorrectly, with the wrong data finding its way into subsequent pages of billing information.

Experience of events like these suggests that billing operations and ‘one-off’ customer communications present the highest risk of security issues. In the former case, the risk stems from the sheer volume of communications and the glitches that inevitably occur when operations and processes meet the realities of envelope stuffing and print shop applications. In Santander’s case, the “technical error at our printers” looks like the sort of one-off intervention made to ‘buck’ routine processes and jobs, using data from otherwise well automated and secure financial systems. But it is not a problem confined to one organisation or to this specific process.

Big Backup

The financial sector in the UK also hit the headlines in 2010 with record fines following the loss of back-up tapes by an outsourced provider to one of the UK’s biggest banks, HSBC. In smaller treasury operations, backup is less likely to be an outsourced process. Further down the food chain, many finance professionals still run daily backups to tape, and that technology is unlikely to lose its role in business continuity any time soon. Tape is, of course, a removable medium and, by its very nature, a serious risk that needs to be managed accordingly. The only way around the demand for removable media is to replace them with a more convenient but governance-amenable alternative. This needs to be flexible enough to eliminate ad hoc workarounds for outdated systems, as well as cater to routine workflows such as backup. The lesson is that IT personnel might not see where people are getting around the processes or having to intervene in fiddly one-off jobs; in the real, operational world, such workarounds are an irritating fact of life.
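One of the simplest automated controls a governance-amenable backup alternative can offer, in place of hand-carried tapes, is checksum verification that a backup copy is byte-identical to its source. The sketch below is purely illustrative; the function names are not from any particular product.

```python
# Illustrative sketch: a backup only counts as complete once its
# checksum matches the source. Names are hypothetical.
import hashlib


def sha256_of(data: bytes) -> str:
    """Content fingerprint used to compare source and backup copy."""
    return hashlib.sha256(data).hexdigest()


def backup_verified(source: bytes, backup_copy: bytes) -> bool:
    """True only if the backup is byte-identical to the source."""
    return sha256_of(source) == sha256_of(backup_copy)
```

A managed transfer or backup product would typically run this kind of check automatically after every copy, rather than leaving verification to an operator.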

Criminal Intent

Simple theft remains a major issue, too. While not a treasury or financial episode, perhaps the highest-profile event of the past year was the Wikileaks breach of US cables. This episode has certain similarities to the theft of 127,000 sets of customer account data by an IT specialist at a Swiss bank, which were then supplied to the French tax authorities.

The issue is not so much one of security as of – yet again – governance and business process. The design of these – the decision as to who needs to know what – has to involve financial decision makers as well as IT administrators. In light of the high profile failures we’ve seen, no one in their right mind would entrust live billing data to a tape, email attachment, USB or serial connection – encrypted or otherwise. However, in a large organisation, it’s likely only to be senior financial staff who have the necessary overview to appreciate – in simple terms – just who needs to see what.

The trouble is that humans are still the unpredictable factor in security. Financial operations professionals, therefore, need to have a clear role in deciding and setting visibility policies. Transparency is key and a sensible, compliant policy needs to be established.

The Cloud and Financial Data

Using the cloud for financial processes and data has raised serious issues and concerns. For many, the gains in cost and agility don’t seem to balance the risks in terms of uptime and security. The tide is, however, turning. In the context of government, for example, the process is already under way, with a cloud roadmap promised that will embrace even high ‘impact level’ (IL) data. In telecoms, vast chunks of billing data processes – and revenue – have already been committed to cloud providers and external data centres.

Yet the challenge will remain the essentially familiar one of ‘joining up’ domains and processes within and without a business. Whether we’re ‘joining up’ areas of different security levels in ‘mixed’ or virtualised data centres, backing up to a storage network or connecting a billing data mainframe, the problem is essentially one of secure, large scale application integration – what analysts term ‘enterprise application integration’.

The basic challenge of the cloud is taking data from the edge of a firewall – the fundamental piece of any security jigsaw – and guarding and governing it through the key processes, including data validation and transformation, so that it reaches its destination application safely and in the form intended – and vice versa. Integration and data transfer face the same challenge in the cloud, and service level agreements (SLAs) with third parties that guarantee timely delivery of content are a proven way to shift burdens outside an organisation so that it can better deploy its resources elsewhere.
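The “validate, then transform” gate described above can be sketched in a few lines. All names here (BillingRecord, validate_record and so on) are illustrative, not drawn from any particular product.

```python
# Sketch of a validate-then-transform gate: only records that pass
# validation are transformed and released for transfer.
from dataclasses import dataclass


@dataclass
class BillingRecord:
    account_id: str
    amount_pence: int
    currency: str


def validate_record(rec: BillingRecord) -> bool:
    """Reject malformed data before it leaves the secure perimeter."""
    return (
        rec.account_id.isalnum()
        and rec.amount_pence >= 0
        and rec.currency in {"GBP", "EUR", "USD"}
    )


def to_destination_format(rec: BillingRecord) -> dict:
    """Transform into the shape the destination application expects."""
    return {
        "account": rec.account_id,
        "amount": f"{rec.amount_pence / 100:.2f}",
        "currency": rec.currency,
    }


def prepare_for_transfer(records):
    """Only validated, transformed records are released for transfer."""
    return [to_destination_format(r) for r in records if validate_record(r)]
```

The point of the gate is that rejection happens before transfer, not after a mis-mailed billing run has already gone out.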

In theory, cloud providers should treat data with the same level of security as any financial organisation, and they need to inform the relevant person immediately of any outages. In practice, because of the regulations and obligations placed upon them, many cloud providers’ systems are usually more secure than those of the organisations they deal with.

Having identified the weaknesses that have come to light over the past year, some conclusions can be drawn. The incumbent – and still appropriate – technology sticking together the various parts of the corporate financial jigsaw is based on secured variants of the file transfer protocol (FTP). Indeed, thousands of legacy scripts run across financial institutions and businesses, often generated in-house and as much as 25 years old.

Is There a Cure?

Instead of relying on these legacy applications and the host of ad hoc workarounds based on mail and/or an unrelated collection of alternative transfer methods, the requirements we have repeatedly seen demand a single, flexible system for financial data workflows – one that can eliminate systemic complexity, the need for risky ad hoc jobs, and the temptation to devise a workaround.

Ad hoc processes and ‘switchover’ events are a threat. The ability to integrate file transfer with existing or potential new business processes depends on a business case being conveyed to technology people by finance people. That’s not to say specifics aren’t important. For example, the ability of a data transfer system to convert any file type to any other automatically is an important security feature. Again, it eliminates the ad hoc jobs that tempt users to take data outside automated processes, whether for backup, billing, printing or CRM.
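Automatic conversion of this kind amounts to a dispatch table keyed on source and target file types, so the transfer system picks the converter and the user never needs a manual job. The registry and function names below are hypothetical.

```python
# Sketch of automatic format conversion inside a managed transfer
# system. Only a CSV-to-JSON converter is registered here; a real
# system would hold many more pairings.
import csv
import io
import json


def csv_to_json(text: str) -> str:
    """Convert CSV content to JSON for a downstream application."""
    rows = list(csv.DictReader(io.StringIO(text)))
    return json.dumps(rows)


CONVERTERS = {("csv", "json"): csv_to_json}


def convert(content: str, src: str, dst: str) -> str:
    """Pick the right converter from the source/target file types."""
    if src == dst:
        return content
    try:
        return CONVERTERS[(src, dst)](content)
    except KeyError:
        raise ValueError(f"no converter registered for {src} -> {dst}")
```

Because conversion happens inside the governed workflow, the data never has to leave the automated process to be reshaped by hand.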

A single source of controls, and of the metadata relating to data governance, is also required to ease compliance with Payment Card Industry (PCI) requirements and, increasingly, professional security and confidentiality requirements. The capability to build in governance policies – authentication rules, user permissions, security policies and other requirements – at a single point removes complexity from financial data flows and allows the blanket application of those policies. It should also, however, enable operational business needs to be incorporated when setting and enforcing sensible policies and controlled processes. In real life, that means setting up the lines of communication to be flexible, within sensible limits.
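A single policy point can be pictured as one object holding the rules, consulted by every data flow. The roles, limits and field names below are invented for illustration only.

```python
# Illustrative sketch of a single policy point: governance rules are
# defined once and enforced uniformly on every transfer request.
from dataclasses import dataclass, field


@dataclass
class GovernancePolicy:
    allowed_roles: set = field(default_factory=lambda: {"treasury", "billing"})
    require_encryption: bool = True
    max_file_mb: int = 500

    def authorise(self, user_role: str, encrypted: bool, size_mb: int) -> bool:
        """One check, applied the same way to every data flow."""
        return (
            user_role in self.allowed_roles
            and (encrypted or not self.require_encryption)
            and size_mb <= self.max_file_mb
        )


policy = GovernancePolicy()
```

Changing a rule in one place – say, adding a role to `allowed_roles` – then applies across every workflow, which is the “blanket application” the text describes.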

There is, of course, no such thing as total security: we can look only for decent mitigation rather than a cure, which is why standards place such a burden on businesses in terms of auditability and post-breach procedures. The problem is not going away, and it will continue to hit finance and treasury professionals harder than other victims of data breaches. At a higher level, visible, end-to-end management of content over its entire lifecycle also means generating enough metadata to support auditability and security information and event management (SIEM) – and the recovery and redevelopment of processes and technology across an entire business.
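Generating that audit metadata can be as simple as emitting one structured record per transfer event, carrying who did what, to which file, when, and a content hash so later tampering can be detected. The function and field names below are assumptions for the sketch, not any specific SIEM product's schema.

```python
# Sketch of audit metadata generation for each transfer event,
# producing a JSON record suitable for feeding a SIEM tool.
import hashlib
import json
from datetime import datetime, timezone


def audit_event(action: str, path: str, user: str, content: bytes) -> str:
    """Build one auditable record: actor, action, file and content hash."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,
        "path": path,
        "user": user,
        "sha256": hashlib.sha256(content).hexdigest(),
    }
    return json.dumps(record)
```

Emitted consistently across every flow, records like this are what make post-breach reconstruction of events possible at all.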

