Financial services companies need to get data management right or face the consequences of a data breach, writes Andrew Fitzgerald.
Financial institutions manage a large volume of sensitive information about their customers. Yet protecting that data in line with regulations remains a significant challenge, for banks and other financial services organisations alike.
For these organisations, data backups and the ability to recover from them aren’t just about getting the business back up and running after a hardware failure, important as that is. They serve a much wider purpose.
Financial institutions are, quite rightly, subject to a huge array of regulations, from general measures such as the General Data Protection Regulation (GDPR) to a myriad of sector-specific rules such as MiFID II.
By their very nature, financial services companies need to be up and running continuously. Any unplanned break in service, whether caused by a ransomware attack, an accident, a systems failure, or even, potentially, a state-sponsored attack, simply must be avoided.
They just can’t afford the monetary losses or the reputational damage that would result from downtime of services that allow customers to access their money.
Compliance matters for backups too
When it comes to compliance, there are requirements for backups as well as for live production systems. Consider the GDPR, for example. It requires that organisations not keep personal data for longer than it is needed, and that data be regularly reviewed to confirm it is still needed.
Individuals also have a right to ask for their personal data to be removed. How this is done varies from application to application, but it is essential to ensure that a restore from backup does not re-populate an application with data that should no longer be held.
There is also a requirement under GDPR to respond to individuals’ requests within a month of them being made. That is a fair period of time, but issues such as ransomware attacks can leave an organisation without access to its complete data for considerable periods, and as we have seen recently, backups are not immune from attack. In fact, they are now a focus for certain attack types, especially backups stored on network-attached storage devices.
The basics of backup and restore
In this context, the National Cyber Security Centre advises organisations to maintain recent offline backups of all their most important files and data. Still, the evidence suggests that not all organisations have the kind of backup systems in place that will allow data recovery.
Sophos surveyed 5,000 IT managers in 26 countries for its State of Ransomware 2020 report. It found that just 56 percent of organisations hit by a ransomware attack got their data back via backups (26 percent paid the ransom, 12 percent used ‘other means’, and 6 percent didn’t get their data back at all).
The implication is that the backup is the tool of last resort. But even in that role, it isn’t necessarily fulfilling its purpose. You could infer from this research that enterprise backups only do their job just over half of the time.
But it doesn’t have to be like this, and for financial services companies that really can’t afford downtime whatever its cause, there is a strong argument that backups need to assume a much wider role.
It is perfectly possible for a backup system to compare the production environment against the data it holds, in order to detect major changes that could signify an attack in progress. A modern system can also scan VMs for open vulnerabilities even when there is no attack, so that threats can be prevented before they strike.
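As a rough illustration only (not any vendor’s actual implementation), the change-detection idea can be sketched as comparing successive backup snapshots and flagging an abnormally high change rate, which is a common signature of mass encryption by ransomware. Here snapshots are simplified to path-to-hash mappings; the names and threshold are illustrative assumptions:

```python
# Hypothetical sketch: flag a backup snapshot whose change rate versus the
# previous snapshot is abnormally high - a possible sign of mass encryption.
# Snapshots are modelled as {path: content_hash} mappings.

def change_ratio(prev: dict, curr: dict) -> float:
    """Fraction of files that were modified, added, or deleted."""
    all_paths = set(prev) | set(curr)
    if not all_paths:
        return 0.0
    changed = sum(1 for p in all_paths if prev.get(p) != curr.get(p))
    return changed / len(all_paths)

def looks_suspicious(prev: dict, curr: dict, threshold: float = 0.5) -> bool:
    """True when more than `threshold` of the estate changed in one backup cycle."""
    return change_ratio(prev, curr) > threshold

# Example: a normal incremental change vs. a sudden mass rewrite.
yesterday = {"a.doc": "h1", "b.xls": "h2", "c.pdf": "h3", "d.txt": "h4"}
normal_today = {**yesterday, "a.doc": "h1-v2"}       # 1 of 4 files changed
encrypted_today = {p: "ransom" for p in yesterday}   # every file changed

print(looks_suspicious(yesterday, normal_today))     # False
print(looks_suspicious(yesterday, encrypted_today))  # True
```

A real product would of course use richer signals (entropy of file contents, historical change-rate baselines, per-workload thresholds), but the principle of treating the backup catalogue as a vantage point for anomaly detection is the same.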
As mentioned, to ensure a payout, cyber criminals are no longer attacking only the production environment; they are increasingly targeting backup data and infrastructure as well. This effectively hobbles the “insurance policy” organisations depend upon when disaster strikes.
The attackers often exploit weaknesses in legacy backup solutions architected before the advent of the ransomware industry. Before encrypting the production environment, sophisticated malware is known to destroy shadow copies and restore-point data. Because of its underlying architecture, legacy backup infrastructure becomes easy prey for this malware rather than a solid defence against ransomware attacks.
It might seem a little strange to suggest that financial services companies reinvent their approach to data management by paying closer attention to their backups. But it is time to realise that data backups are much more than the ‘necessary evil’ you create as an insurance policy and file away, never to revisit, especially if those backups sit on legacy infrastructure architected many years earlier.
Since the financial crisis, there has been a wave of regulation, much of it aimed at ensuring banks have sufficient capital and liquidity. Now, in 2020, the focus extends to data: backups are both a living insurance policy against the times when the worst happens (and in some shape or form it inevitably will), and a part of your data management system that is as relevant to regulatory compliance requirements as your live systems are.
These improvements to modern data management will help financial services companies and banking systems come through the COVID-19-related economic crisis in reasonable shape, and give them a head start on future data-driven innovation. Let’s hope it doesn’t take a specific problem before the community realises this and gets its act together.
Andrew Fitzgerald is sales director for Western Europe and Sub-Saharan Africa at Cohesity.
The views and opinions expressed in this Viewpoint article are solely those of the author(s) and do not reflect the views and opinions of Fintech Bulletin.