The next decade in enterprise backup
Backup set to adapt to deliver more security, ransomware protection and flash-speed restores, and to deal with the complexity of multi-cloud, hybrid cloud and container-based operations
Backup is as old as IT itself, and it’s had to adapt as the IT landscape around it has evolved.
Now we find backups at the centre of ransomware recovery strategy, as a source of business data to be analysed, or holding datasets that need restores at the speed of flash.
Meanwhile, multi-cloud and hybrid cloud operations are a daily reality, as are containers and microservices, and backup has to adapt to protect the data generated and stored amid the complexity of such operations.
In this article, we look at how enterprise backup is set to change to meet the realities of the coming decade, including via autonomous discovery of data and assignment to the most appropriate media.
Computer Weekly surveyed some of the leading suppliers in backup and data protection to see how they think backup and data protection will develop.
The rise of ransomware over the past few years has forced firms to look again at off-site backups, including to less fashionable media such as (optical) WORM drives and tape. Storing data off-site provides an air gap between the backup and the production system, and is one of the few ways to protect data against malware.
But backing up data in this way takes considerable effort and expertise. Air gaps need physical transport and protection for the backup media, and recovering data from an off-site archive is slow. And there is always the risk that malware has somehow contaminated backup tapes. Suppliers have responded by introducing “immutable” backups: copies of data and snapshots that cannot be altered or deleted once written, and that can be held locally on the same type of media as the main system. This makes recovery much faster.
Immutability is set to become a standard feature of backups, argues Paul Speciale, chief marketing officer at supplier Scality.
“Immutability will be an indispensable functionality of enterprise backup in the coming decade,” he says. “Backups are the ultimate insurance policy against the ever-growing number of ransomware attacks.” Systems such as Amazon’s S3 object lock – which keeps data behind a virtual air gap – point to future trends.
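By way of illustration, the following is a minimal sketch of how a backup can be written under S3 object lock today, using the boto3 AWS SDK for Python. The bucket name, object key and 90-day retention period are hypothetical; in compliance mode, not even an administrator can delete or shorten the retention of the object before the date passes.

```python
from datetime import datetime, timedelta, timezone

import boto3  # AWS SDK for Python

s3 = boto3.client("s3")

# The bucket must have been created with Object Lock enabled,
# e.g. create_bucket(..., ObjectLockEnabledForBucket=True).
retain_until = datetime.now(timezone.utc) + timedelta(days=90)

with open("backup-2023-01-01.tar.gz", "rb") as backup:
    s3.put_object(
        Bucket="example-backup-vault",          # hypothetical bucket name
        Key="daily/backup-2023-01-01.tar.gz",   # hypothetical object key
        Body=backup,
        ObjectLockMode="COMPLIANCE",            # retention cannot be shortened
        ObjectLockRetainUntilDate=retain_until,
    )
```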
Suppliers also expect artificial intelligence (AI) technology to improve defences by scanning applications and user activity for anomalies. These models will continuously update, according to W Curtis Preston, chief technical evangelist at Druva, so they can keep up with emerging threats.
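The details of such models are supplier-specific, but the underlying idea can be sketched simply. The example below is a hypothetical, deliberately simplified illustration – not any vendor’s actual model – flagging a backup run whose data-change rate is a statistical outlier against recent history, a common signature of mass encryption by ransomware.

```python
from statistics import mean, stdev

def is_anomalous(change_rates, latest, z_threshold=3.0):
    """Flag a backup run whose changed-data rate is a statistical outlier.

    change_rates: fraction of data changed in each recent backup run.
    latest: fraction changed in the current run.
    A mass-encryption event typically touches far more data than usual.
    """
    if len(change_rates) < 5:
        return False  # not enough history to judge
    mu, sigma = mean(change_rates), stdev(change_rates)
    if sigma == 0:
        return latest != mu
    return (latest - mu) / sigma > z_threshold

# Example: nightly runs usually change ~2% of data; tonight changed 60%.
history = [0.02, 0.03, 0.02, 0.01, 0.02, 0.03]
print(is_anomalous(history, 0.60))  # True -> quarantine the copy and alert
```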
At Acronis, Alexander Ivanyuk, senior director of product and technology positioning, talks about backup and storage “self defence”, with real-time ransomware detection and automatic recovery.
In essence, chief information officers should be able to assume their backups are safe without taking extra steps to protect them.
Hardware consolidation, tiering and backups to flash
Currently, most backups are written to spinning disk, whether in-house or through a cloud or backup-as-a-service provider. Longer-term archiving still goes to optical media or tape.
But these media are slower and less reliable than solid-state storage.
Patrick Smith, field chief technology officer at Pure, predicts this will change over the next decade, as larger storage capacities on flash media bring it close to price parity with disk. “Performance, simplicity, and space, power and cooling benefits will drive further adoption and tip the balance towards flash,” he says.
Suppliers also expect to see hardware consolidation. In turn, this should make it easier to use tiers of capacity to provide the most effective and efficient backup storage.
“One way is to consolidate the advantages of the latest hardware and software in purpose-built hyper-converged appliances or hybrid cloud architecture,” says Sergei Serdyuk, vice-president of product management at Nakivo. “This means tighter integration with leading platforms and hardware products for enterprise backup.”
Scality’s Speciale, meanwhile, predicts that a single tool will be able to handle short- and long-term storage, so users will no longer have to think about storage tiers.
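Today, that tiering is still configured explicitly. As a sketch of the current state of affairs, the boto3 snippet below (bucket name, prefix and timings are hypothetical) sets a lifecycle rule that migrates ageing backups to progressively colder, cheaper storage classes.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical policy: backups start on the fast tier, move to
# infrequent-access storage after 30 days and to deep archive after 180.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-vault",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "age-out-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": "daily/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)
```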
Hybrid and multi-cloud support, and containers
Almost all suppliers expect backup tools to support multi-cloud and hybrid cloud over the next few years, if they don’t already.
Partly this is driven by compliance, and partly by practicality. Although suppliers expect the market for storage- and backup-as-a-service to grow, they predict firms will still want localised backup services, or ones that guarantee data stays in a certain geography. But other data will be generated by cloud-native applications and will need backing up close to the source. Suppliers will need to support both.
“IT teams don’t want multiple backup solutions for different pieces of software,” says Dan Middleton, vice-president for UK and Ireland at Veeam. “That’s messy, time-consuming and inefficient. Enterprises are seeking as few vendors for data backup and protection as possible, and this will be a key characteristic over the coming decade.”
At Pure, Patrick Smith expects use of containers to continue to grow, so container support will become a standard feature of enterprise backup tools. This point is echoed by Veeam’s Dan Middleton. He expects data to become more mobile – moving between applications or containers – and backup needs to support this.
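What container support looks like in practice varies by product, but the first step is usually discovering what persistent state exists in a cluster. A minimal sketch using the official Kubernetes Python client – where the “backup=true” label is a hypothetical convention, not a standard – might look like this:

```python
from kubernetes import client, config

# Authenticate using the local kubeconfig (in-cluster config also works).
config.load_kube_config()
core = client.CoreV1Api()

# Discover persistent volume claims to include in the backup set.
# "backup=true" is a hypothetical labelling convention for this sketch.
pvcs = core.list_persistent_volume_claim_for_all_namespaces(
    label_selector="backup=true"
)
for pvc in pvcs.items:
    size = (pvc.spec.resources.requests or {}).get("storage", "unknown")
    print(f"{pvc.metadata.namespace}/{pvc.metadata.name}: {size}")
```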
Autonomic, automatic or continuous backups
Perhaps the most important trend, however, is for backups to work without any user intervention, or even knowledge.
In part, this is to reduce workloads on IT teams and reduce human error. It should speed up backups and restores, and make systems more resilient against accidents and deliberate attacks on data. It has the further advantage of being transparent to the end user and less disruptive to day-to-day work.
“Over the next decade, enterprise data protection solutions might shift away from the traditional periodic backup pattern and towards a continuous approach, as it seems to be the most promising direction,” says Nakivo’s Serdyuk.
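At its simplest, a continuous approach replaces the nightly job with an event-driven copy of every change as it happens. The sketch below illustrates the pattern – not any supplier’s product – using the open source watchdog library; the source and target paths are hypothetical.

```python
import shutil
import time
from pathlib import Path

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

SOURCE = Path("/data/app")   # hypothetical production directory
DEST = Path("/backup/app")   # hypothetical backup target

class ContinuousBackup(FileSystemEventHandler):
    """Copy each changed file to the backup target as soon as it changes."""

    def on_modified(self, event):
        if event.is_directory:
            return
        src = Path(event.src_path)
        dst = DEST / src.relative_to(SOURCE)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # copy2 preserves timestamps

observer = Observer()
observer.schedule(ContinuousBackup(), str(SOURCE), recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)  # the observer thread does the work
except KeyboardInterrupt:
    observer.stop()
observer.join()
```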
Other suppliers see the same direction of travel, although, as Druva’s Preston warns, most backup systems are not yet fully autonomous; that functionality is in demand, and suppliers will need to modernise. “Autonomous solutions will be equipped to handle the complexity of modern data environments, set backup schedules, apply patches and upgrade software without depending on any people,” he says.
At Veritas Technologies, Barry Cashman, regional vice-president for the UK and Ireland, is another advocate of autonomous or autonomic backup. “Data protection products will increasingly take on the day-to-day tasks of identifying data sets to protect, monitoring protection data for issues and autonomously dealing with those issues: self-provisioning, self-monitoring and self-healing,” he says.
And backup tools will become more context- and application-aware, to prioritise where data is stored according to criticality, its usage profile and how quickly the business needs to recover. In effect, they will set their own service-level agreements, recovery point objectives and recovery time objectives.
This is all the more important for microservices and environments such as Kubernetes, which will create IT infrastructure that is too complex for humans to manage manually.
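In code, that context-awareness amounts to deriving protection parameters from a dataset’s profile rather than from fixed schedules. The following is a deliberately simple, hypothetical sketch – the tiers, thresholds and timings are illustrative, not drawn from any product.

```python
from dataclasses import dataclass

@dataclass
class ProtectionPolicy:
    rpo_minutes: int   # maximum acceptable data loss
    rto_minutes: int   # maximum acceptable recovery time
    tier: str          # where the backup copies live

def derive_policy(criticality, change_rate):
    """Map a dataset's profile to a policy instead of a fixed schedule."""
    if criticality == "mission-critical" or change_rate > 0.10:
        return ProtectionPolicy(rpo_minutes=5, rto_minutes=15, tier="flash")
    if criticality == "business-important":
        return ProtectionPolicy(rpo_minutes=60, rto_minutes=240, tier="disk")
    return ProtectionPolicy(rpo_minutes=1440, rto_minutes=2880, tier="archive")

print(derive_policy("mission-critical", 0.02))
# ProtectionPolicy(rpo_minutes=5, rto_minutes=15, tier='flash')
```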
More discipline about what to store – and for how long
Not all changes to backup and recovery, however, will be technical. Suppliers expect regulation to increase, and that firms will need to do more to ensure they retain appropriate data for appropriate timescales.
This requires a granular understanding of the business’s data assets.
In future, backup and resilience will be more integrated into applications, making them more robust. But firms will see the biggest benefits if they are disciplined about the data they hold: that reduces the burden on backup and recovery systems and improves compliance. And as firms gather ever more data, that discipline becomes critical.
“Organisations need to avoid ‘storing everything forever’,” says Pure’s Smith. “Only by doing this can organisations sustain backups for recovery, regulatory and compliance, and ransomware mitigation purposes, with the growing dataset sizes we’ll see in the next five to 10 years.”
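Mechanically, enforcing that discipline is a retention policy applied to the backup estate. A final hypothetical sketch: expiring local backup files once they pass their retention period, where the directory layout and the 90-day period are assumptions for illustration.

```python
import time
from pathlib import Path

RETENTION_DAYS = 90               # assumed business/regulatory period
BACKUP_DIR = Path("/backup/app")  # hypothetical backup target
cutoff = time.time() - RETENTION_DAYS * 86400

for backup_file in BACKUP_DIR.rglob("*.tar.gz"):
    if backup_file.stat().st_mtime < cutoff:
        backup_file.unlink()      # expired: delete rather than keep forever
        print(f"expired {backup_file}")
```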