Datacentres in 2030: What the future holds

In this guest post, Tony Jacob, vice president of design and construction for Europe, Middle East and Africa at colocation giant Digital Realty, predicts what the next decade holds for the datacentre community.

It’s predicted that cutting-edge technologies – such as artificial intelligence (AI), the Internet of Things (IoT), 5G and blockchain – could add $721 billion to the global economy over the next decade through the creation of new jobs, businesses, and even entire industries.

Notably, one thing these technologies all have in common is that data sits at their core. It comes as no surprise when you consider that there are now 294 billion emails sent and five billion internet searches made every day, and around 482.6 billion electronic financial transactions a year. And that’s without touching on the data use we don’t see – from shipment tracking to factory automation.

And the data explosion is only going to get bigger. Experts project that the number of connected devices will exceed 75 billion worldwide by 2025 – a fivefold increase in ten years. That’s a lot of data being produced by a huge range of technologies: digital billboards, smartphones, connected fridges and the rest.

And, as data continues to grow in complexity and volume, so will the demand for the apps, services and technologies that help make sense of it.

These end-user trends are driving businesses to transform their data infrastructure as they adapt to new technologies and push the boundaries in search of new opportunities.

Datacentres at the heart of the digital economy

Much of that transformation will start in the datacentre. Like a human heart, datacentres keep everything connected and ticking along nicely. And the more the data being produced grows in complexity and size, the more vital the datacentre becomes.

However, technologies like AI and IoT are moving data processing out of the datacentre and fundamentally re-architecting the network. Data is no longer solely being produced in concentrated areas; it’s being produced everywhere and anywhere.

This means the traditional datacentres you find in enterprise parks dotted around the globe aren’t going to be fit for purpose for every scenario. We will need a data infrastructure that processes data everywhere – from small, real-time loads handled as close to the end user as possible, through to heavy, mission-critical lifting at the core.

That’s what it will take if the technology that promises to revolutionise our lives – from new drugs to fight pandemics, to industrial processes that use renewable energy to help crops grow in extremely harsh conditions – is to become a reality. Some of the technologies being worked on right now wouldn’t have been conceivable a few years ago, but data, combined with a fit-for-purpose digital infrastructure, is slowly making them possible.

Take autonomous cars, for example. These vehicles must make intelligent driving decisions almost instantly in order to respond to hazards that arise unexpectedly, which is only possible if they can process data in near real-time. But we are making real progress on the technology front, from both a datacentre and an autonomous vehicle perspective, to facilitate this.

By 2030, it is predicted that one in ten cars will be fully automated. With well over a billion cars on the world’s roads today, and assuming demand for autonomous vehicles only increases over the next ten years, we could have to power more than 100 million autonomous vehicles.

Each vehicle will have dozens of peripheral sensors, at the very least, all capturing data. Some of that data will be processed in the car, but vehicles won’t have the on-board compute power to make every decision themselves.

To coordinate millions of autonomous vehicles on our roads, all making decisions simultaneously, we need that on-board compute to be complemented by instant decision-making at an edge that’s never more than a few hundred metres from the car, and by complex decision-making at the core.

It all starts with 5G. Cars, smartphones, fridges, streetlights, bins and millions of other devices will relay their data across 5G, faster and at greater volumes than our current networks were ever designed for. Small cells are already starting to deliver 5G, and over the next decade they’ll proliferate until they blanket our cities like fog.

Behind them, we’ll see micro-datacentres emerge – small, nimble compute hubs on the sides of busy high streets. Organisations will be able to deploy these pods quickly in areas of demand, and to you or me they’ll look like little more than a telecoms cabinet or a waist-high enclosure.

They will collect data from the small cells and process light data loads in near real-time, with minimal latency. These micro-datacentres will let us make time-critical autonomous decisions quickly and efficiently, while complex compute and analysis – such as coordinating every autonomous vehicle on a city’s roads – happens at the core.
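To make that tiering a little more concrete, here is a minimal, hypothetical sketch in Python of how a workload might be placed across the three tiers described above – on-board, edge micro-datacentre and core – based purely on how quickly a decision is needed and how heavy the processing is. The tier names, latency thresholds and example workloads are illustrative assumptions, not a description of any real deployment.

# Hypothetical sketch: placing workloads across on-board, edge and core tiers.
# Thresholds and example workloads are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    latency_budget_ms: float  # how quickly a decision is needed
    compute_demand: float     # rough relative processing cost

def place(workload: Workload) -> str:
    """Pick the lowest tier that can satisfy the latency budget and demand."""
    if workload.latency_budget_ms < 10 and workload.compute_demand < 1:
        return "on-board"                # e.g. emergency braking
    if workload.latency_budget_ms < 50 and workload.compute_demand < 10:
        return "edge micro-datacentre"   # e.g. junction-level coordination
    return "core datacentre"             # e.g. city-wide fleet analysis

if __name__ == "__main__":
    examples = [
        Workload("obstacle avoidance", latency_budget_ms=5, compute_demand=0.5),
        Workload("local route coordination", latency_budget_ms=30, compute_demand=5),
        Workload("city-wide fleet optimisation", latency_budget_ms=5000, compute_demand=100),
    ]
    for w in examples:
        print(f"{w.name}: {place(w)}")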

Getting to the core of the datacentre

Finally, in this transformed data infrastructure, there is the core. While the emergence of new datacentre architectures will move some compute away from traditional datacentres, their value won’t be diminished. Don’t forget, we’re living in the era of big tech – and as technology continues to grow in complexity, so does the data it produces.

While micro-datacentres will have their own uses, they simply won’t have the capacity or raw power to process the almost unquantifiable amounts of data being produced by technologies such as advanced AI or millions of IoT devices across a city.

That capacity will come from the traditional datacentre. Some of these facilities will be larger, and all will be more modular, allowing them to switch purpose rapidly according to need – but ultimately, they won’t look vastly different to the datacentres you see today.

The datacentre infrastructure of the future is an interconnected, flexible platform, not a one-size-fits-all model. In a world where data has never been so vital, organisations will have access to nimble, rapid-processing micro-datacentres on the one hand, and larger, traditional datacentres capable of complex, heavy-lift processing on the other.

Ultimately, that platform will be able to deliver the compute power and low latency organisations need, wherever and whenever they need it.

The next decade will be astonishing, both in terms of opportunities and challenges. Today’s coronavirus pandemic looks likely to set back economic growth, while climate change means we need to find ways to decouple growth from environmental impact.

No doubt these challenges could slow the progress of the technologies driving change in data infrastructure, but they won’t stop it. The economic and human opportunities presented by technologies like AI, 5G and IoT mean we’ll continue to adopt and innovate.

And 2030 is still a little while away, giving us plenty of time to absorb such ‘blips’. We’ll continue to see datacentre architecture adapt and evolve to support cutting-edge technology and mirror the huge transformation happening around the world.