Edge computing environments: what you need to know
An edge case in business sits at the limits of what's viable, and edge computing requirements suggest a similar struggle.
The saying goes: “If you’re not on the edge, you’re taking up too much space.” Compute itself is now moving to the edge, forcing datacentre operators to wring the last drops of productivity from their infrastructure, ahead of a future supporting multi-sensor internet of things (IoT) devices over 5G for machine learning and even artificial intelligence (AI).
Jennifer Cooke, research director of cloud-to-edge datacentre trends at IDC, says datacentre operators need to start thinking about how many systems they will need to roll out, and the people they will need to support them. “Cost becomes the prohibitive factor,” she says.
Edge will take different forms. A system to support the operation of autonomous vehicles needs a lot of infrastructure to deliver low-latency responsiveness. Such business cases remain “a little bit” immature – and a long way from current announcements around cloud-in-a-box offerings.
How does Cooke define edge? “A cooling supplier gave this analogy: Lamp posts are not going to be where your dog lifts his leg any more; it’s where all the tech will be,” she says.
Forging ahead
Yet datacentres cannot hang back on these investments.
“Many organisations will need partners to help them. Organisations have a lot of trouble keeping track of stuff on site already – who has touched it, where the data actually is, who will protect that data, and where the infrastructure is. Even more importantly: who will pay for it?” Cooke says.
Russell Poole, UK managing director at Equinix, notes its own survey shows that businesses are forging ahead with investments that support AI, IoT and 5G across the UK, with countering distance-related latency across distributed networks seen as a competitive differentiator.
“Almost half (45%) of businesses in the UK are prioritising moving their infrastructure to the digital edge as part of their organisation’s technology strategy, in turn facilitating interconnection between new and existing partners, and customers around the world,” he says.
Rhonda Ascierto, vice-president of research at the Uptime Institute, says edge may pay off anywhere agility is needed across many distributed machines – from cloud gaming (Jitter, Wag) and augmented reality (AR) to Industry 4.0 and the industrial internet of things (IIoT).
“We are seeing a lot of true local edge capacity in factories of all different types, smart factories, and it’s not that they are retooling all their factories completely at once – it’s generally an iterative process,” she says.
How many times does the data have to hop before it is acted on? If the answer needs to be “very few”, that compute belongs on-premise. The financial justification for a highly instrumented, multi-sensor smart factory might be speed and accuracy. “You really need to have very low latency with that stuff,” says Ascierto.
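As a rough illustration of that hop-count arithmetic – the figures below are hypothetical estimates, not measurements – a simple latency budget shows why a local control loop and a cloud round trip live in different worlds:

```python
# Illustrative latency budget for a sensor reading acted on locally versus
# in a distant cloud region. All figures are hypothetical one-way estimates
# in milliseconds, not measurements.

HOPS_MS = {
    "sensor -> factory gateway": 1,
    "gateway -> on-prem edge node": 1,
    "edge node -> regional datacentre": 15,
    "regional datacentre -> cloud region": 25,
}

def budget(path):
    """Sum the latency over each hop on the path, there and back."""
    return 2 * sum(HOPS_MS[hop] for hop in path)  # round trip

on_prem = ["sensor -> factory gateway", "gateway -> on-prem edge node"]
cloud = on_prem + ["edge node -> regional datacentre",
                   "regional datacentre -> cloud region"]

print(f"on-prem control loop: ~{budget(on_prem)} ms round trip")  # ~4 ms
print(f"cloud control loop:   ~{budget(cloud)} ms round trip")    # ~84 ms
```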
Operators need not wait for hordes of ultra-low-latency customers, though. Ascierto points out that many applications considered for edge actually live “quite happily” within a more middle zone. If a distributed application is mission- or life-critical, however, there can be a clear rationale.
Retail, healthcare and financial services already have distributed IT but need to upgrade their datacentre capacity as net new edge comes online. This can mean existing server closets with “half a rack and a bunch of networking kit” being replaced by micro-modular, all-in-one datacentres that pack huge capacity, cooling and backup power into a smaller footprint, says Ascierto.
An investment in multiple parts
Edge datacentres should be lights-out: remotely managed and highly instrumented. “Don’t think of your edge datacentre footprint as one site: design a whole function, because they will be sharing data across this mesh configuration,” says Ascierto.
Operators should invest in component-level resiliency, with complementary resiliency up the stack, to support remote monitoring through strong instrumentation: multiple sensors, multiple power meters and video feeds.
“You might want digital-eyes videos, power/cooling visual monitoring, and you’re going to see more robots and virtual reality technicians needing the right information – so a site can come back online through a remote fix, or as quickly as possible,” Ascierto says.
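As a minimal sketch of the instrumentation loop this implies – the sensor names, thresholds and read_sensor() stub are hypothetical, standing in for real SNMP, Redfish or DCIM/BMS integrations – a lights-out site might poll its meters and raise alerts for remote operators:

```python
# Minimal polling loop for a lights-out edge site. The sensor names,
# thresholds and read_sensor() stub are hypothetical placeholders.
import random
import time

THRESHOLDS = {
    "inlet_temp_c": 32.0,     # alert above this inlet temperature
    "power_draw_kw": 60.0,    # alert above this total draw
    "ups_charge_pct": 40.0,   # alert *below* this battery level
}

def read_sensor(name: str) -> float:
    """Stand-in for a real sensor read over SNMP/Redfish/etc."""
    nominal = {"inlet_temp_c": 24.0, "power_draw_kw": 48.0,
               "ups_charge_pct": 95.0}
    return nominal[name] + random.uniform(-2.0, 2.0)

def check_site(site_id: str) -> list[str]:
    """Compare each reading against its threshold and collect breaches."""
    alerts = []
    for sensor, limit in THRESHOLDS.items():
        value = read_sensor(sensor)
        breached = value < limit if sensor == "ups_charge_pct" else value > limit
        if breached:
            alerts.append(f"{site_id}: {sensor}={value:.1f} breaches limit {limit}")
    return alerts

for _ in range(3):                     # a real loop would run indefinitely
    for alert in check_site("edge-site-001"):
        print("ALERT", alert)          # in practice: feed the remote NOC/DMaaS
    time.sleep(1)
```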
Datacentre management as a service (DMaaS) can be worth looking at: buying it from a big supplier such as Schneider Electric can offer big-data analysis drawn from many thousands of customers.
Networking for the edge should be software-defined. Look at single-path routing with delay considerations for mission-critical applications, plus designed-in concurrency and testing. Testing is not just best practice at the edge but absolutely critical, according to Ascierto.
“Think about the level of redundancy you really need for an edge site. Because of the way edge IT is architected, you probably need far less redundancy, even for a mission-critical edge application, than you might assume. Again, test your service-level availability, customer infrastructure topology, validate your design, and identify the weak points,” she says.
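A back-of-the-envelope availability calculation illustrates why that holds: if several independent edge nodes can each serve the workload, modest per-site availability compounds quickly. The node counts and figures below are illustrative, not measured service levels:

```python
# Illustrative service availability when N independent edge nodes can each
# serve the workload. The figures are hypothetical, not measured SLAs.

def service_availability(node_availability: float, nodes: int) -> float:
    """The service is up if at least one node is up (independent failures)."""
    return 1 - (1 - node_availability) ** nodes

per_node = 0.99  # one node at 99% availability (~3.7 days of downtime a year)
print(f"1 node:  {service_availability(per_node, 1):.6f}")  # 0.990000
print(f"3 nodes: {service_availability(per_node, 3):.6f}")  # 0.999999
# Three modest 99% sites backing each other up out-perform one heavily
# redundant site - hence less redundancy needed per edge site.
```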
Ascierto also says to look at service resiliency and transaction orientation. Consider a mesh-type configuration, automated failover, and automated load-balancing, using alternative platforms and nodes. All those functions should be highly monitored and tested regularly – which, at scale, suggests a decision-making role for AI.
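A minimal sketch of that automated-failover pattern – the node URLs and the /healthz endpoint are hypothetical placeholders – routes each request to the nearest healthy node and falls back towards the core:

```python
# Minimal failover across alternative edge nodes. The node URLs and the
# /healthz endpoint are hypothetical placeholders.
import urllib.request

NODES = [
    "https://edge-a.example.net",   # nearest edge node, tried first
    "https://edge-b.example.net",   # neighbouring edge node
    "https://core.example.net",     # last-resort core site
]

def healthy(node: str) -> bool:
    """Probe the node's health endpoint; any error marks it down."""
    try:
        with urllib.request.urlopen(f"{node}/healthz", timeout=0.5) as resp:
            return resp.status == 200
    except OSError:
        return False

def pick_node() -> str:
    """Route to the first healthy node, preferring the nearest edge."""
    for node in NODES:
        if healthy(node):
            return node
    raise RuntimeError("all nodes down - alert a human operator")

print("routing traffic to", pick_node())
```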
“Mission-critical edge is really too complex for human beings,” Ascierto says. “But your decision of whether to do that in-house or as an outsourced service really depends on your business appetite and level of investment for those skills. In-house or not, you also need to be very comfortable with edge environments, remote data, resilience, and cloud environments.”
At scale, with many different edge environments to consider, the cost of computing and storing the data can be “unreasonable”, making it too expensive to do on-premise, she warns.
State of play
Ascierto points to big sports events offering AR add-ons. Smart masks, like 3D glasses, could receive and display team data via mobile: look at a player on the field in real life, and that player’s stats come up on your smart mask, for example.
“A lot of compute will travel with that team in a micro-modular datacentre. I was invited by a storage vendor to a big concert recently at Madison Square Garden, a major light show. That’s the same sort of thing: massive amounts of storage and compute that travel with the band,” she says.
Edge will evolve – from local deployments to regional sites and on to the core, from region to region, or from regional sites to the core. Increasingly, users won’t want to rely on the public wide-area network (WAN) to relay data between datacentres or integrate data from different applications, especially since IoT apps mean a lot of integrated data.
“Colocation provider VPNs and virtual interconnections are able to offer a kind of private routing,” Ascierto says. “You can track where the data is routed; it doesn’t go out on the internet and disappear into a black hole at the core.”
Edge computing startup Vapor IO signed a deal with network provider Cloudflare in January to roll out services on Vapor IO’s Kinetic Edge platform, which integrates edge colocation, networking and exchange services.
Nitin Rao, head of global infrastructure at Cloudflare, says the interconnection ecosystem includes small datacentres at wireless aggregation hubs, owned by investors. This will enable code to run in more places across a mobile network, thereby improving the performance of internet applications.
“It dramatically improves the experience of users beyond what was remotely possible with a traditional centralised datacentre,” says Rao. “Every piece of code runs on hundreds or thousands of machines by default. Scaling to millions of requests per second becomes trivial.”
Computation should be done and data stored close to the users interacting with it, yet modern distributed databases remain focused on the goal of replicating data everywhere rather than intelligently sharing it, he adds.
Matt Trifiro, chief marketing officer at Vapor IO, says that while Vapor IO is primarily focused on last-mile networking, edge means investing in the emergence of highly automated, remotely operated, lights-out facilities. “It requires a very different skill set to operate thousands of highly distributed small datacentres than a smaller number of large datacentres,” he says.
Benefits include latency reductions of hundreds of milliseconds, alongside improved reliability and resource savings. Caching content at the edge means it does not have to be fetched across the internet, or responses regenerated, on every request.
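A toy sketch of that caching pattern – the origin URL and TTL below are hypothetical – serves from a local store while an entry is fresh and only crosses the network on a miss:

```python
# Toy edge cache: serve locally while an entry is fresh, fetch from the
# origin on a miss. The origin URL and TTL are hypothetical placeholders.
import time
import urllib.request

ORIGIN = "https://origin.example.net"
TTL_SECONDS = 300
_cache: dict[str, tuple[float, bytes]] = {}

def fetch(path: str) -> bytes:
    """Return content for path, avoiding a WAN round trip on a cache hit."""
    now = time.monotonic()
    entry = _cache.get(path)
    if entry and now - entry[0] < TTL_SECONDS:
        return entry[1]                    # cache hit: served from the edge
    with urllib.request.urlopen(ORIGIN + path) as resp:
        body = resp.read()                 # cache miss: one trip to the origin
    _cache[path] = (now, body)
    return body
```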
Where it’s all heading
Peter Hewkin, founder of SmartEdge DC, agrees that the most efficient future points to a mesh-redundancy environment. Multiple edge nodes replicating data across a distributed network mean that if one fails, the others can pick up the load in fractions of a second. Fully lights-out, hardened micro-modular datacentres can be scattered across a city or small metro area – think 69kW every 40km.
“The edge is a location – it’s the edge of a network, and the only way of delivering is you’ve got to have them replicate what happened with mobile phones,” he says. “Yet 5G needs another 50,000 masts on top of the 34,000 or so in the UK. So you need tiny aerials on the side of buildings and so on.”
Ultimately, achieving this kind of “true” edge is costly, potentially requiring some sort of government funding if data is to become a utility, or if higher-end applications such as autonomous vehicles are to happen.
It’s probable, says Hewkin, that massive efficiencies remain untapped – not only around waste heat and the like, but in the practice of “bouncing” so much data around the world via subsea cables when it is only needed in one place.
Another idea entails working out how to “switch off” access to the vast, rarely used archives in datacentres. Hewkin suspects a solution might be devised around controlling retrieval hour by hour – similar to the way financial services companies are allocated access to share information.
“There’s a need to do something like that,” Hewkin says. “Or look at Facebook ‘cold storage’: it had to do something, because it has so much data. Everyone uploads all these photographs and shares them once, and then they have to hold them live. That’s what’s happened, the internet is stuffed with junk, which consumes energy.”
So, the better the efficiency gains, the more there will be to invest in innovation.
Read more about edge computing environments
- Edge computing, currently one of the most important trends in IT, is likely to complement cloud by supporting new and emerging workloads.
- Aiming to boost its plans to make truly autonomous networks a reality, Japanese operator Rakuten has opened a new lab in collaboration with Queen’s University Belfast to develop an edge computing hub.