Edge storage: What it is and the technologies it uses
Storage is moving from the datacentre to the edge. We define edge storage, look at the characteristics of edge computing that shape it and the technologies it requires
Large, monolithic datacentres at the heart of enterprises could give way to hundreds or thousands of smaller data stores and devices, each with its own storage capacity.
The driver for this is organisations moving their processes to the business “edge”. Edge computing is no longer simply about putting some local storage into a remote or branch office (ROBO). Rather, it is being driven by the internet of things (IoT), smart devices and sensors, and technologies such as autonomous cars. All these technologies increasingly need their own local edge data storage.
Industry analyst Gartner confirms that business data is moving from the datacentre to the cloud and the edge. The firm identifies four use cases for edge storage: distributed clouds and datacentres, data processing at the edge, content collaboration and access, and digital ingest and streaming.
This isn’t an exhaustive list – applications such as autonomous vehicles that sit outside enterprise IT are driving edge computing too. Meanwhile, industrial processes, sensors and IoT are all drivers that push more computing to the edge.
The market for edge storage is being shaped by changes in storage technology and by applications for edge computing. Increasingly, edge devices need persistent storage that is robust and secure, but applications also demand performance that goes beyond the SD or micro-SD cards found in early generation IoT devices and single board computers.
Where is the edge?
A few years ago, edge computing was most closely associated with remote or branch office (ROBO) deployments. For storage, ROBO was about providing at least some level of backup or replication to secure data, especially if a device failed, and caching or staging data before sending it to the datacentre for further processing. This batch-based approach worked well enough in retail and other environments with fairly predictable data flows.
But adding storage by way of a networked PC, a small server or a NAS device only really works in office or back office environments, because they are static, environmentally stable and usually reasonably secure.
Today’s business edge is much larger and covers much more hostile operating environments. These range from the factory floor, with edge devices attached to manufacturing equipment and power tools, to cameras and other sensors out in the environment, to telecoms kit and even vehicles.
Enrico Signoretti, an analyst at GigaOm, describes these environments as the industrial edge, remote edge or far edge. Storage needs to be reliable, easy to manage and – given the number of devices firms might deploy – cost effective.
Characteristics of an edge storage system
Edge applications require storage that is physically robust, secure both physically and virtually – often encrypted – and able to withstand temperature fluctuations and vibration. It needs to be persistent, but draw little power. In some cases, it also needs to be fast, especially where firms want to apply artificial intelligence (AI) to systems at the edge.
Alex McDonald, Europe, Middle East and Africa (EMEA) chair at the Storage Networking Industry Association (SNIA), says the “storage and memory product technologies that provide residences for edge-generated data include SSDs, SSD arrays, embedded DRAM [dynamic random-access memory], flash and persistent memory”.
In some cases, storage and compute systems need to be adapted to operate in a much wider range of environments than conventional IT. This requires physical robustness as well as security measures. Single-board computers, for example, often rely on removable memory cards. Although encryption can stop the data on a card being read, it will not prevent someone physically removing the memory module.
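To make that concrete, the minimal sketch below shows one way a device might encrypt readings before writing them to a removable card, so the data is unreadable to whoever walks off with the card. It assumes the Python cryptography package and a hypothetical mount point; in practice the key would be held in a secure element or TPM on the device, never on the card itself.

from cryptography.fernet import Fernet

# The key stays on the device (ideally in a secure element or TPM), never on the card
key = Fernet.generate_key()
cipher = Fernet(key)

reading = b'{"sensor": "temp-01", "value": 21.7}'

# Write only ciphertext to the removable card (hypothetical mount point)
with open("/mnt/sdcard/reading.bin", "wb") as f:
    f.write(cipher.encrypt(reading))

# Reading the data back requires the key held on the device
with open("/mnt/sdcard/reading.bin", "rb") as f:
    assert cipher.decrypt(f.read()) == reading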
“Ruggedised and enhanced specification devices will support environments that require additional safeguarding in embedded applications, from automotive to manufacturing,” says McDonald.
Organisations working with edge computing are also looking at storage class memory (SCM), NVMe-over-fabrics, and hyper-converged infrastructure (HCI).
Hyper-converged infrastructure, with its on-board storage, is perhaps best suited to applications that may need to scale up in the future. IT teams can add HCI nodes relatively easily – even in remote locations – without adding significant management overheads.
But for the most part, edge computing’s storage requirements are relatively small. The focus is not on multiple terabytes of storage, but on systems that can handle time-sensitive, “perishable” data that is analysed locally, passed on to a central system – usually the cloud – or both.
This requires systems that can act on the data immediately, such as running analytics, before passing it on to a central store or process. This data triage needs to be nimble and, ideally, close to the compute resources. This, in turn, has prompted interest in NVMe-over-fabrics and storage-class memory.
And, by putting some local storage into the device, systems designers can minimise one of edge computing’s biggest challenges – its demands on bandwidth.
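As a rough illustration of that triage, the sketch below – plain Python, with made-up values – analyses a window of raw sensor readings locally and forwards only a compact summary, so the bulk of the perishable data never has to cross the network.

from statistics import mean

def triage(readings: list[float], threshold: float = 80.0) -> dict:
    """Reduce a window of raw readings to a small summary record."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "alert": max(readings) > threshold,  # act locally on perishable data
    }

window = [71.2, 73.8, 90.1, 74.5]  # for example, one minute of raw samples
summary = triage(window)

# Only the few bytes in `summary` are sent to the central system or cloud,
# rather than every raw sample, cutting bandwidth use and latency
print(summary)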
Edge computing’s data problem
Organisations that want to add data storage to their edge systems do so, at least in part, to reduce demands on their networks and centralised datacentres, or to reduce latency in their processing.
Some firms now have so many edge devices that they risk overwhelming local networks. Although the idea of decentralised computing connected to the cloud is attractive, in practice network latency, the possibility of network disruption and even cloud storage costs have prompted device manufacturers to include at least some support for local storage.
A growing number of vendors also make edge appliances that work alongside (or, more accurately, just behind) IoT devices to gather data from them. Some are data transfer devices, such as Google’s Edge Appliance, while others take on some of the AI processing themselves, keeping it off the network.
By doing this, systems architects can provide a more robust form of edge computing. More data is processed near to the sensor or device, decisions can be made more quickly via analytics or AI, and the amount of data sent to the corporate LAN or cloud service can be vastly reduced.
Adding storage to the edge, directly or via appliances, also allows for replication or batch-based archiving and makes it easier to operate with intermittent or unreliable connections, especially for mobile applications. Jimmy Tam, CEO of Peer Software, says that some vendors are integrating hard disk drives in combination with SSDs to allow devices to store larger data volumes at a lower cost.
“In the case where the edge storage is mainly focused as a data ingestion platform that then replicates or transmits the data to the cloud, a larger proportion of storage may be HDD instead of SSD to allow for more data density,” he says.
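A minimal sketch of that split might look like the following, again in Python and with hypothetical mount points and thresholds: newly ingested files land on a fast SSD tier, and anything older than an hour is demoted to a denser HDD tier to await replication to the cloud.

import shutil
import time
from pathlib import Path

SSD_TIER = Path("/mnt/ssd/ingest")   # hypothetical fast ingest tier
HDD_TIER = Path("/mnt/hdd/staging")  # hypothetical high-density staging tier
MAX_AGE_SECONDS = 3600               # demote files older than an hour

def demote_cold_files() -> None:
    """Move aged files from the SSD tier to the HDD tier to free flash capacity."""
    if not SSD_TIER.is_dir():
        return
    HDD_TIER.mkdir(parents=True, exist_ok=True)
    now = time.time()
    for path in SSD_TIER.glob("*"):
        if path.is_file() and now - path.stat().st_mtime > MAX_AGE_SECONDS:
            shutil.move(str(path), str(HDD_TIER / path.name))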
Emerging storage technologies for the edge
It seems unlikely that any single storage technology will dominate at the edge. As Gartner notes in a recent research report: “Although edge storage solutions possess common fundamental principles, it is not a single technology, because it needs to be tailored to the specific use cases.”
Nonetheless, Gartner expects to see more data storage technology being “edge ready”, including datacentre technologies that work better with the demands of the edge.
IoT and other edge vendors will work to improve storage performance, especially by moving to server and workstation-class storage, such as flash, NVMe and NVMe-over-fabrics, as well as storage-class memory, rather than the removable SD or micro-SD cards used in earlier devices.
But the real focus looks set to be on how to manage ever larger numbers of storage-equipped devices. Developments such as 5G will only increase the applications for edge computing, so firms will look for storage that is not just rugged but self-healing and, at least in normal operations, can largely manage itself.