DevOps and storage: APIs and flexibility key
The details of back-end storage matter little to DevOps engineers – what counts is having enough of it, when they want it, and being able to manage it programmatically
Sitting at the intersection of software engineering and systems administration, DevOps engineers play a prized role in organisations. They act as a “force multiplier” for teams forward-thinking enough to deploy them, creating efficiencies that previously didn’t exist.
According to a survey commissioned by Dice and The Linux Foundation, 60% of hiring managers are currently trying to source DevOps professionals – a figure that underlines the rapid growth of this relatively new discipline.
But what do DevOps engineers need when it comes to storage? And how much scope is there for DevOps-oriented storage products and services?
Computer Weekly reached out to industry analysts and thought leaders to get their take.
The stack sets the tone
Patrick Callaghan, enterprise architect and strategic business advisor at DataStax, is somewhat cautious about field-specific products and services.
He notes that in any organisation that employs DevOps engineers, developers and architects will always set the tone. They will dictate the infrastructure that DevOps engineers will ultimately have to work around, and that includes storage technologies.
“In DevOps teams, developers and architects first tend to choose the database they want to work with to support their data, then look at storage options to support that database,” says Callaghan.
“Data created by applications tends to be a secondary consideration compared to getting the functionality working,” he adds. But this, he notes, can prove troublesome to DevOps engineers further down the line.
Fortunately, the latest developer trends show an overwhelming preference for open source products, as well as for NoSQL databases.
According to research from Sumo Logic, three of the top five databases deployed on Amazon Web Services (AWS) are NoSQL. Furthermore, only two databases in the top 10 are proprietary – namely Oracle and Microsoft SQL Server.
Open source database technologies tend to lend themselves well to DevOps practices, as they almost universally expose an application programming interface (API) that lets DevOps engineers write their own integrations and automation scripts.
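By way of illustration, the minimal sketch below polls an open source database’s management API from an automation script. It assumes an Elasticsearch node on its default endpoint at localhost:9200; the URL and the health-check use are illustrative assumptions rather than anything prescribed by the suppliers quoted here.

```python
# Minimal sketch: poll an open source database's management API from an
# automation script. Assumes an Elasticsearch node at localhost:9200
# (the default port); adjust the URL for a real deployment.
import sys

import requests

ES_URL = "http://localhost:9200"  # hypothetical local node


def cluster_is_healthy() -> bool:
    """Return True if the cluster reports green or yellow status."""
    resp = requests.get(f"{ES_URL}/_cluster/health", timeout=5)
    resp.raise_for_status()
    return resp.json().get("status") in ("green", "yellow")


if __name__ == "__main__":
    # Exit non-zero so a CI job or cron task can react to an unhealthy cluster.
    sys.exit(0 if cluster_is_healthy() else 1)
```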
The storage needs of DevOps
That notwithstanding, it’s evident that DevOps engineers have specific storage needs too.
And as the standing of DevOps builds, it’s inevitable their voices will become more pronounced as IT systems are architected. This is especially true when it comes to storage.
Forrester analyst Naveen Chhabra argues that scale-out storage is particularly well-suited to DevOps workloads.
“These architectures come with the right performance-cost equation for the clients,” he says, adding that they tend to offer the native API access required for automation. Without this, it would be impossible to provision, de-provision and scale in an efficient, programmatic manner.
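What that programmatic access can look like in practice is sketched below, assuming AWS block storage driven through boto3. The create_volume, delete_volume and volume_available waiter calls are standard boto3 operations, but the region, availability zone, size and tags are placeholder assumptions for the example.

```python
# Sketch of API-driven provisioning and de-provisioning, assuming AWS EBS
# via boto3. Region, zone, size and tags are illustrative placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")  # assumed region


def provision_volume(size_gib: int = 100) -> str:
    """Create a gp3 volume and wait until it is available."""
    volume = ec2.create_volume(
        AvailabilityZone="eu-west-1a",  # assumed zone
        Size=size_gib,
        VolumeType="gp3",
        TagSpecifications=[{
            "ResourceType": "volume",
            "Tags": [{"Key": "team", "Value": "devops-demo"}],
        }],
    )
    volume_id = volume["VolumeId"]
    ec2.get_waiter("volume_available").wait(VolumeIds=[volume_id])
    return volume_id


def deprovision_volume(volume_id: str) -> None:
    """Delete the volume once it is no longer attached."""
    ec2.delete_volume(VolumeId=volume_id)
```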
Chhabra says APIs sit at the heart of the entire DevOps lifecycle. Beyond provisioning, they’re used in change management, capacity and performance characterisation, and separation through de-provisioning.
Jon Topper, CEO of The Scale Factory, agrees. “In 2019, nobody should be setting up storage manually anymore,” he said.
He observes that most storage configuration and provisioning is done via scripted processes. The main advantage of this, he says, is that it ensures a level of consistency across environments, and permits good change control processes.
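One hedged sketch of how that consistency is often achieved: a version-controlled spec drives provisioning for every environment, so a change to capacity goes through review rather than ad-hoc console clicks. The spec values and the provision_volume helper below are hypothetical stand-ins, not any specific team’s tooling.

```python
# Sketch: a version-controlled spec drives storage provisioning for every
# environment, so capacity changes are reviewed like any other code change.
# The spec values and provision_volume() stub are hypothetical.

STORAGE_SPEC = {  # would normally live in a reviewed, versioned config file
    "dev":     {"size_gib": 50},
    "staging": {"size_gib": 100},
    "prod":    {"size_gib": 500},
}


def provision_volume(size_gib: int) -> str:
    """Stub; in practice this would call the storage API, as in the earlier sketch."""
    return f"vol-placeholder-{size_gib}"


def provision_environment(env: str) -> None:
    """Provision storage for one environment strictly from the spec."""
    if env not in STORAGE_SPEC:
        raise ValueError(f"Unknown environment: {env}")
    size = STORAGE_SPEC[env]["size_gib"]
    volume_id = provision_volume(size_gib=size)
    print(f"{env}: provisioned {size} GiB volume {volume_id}")
```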
Make no mistake, APIs are important. But the storage requirements of DevOps engineers transcend this one point. For those tasked with ensuring the optimal performance of operations and IT teams, storage must be malleable and support the creation of new environments based on the needs of developers.
“Think of the application development lifecycle,” says Chhabra. “Sometimes developers need a performant storage [system] for performance testing, while at other times they just need access to storage capacity and data.”
DevOps integrating with other disciplines
The still-vague nature of DevOps has permitted it to absorb other disciplines within the operations sphere, with security being a pivotal example.
Security and DevOps are increasingly integrated: security professionals work within DevOps teams, and DevOps teams work hand-in-glove with their counterparts in security. This isn’t especially surprising – the information security world has adopted the DevOps philosophy for its own ends, creating the discipline of DevSecOps.
Consequently, storage suppliers are integrating security features that are of use to both security and DevOps personnel, such as the ability to audit activity and to establish policy-driven permissions.
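As a hedged illustration of what such features look like when driven through an API, the sketch below enables access logging and applies a policy-driven permission on an S3 bucket using boto3’s put_bucket_logging and put_bucket_policy calls. The bucket names and the particular policy statement are assumptions for the example.

```python
# Sketch: enable audit logging and a policy-driven permission on an S3 bucket
# via boto3. Bucket names and the policy statement are illustrative assumptions.
import json

import boto3

s3 = boto3.client("s3")
DATA_BUCKET = "example-devops-data"   # hypothetical
LOG_BUCKET = "example-devops-audit"   # hypothetical; must already exist

# Ship access logs to a separate bucket so activity can be audited later.
s3.put_bucket_logging(
    Bucket=DATA_BUCKET,
    BucketLoggingStatus={
        "LoggingEnabled": {"TargetBucket": LOG_BUCKET, "TargetPrefix": "access/"}
    },
)

# Policy-driven permissions: deny any request that is not made over TLS.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyInsecureTransport",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [
            f"arn:aws:s3:::{DATA_BUCKET}",
            f"arn:aws:s3:::{DATA_BUCKET}/*",
        ],
        "Condition": {"Bool": {"aws:SecureTransport": "false"}},
    }],
}
s3.put_bucket_policy(Bucket=DATA_BUCKET, Policy=json.dumps(policy))
```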
Javvad Malik, an analyst and security awareness advocate, describes security and DevOps as “like two stars caught in each other’s gravity, spiralling ever closer to each other”.
“We may be some way from both disciplines fully integrating, but there is little doubt the movement to bridge the gap and integrate security programmes into DevOps is the future. By doing so, we should see the benefits of greater security being integrated into software and meeting the requirements of the ever-changing landscape,” he says.
It’s the technology. But also, it isn’t
Many of the respected industry experts and major tech analyst houses Computer Weekly spoke to noted that DevOps isn’t about technology, but rather about a way of working.
Or, put more simply, it’s not about the technology, but how you use it. There’s no silver bullet, no ideal tool. DevOps has as much to do with hyper-converged storage as it does with keyboards, they say.
With an emphasis on moving fast and reducing the burden of “busywork” in software engineering, it’s inevitable that procurement choices factor in the things needed to make this a reality. Ultimately, this boils down to API sophistication, as Forrester’s Chhabra touched on earlier.
Regardless of the paradigm – public cloud or private, on-premise or off-premise – what’s crucial is that DevOps engineers are afforded the freedom to do their work.
If a storage product lets DevOps engineers programmatically add and remove capacity as required, and integrate storage with other tools – such as those used for continuous integration and continuous delivery (CI/CD) or traffic management – it will potentially win their favour.
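To make the CI/CD point concrete, the sketch below creates a short-lived bucket for an integration-test run and tears it down afterwards – the kind of step a pipeline might call before and after a test stage. It assumes AWS S3 via boto3, and the naming convention and region are assumptions.

```python
# Sketch: ephemeral storage for a CI/CD test stage, assuming AWS S3 via boto3.
# The naming convention and region are illustrative assumptions.
import uuid

import boto3

s3 = boto3.client("s3", region_name="eu-west-1")  # assumed region


def create_test_bucket() -> str:
    """Create a uniquely named bucket for one pipeline run."""
    name = f"ci-run-{uuid.uuid4().hex[:12]}"
    s3.create_bucket(
        Bucket=name,
        CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
    )
    return name


def destroy_test_bucket(name: str) -> None:
    """Empty and delete the bucket once the test stage has finished."""
    # One page of listings is enough for a short-lived test bucket.
    objects = s3.list_objects_v2(Bucket=name).get("Contents", [])
    if objects:
        s3.delete_objects(
            Bucket=name,
            Delete={"Objects": [{"Key": o["Key"]} for o in objects]},
        )
    s3.delete_bucket(Bucket=name)
```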
There’s also the salient point raised by DataStax’s Callaghan about how DevOps engineers are often unable to dictate the architecture of the systems they work with. Separately, Topper added that storage is less of a concern for DevOps personnel than it perhaps is for those making the purchasing decisions.
“Today, delivery teams can be a bit more blasé about how they allocate storage, because infrastructure is way more fungible than it used to be,” he said.
“We’re seeing teams treat storage as essentially infinite, often keeping data around way longer than they need to. This is generally fine because the cloud doesn’t have the same capacity limitations as on-premise hardware, but there are costs associated with this.”
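One common way teams keep those costs in check – sketched below under the assumption that the data sits in S3 – is a lifecycle rule that expires objects after a set period, applied here through boto3’s put_bucket_lifecycle_configuration. The bucket name, prefix and retention period are assumptions for the example.

```python
# Sketch: stop paying indefinitely for data nobody needs, assuming S3 via boto3.
# Bucket name, prefix and retention period are illustrative assumptions.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-devops-data",  # hypothetical
    LifecycleConfiguration={
        "Rules": [{
            "ID": "expire-stale-artifacts",
            "Status": "Enabled",
            "Filter": {"Prefix": "build-artifacts/"},  # assumed prefix
            "Expiration": {"Days": 90},                # assumed retention
        }]
    },
)
```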
Read more about storage and DevOps
- DevOps brings continuous and agile application delivery. Storage has to fit in and make itself just as responsive to development that does away with the inertia of stifling hierarchies.
- Cloud storage and intelligent storage can help provide DevOps teams with the reliable, fast and flexible storage they require at every phase of the application delivery process.