Inside NetApp’s cloud and AI strategy

NetApp’s senior executives talk up the company’s efforts to support AI initiatives and deliver first-party storage services on public cloud platforms

The growing adoption of artificial intelligence (AI) has been driving demand for all manner of IT infrastructure, from cloud infrastructure and AI chips to storage systems that hold the crucial data needed to support AI initiatives.

NetApp, known for its network-attached storage (NAS) technology, has been reinventing itself to deliver a new breed of storage infrastructure offerings to support emerging cloud and AI workloads as it taps new market opportunities.

In an interview with Computer Weekly, NetApp president Cesar Cernuda and senior vice-president and general manager for Asia-Pacific Andrew Sotiropoulos outline the company’s strategy to cement its position in the market through first-party public cloud storage services, to support hybrid cloud workloads and to tap growth opportunities in the Asia-Pacific region.

NetApp has made some big moves over the past couple of years by offering first-party storage on public cloud services and supporting AI initiatives. What is the broader strategy to cement your position in the market?

Cernuda: We’ve been recognised as one of the top storage companies for many years and that’s not a minor achievement, with thousands of customers trusting our technology to store their most important data.

A lot has changed over the past 10 years. The world is moving more into the cloud, and our customers that store data in their own datacentres have started to move their data and applications to hyperscale datacentres. That requires our technology to become cloud-native as well.

And so, we made a bold decision to embrace cloud, even as many of our competitors tried to convince customers that the cloud was more expensive and less secure. We told our customers that we were going to support their journey to cloud, and that they could decide which applications to move and how fast to do it.

Our job is to make sure our technology is an enabler and not a stopper, so we brought ONTAP – our operating system for storage – natively to each of the hyperscalers as a first-party service. We’re the only company that has managed to do that. This has given us a huge leap forward in helping customers with their digitisation and overall cloud journey.

In the past 18 months, everyone has been talking about AI and generative AI [GenAI], but what’s behind all of that is the data. If you don’t have the right technology for storing the data, it is going to be very difficult to use your models with the right performance, security and reliability.

Whether customers run their models and store unstructured data in a public cloud, private cloud or on-premise environment, we’re with them in their AI journey. AI runs on data, and data runs on NetApp – so AI runs on NetApp. That’s our strategy.

Can you dive deeper into why having a first-party storage service is important to customers?

Cernuda: There are many elements to that. There are customers that want to run their applications on-premise while others want to run everything on the cloud. With many of our customers doing hybrid cloud, we’ve seen applications being moved to the cloud and some repatriations back on-premise as well.

So, what do they do with storage? For customers that want to run applications on public cloud and manage their storage on their own, we can give them Cloud Volumes ONTAP. But if they want the hyperscalers to run it for them, they’ll need a first-party service.

That’s why we worked with Microsoft on Azure NetApp Files, and we have similar services with Amazon Web Services (AWS) and Google Cloud. If you want to run an enterprise application on Azure that’s mission-critical, in most cases you’re going to end up using Azure NetApp Files because that’s the only way that you can run that application in the cloud natively.
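
To make the first-party model concrete, the sketch below shows one way a team might provision an Azure NetApp Files volume programmatically, using Microsoft’s azure-mgmt-netapp Python SDK rather than the portal. It is a minimal sketch only: it assumes a NetApp account, capacity pool and delegated subnet already exist, the resource names are placeholders, and the exact model fields should be checked against the current SDK documentation.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.netapp import NetAppManagementClient
from azure.mgmt.netapp.models import Volume

# Placeholder identifiers - substitute your own subscription, resource group,
# NetApp account, capacity pool and delegated subnet.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
SUBNET_ID = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/demo-rg"
    "/providers/Microsoft.Network/virtualNetworks/demo-vnet"
    "/subnets/anf-delegated"
)

client = NetAppManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create (or update) a 100 GiB NFSv3 volume in an existing capacity pool.
poller = client.volumes.begin_create_or_update(
    "demo-rg",           # resource group
    "demo-anf-account",  # Azure NetApp Files account
    "demo-pool",         # capacity pool
    "demo-vol1",         # volume name
    Volume(
        location="southeastasia",
        creation_token="demo-vol1",       # becomes the mount/export path
        usage_threshold=100 * 1024 ** 3,  # volume quota in bytes
        subnet_id=SUBNET_ID,
        service_level="Premium",
        protocol_types=["NFSv3"],
    ),
)
print(poller.result().id)
```

The resulting volume is mounted over NFS like any other ONTAP export, which is what allows existing applications to move to the cloud without being re-architected.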

Can you provide some regional insights on how customers have been responding to this strategy?

Sotiropoulos: We’re strong with customers that have been with us for a long time and want the choice to move on-premise or off-premise. Over the past 10 years, the propensity to be cloud-first has evolved across customers in APAC. For an ONTAP customer, the first-party story makes a lot of sense because we want to be able to provide choices for customers as they formulate their data strategy on what they want to put on-premise and off-premise.

Cernuda: I was talking to an important customer that was moving some applications to the cloud and they saw how complex it was, particularly with data management. They also wanted to start using GenAI, but it’s super expensive to run all of that on hyperscalers with their data on-premise. What they would love to do is to be able to run applications on different hyperscalers and avoid having different data models, with a single operating system for storage.

That’s a huge enabler because one of the biggest challenges they face is re-architecting their applications as they move from one environment to another. As companies prepare to shift to cloud, they can move their on-premise environment to NetApp first, then move their applications to the cloud while still being able to run and manage their storage. We have thousands of customers already running their storage with us using our technology on Microsoft Azure, AWS or Google Cloud.

Besides providing access to a first-party storage service on hyperscalers, what else is NetApp doing to help companies manage the cost of GenAI workloads?

Sotiropoulos: There are regional and global aspects to that. In the regional context, as you would expect, it’s not a straight line. Some customers are deploying GPUs [graphics processing units] locally in small pods to do in-house testing before they go into production. Others are using Nvidia cloud providers because they don’t have the cost structure to do it themselves.

AI, like cloud, is such a broad technology that you have got to personalise it. How do I deploy it in an insurance environment? Or a customer-centric environment within a bank? There’s a continuum of AI being deployed depending on the industry, and everyone is working their way through that.

To help customers deploy AI efficiently for specific environments, we’re partnering with Nvidia as part of our ongoing relationship. AI is at the same early stage that cloud was over 10 years ago – there’s a desire to use it, but it’s not yet clear what it means for a bank, an insurance company or an automaker. There’s a lot of testing and application rewrites going on at the moment.

Cernuda: This is very important from our point of view. Once again, our commitment is to let you as a customer decide how you’re going to run these models in a public cloud. You might end up saying that the public cloud is expensive and want to do things on-premise. That’s where we’re serving our customers and that’s the sweet spot for us. While we have great storage technology, customers are turning to us to deploy an intelligent data infrastructure platform that is increasingly hybrid.

Andrew, you talked about how different organisations are adopting GenAI – does NetApp have a strategy to develop solutions for different industries?

Sotiropoulos: Globally, we are looking at industries like healthcare, as well as financial services, where banks are using bots to free up people to provide higher-value services. Then there are market-specific ones, such as gaming, automotive and manufacturing in Korea, and the semiconductor industry in Taiwan. It varies across Asia-Pacific, and we’ll try to be in spaces that we think are going to be global from a capability point of view. We’re also offering the same value proposition with ONTAP, connecting to the data strategy of our customers across industries.

With the focus on data and AI, how have your customer profiles changed over time? Besides storage admins, are you increasingly engaging data science teams?

Cernuda: Yes, we used to talk to infrastructure and storage people, but things have changed. When you think about AI and GenAI, you’re talking to the chief data officer. You’re also talking to chief financial officers about how you’re going to optimise their overall costs. We’re talking to data scientists, chief data officers and chief technology officers.

Sotiropoulos: When you offer enterprise storage alone, you tend to talk to storage admins and heads of infrastructure. But as you move into the cloud environment, we get into different personas, such as chief information security officers [CISOs] with our ransomware protection offerings.

In the block storage environment, we talk to people who are developing applications on Kubernetes, because as they are developing their application strategy, they need to understand the storage platform that they’re going to be putting the application on and whether it’s on-premise or off-premise. The personas we’re talking to have evolved over the past few years simply because our offerings are expanding.
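
As an illustration of that Kubernetes conversation, the sketch below uses the official Kubernetes Python client to define a storage class backed by NetApp’s Trident CSI provisioner and a persistent volume claim that an application could mount. It assumes Trident is already installed in the cluster and that a backend of type ontap-nas has been configured; the class and claim names are illustrative.

```python
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

# A StorageClass backed by NetApp's Trident CSI provisioner.
# Assumes Trident is installed and an "ontap-nas" backend is configured.
sc = client.V1StorageClass(
    metadata=client.V1ObjectMeta(name="ontap-nas"),
    provisioner="csi.trident.netapp.io",
    parameters={"backendType": "ontap-nas"},
    allow_volume_expansion=True,
)
client.StorageV1Api().create_storage_class(body=sc)

# A claim an application pod can mount; Trident provisions the underlying
# ONTAP volume from whichever backend the storage class points at.
pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="app-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],
        storage_class_name="ontap-nas",
        resources=client.V1ResourceRequirements(requests={"storage": "100Gi"}),
    ),
)
client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```

The claim pattern stays the same whether the backing ONTAP system sits on-premise or behind a hyperscaler’s first-party service; broadly, only the Trident backend configuration changes.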

When I spoke to NetApp executives several years ago, the conversation was more about NetApp being a data platform, a term that’s also used by the likes of Snowflake and Databricks today. Is that still on the cards?

Sotiropoulos: It was relevant and is still partly relevant. But as the world continues to evolve, we see customers deploying capabilities today that didn’t exist five years ago. We’ve lifted the story: the data fabric was foundational back in the day, but now we’re going into the intelligent part of it, because customers want to know how to use the data they are generating, where to store it optimally, and how to access it whenever they need it.

And underpinning all that is how simple it is to deploy and whether it is cost-prohibitive. We have offerings across all of that – the ability to optimise your data where it’s sitting, the ability to store it on-premise, and to work through the three hyperscalers – and we feel we provide differentiation via the simplicity of being able to support your data strategy, whether you’re a CISO, a storage admin or a team doing application modernisation.

Cernuda: Snowflake will benefit from the infrastructure that we have built, and we don’t compete there. Now, you might say that we also provide some data services on storage that somehow intersect with Snowflake. I’ll say, sure, but Snowflake needs the intelligent data infrastructure that we’re strong at, and we’re willing to have customers use Snowflake on top of that. It’s true that as you build some of the data services, you’d start using words like data platform. But as the world evolves, we’re positioning ourselves as the layer on which data platforms can operate.

What is the impact of NetApp’s strategy on the company’s growth so far?

Cernuda: I will highlight two areas which I think are very important. One is the overall growth around flash. That has been our big bet and we’ve been winning share in our flash business in two big areas – file storage in the NAS environment and block storage in other environments. Historically, we have been known as a market leader in NAS, but the reality is that out of our 40,000 customers worldwide, 20,000 are already using block storage with us because of our vision of unified storage.

And out of those 20,000 customers, 5,000 are using only block storage. We recognised that, so we made an announcement in the mid-range space with ASA [All-Flash SAN Array], which is block storage for customers that don’t need file services. But still, our vision of being an intelligent data infrastructure company is all about unified storage, which can be used with our ASA and AFF systems.

The second area is cloud storage. Three years ago, we were a $100m cloud ARR [annual recurring revenue] company. Today, we’re over $600m and the main driver of that growth has been around cloud storage.

Our growth lines are going to be around public cloud storage and on-premise storage. We’re also going to see a lot of growth from storage as a service, as well as platform as a service where storage is a component. A lot of new customers are looking for options to either manage their storage on their own, consume it as a service or have a third party manage it for them. There are different elements, but there’s going to be more growth from subscriptions as well.
