NetApp Insight 2024 - Live show report: day zero

Intelligent data infrastructure company NetApp staged its annual user, practitioner & partner conference this week in Las Vegas. 

Attending a “day zero” pre-event session of briefings before the show proper kicked in, the Computer Weekly Developer Network team dove deep into all things data to learn how the backbone that we all depend upon is increasing in functionality, broadening in terms of application touchpoints and… of course, becoming more intelligent. 

First up was George Kurian, NetApp chief executive officer. Talking about how one of his firm’s core tenets is to co-innovate with industry partners, Kurian was upbeat about three developments: firstly, our current interest in tracking human behaviour with AI; secondly, our progression to now analysing that behaviour; and thirdly, our ability to crunch these processes with the mathematical tools we now have at our disposal, with the benefit of computational assistance.

NetApp is now working hard to deliver what it says is intelligent infrastructure for AI, with the ability to implement security policies on data throughout its entire lifecycle i.e. specifically, as that data is ingested into AI models and then channelled for onward use.
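To illustrate the kind of lifecycle control being described here, the sketch below is a minimal, generic Python illustration of a policy check applied to records before they reach an AI ingestion pipeline. The classification tags, the `INGESTION_POLICY` table and the `filter_for_ingestion` helper are assumptions for the sake of the example and are not NetApp's implementation.

```python
# Hypothetical sketch: enforce a data classification policy at the point of AI ingestion.
# The tags and policy table are illustrative assumptions, not NetApp's actual policy model.
from dataclasses import dataclass

@dataclass
class Record:
    source: str
    classification: str  # e.g. "public", "internal", "restricted"
    text: str

# Which classifications may flow into which model tiers (assumed example policy).
INGESTION_POLICY = {
    "public": {"external-llm", "internal-llm"},
    "internal": {"internal-llm"},
    "restricted": set(),  # never ingested into any model
}

def filter_for_ingestion(records, model_tier):
    """Yield only the records whose classification permits ingestion into model_tier."""
    for record in records:
        allowed_tiers = INGESTION_POLICY.get(record.classification, set())
        if model_tier in allowed_tiers:
            yield record

# Example usage: only public and internal data reaches the internal model.
records = [
    Record("crm", "restricted", "customer PII ..."),
    Record("wiki", "internal", "engineering runbook ..."),
    Record("website", "public", "product overview ..."),
]
for r in filter_for_ingestion(records, "internal-llm"):
    print(f"ingesting {r.source} ({r.classification})")
```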

“The idea of using machines to analyse data has made a lot of progress in the last five years. Firstly the software algorithms that power large language models have exhibited a very human-like ability to perform … and they can perform across multi-modal use cases (i.e. via voice or text or images or other),” said Kurian. “Plus, they have also moved to work so much more competently with unstructured data that might have previously been thought of as out of reach in terms of the work handled by the data science and software application development teams.”

Liking the lists

Harv Bhela, chief product officer, took over from Kurian. Keen to follow his CEO’s love of number-based talking points (Kurian likes to start his soundbites with enumerated highlights to engage the listener, or perhaps he just likes ordinal classification and separation), Bhela detailed NetApp’s five core areas of data storage work: unified data storage; cloud storage; anti-ransomware; AI (and the challenge of scaling AI toward enterprise workloads incorporating the use of unstructured data); and, finally, a drive to make product and service consumption easy.

As NetApp now builds AI functions inside its storage platform, Bhela says that developers will be able to use data as a competitive advantage in new ways. NetApp has one unified architecture that it delivers across core, datacentre and cloud, and the company also delivers additional functionality across that platform with features focused on anti-ransomware and more.

Eminently interoperable unified data 

Taking over presenting responsibilities, Sandeep Singh, GM for enterprise storage, spoke about how data can be made “AI-ready”. That is a key part of how the NetApp unified data storage portfolio is now working to be eminently interoperable (a key function in helping to eradicate data silos) and to offer products with management consistency across NAS and SAN environments, in a way that overcomes whatever skills gaps may exist. 

As well as focusing on delivering new products aimed at modernising legacy disk storage, NetApp’s Singh said that, with firms typically working with somewhere around 60 percent “cold data” (seldom used) in their data fabric, intelligent data storage that is costed out properly is key.
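As a rough illustration of the kind of cost-aware tiering Singh alludes to, the Python sketch below classifies datasets as hot or cold by last access time and estimates the saving from moving cold data to a cheaper tier. The per-gigabyte prices and the 90-day threshold are assumptions for illustration only, not NetApp figures or features.

```python
# Hypothetical sketch: classify datasets as hot or cold by last access time and
# estimate the monthly saving from tiering cold data to cheaper storage.
# Prices and the 90-day threshold are assumptions for illustration only.
from datetime import datetime, timedelta

HOT_COST_PER_GB = 0.10   # assumed $/GB/month on the performance tier
COLD_COST_PER_GB = 0.01  # assumed $/GB/month on the capacity/object tier
COLD_THRESHOLD = timedelta(days=90)

def tiering_saving(datasets, now):
    """Return (cold_gb, estimated monthly saving) for datasets = [(size_gb, last_access), ...]."""
    cold_gb = sum(size for size, last_access in datasets
                  if now - last_access > COLD_THRESHOLD)
    return cold_gb, cold_gb * (HOT_COST_PER_GB - COLD_COST_PER_GB)

# Example estate where roughly 60 percent of the data is cold.
now = datetime(2024, 9, 1)
datasets = [
    (4000, datetime(2024, 8, 20)),   # recently accessed: hot
    (3000, datetime(2023, 11, 1)),   # untouched for months: cold
    (3000, datetime(2024, 1, 15)),   # untouched for months: cold
]
cold, saving = tiering_saving(datasets, now)
print(f"{cold} GB cold, ~${saving:,.0f}/month saved by tiering")
```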

A disaggregated data storage architecture is designed to enable large language models to work in more efficient ways. 

CloudOps, set to change

Haiyan Song, EVP for the CloudOps business, took the stage to explain how the company is now developing intelligence services for infrastructure and workloads. She noted that CloudOps has until now been a standalone business for NetApp, but it is now being developed in tandem with the company’s storage platform technologies. 

A new service introduced here is NetApp Advisor for ASA, which provides real-time topology insight for SAN environments, with the aim of optimising data structure insights through intelligence and AI services. NetApp says it is bringing generative AI services to wherever an organisation’s data resources are located. There is also a lot of work happening here focused on open source technologies (much of the effort will inevitably be focused on licensing management) and, because enterprises typically use multiple open source solutions, NetApp has architected its data storage delivery capabilities for this distribution of tools. 

Work here includes technologies designed to help customers optimise their virtual machine deployments. Going forward, NetApp CloudOps will be known as NetApp intelligence services for infrastructure and workloads.

Taking the lead for the next section of day zero, Pravjit Tiwana, GM for cloud storage, said that what really excites the company is the global ability to work with all three major hyperscalers: Google Cloud Platform, AWS and Microsoft Azure.

“The array of different workloads that customers want to run these days is so diverse, so we aim to be the platform of choice below every use case whether it is in streaming, transactional applications, games development and so on,” said Tiwana. “We want to bring the AI stack to the data store however it is used and wherever it is used. The NetApp generative AI toolkits for Azure and GCP clouds can be used to train models and develop AI for future innovation. The whole area of AI is one that we will now focus on a lot, especially in environments where we need to work with unstructured data. With cost optimisation and price-performance always an ongoing concern, NetApp will continue to highlight our capabilities in these areas across workloads, across hypervisors.”

Tiwana also spoke about NetApp’s focus on operations, security and availability as primary areas of competencies before giving way to the next speaker, Krish Vitaldevara, GM for the NetApp platform.

NetApp ONTAP storage 

Vitaldevara detailed partner work including new developments with Nvidia. In specific terms, NetApp has begun the Nvidia certification process for NetApp ONTAP storage on the AFF A90 platform with Nvidia DGX SuperPOD AI infrastructure, a move designed to help organisations use leading data management capabilities for their largest AI projects. 

Hoseb Dermanilian, global head of AI go-to-market & sales, rounded out the discussion in this day zero session. Dermanilian’s session was snappy and on message, with an over-arching theme of development designed to ensure that, as NetApp AI services are rolled out, they at no stage create any additional data silos.

Sustainable storage

It’s always good to finish with a solid note on sustainability and this is a particularly pertinent topic in data storage, not least because of our increasing need for datacentre services in the age of artificial intelligence. 

Given the use of storage services in the context of this whole story… and the fact that some 60 percent of storage might be cold storage, NetApp says it is working not only to apply intelligence to data to service AI functions that filter upwards to the user level; it is, crucially, also working to make sure AI is embedded at the platform layer so that data is stored in the most intelligent, cost-efficient and therefore sustainable way possible.
