UPDATED 12:32 EDT / SEPTEMBER 19 2019

AI

Q&A: High-performance flash is key to unlocking data-intense AI workloads

Artificial-intelligence workloads are intrinsically data-intensive. They need massive amounts of information to train models, running countless scenarios to produce accurate predictions. Storage therefore becomes a key consideration for powerful AI applications, which must be able to run where the data resides, even across varied and distributed computing environments.

Flash storage offers multi-dimensional performance that can handle files and workloads of any size without creating storage-related bottlenecks. Two IT leaders in high-performance computing, Nvidia Corp. and Pure Storage Inc., saw these demands from customers. In response, they joined forces to create AIRI, an AI-ready infrastructure that can help unlock data intelligence.

“You know, a lot of it comes from our customers,” said Charlie Boyle (pictured right), vice president and general manager of DGX Systems at Nvidia. “That’s how we first started with Pure. It’s our joint customer saying we need this stuff to work really fast. They’re making a massive investment with us in computing. And so if you’re going to run those systems at 100%, you need storage that can feed them. If the customer has data, we want it to be as simple as possible for them to run AI.”

Boyle and Brian Schwarz (pictured left), vice president of product management at Pure Storage, spoke with Dave Vellante (@dvellante) and Lisa Martin (@LisaMartinTV), co-hosts of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, during the Pure//Accelerate event in Austin, Texas. They discussed the Nvidia and Pure Storage partnership, the adoption of AIRI, and AI advancements in the industry (see the full interview with transcript here). (* Disclosure below.)

[Editor’s note: The following answers have been condensed for clarity.]

Martin: Give us an overview of where Pure and Nvidia are.

Schwarz: It really was born out of work with mutual customers. We brought out the FlashBlade product. Obviously, Nvidia was in the market with DGXs for AI, and we really started to see overlap in a bunch of initial [AI] deployments. So that’s really kind of where the partnership was born. And, obviously, the AI data hub is the piece that we really talked about at this year’s Accelerate.  

Martin: Tell us about the adoption [of AIRI] and what customers are able to do with this AI-ready infrastructure? 

Boyle: [Early customers] had been using storage for years, and AI was kind of new to them, and they needed that recipe. So the early customer experiences turned into AIRI the solution. And the whole point of it is to simplify AI. 

AI sounds kind of scary to a lot of folks, and the data scientists really just need to be productive. They don’t care about infrastructure, but IT has to support this. So IT was very familiar with Pure Storage. They used them for years for high-performance data, and as they brought in the Nvidia compute to work with that, having a solution that we both supported was super important to the IT practitioners. 

Vellante: How do you see the landscape? Are you seeing pretty aggressive adoption … or is it still early?

Boyle: So, every customer is at a different point. There are definitely a lot of people that are still early, but we’ve seen a lot of production use cases. So depending on the industry, it really depends on where people are in the maturity curve. But really our message out to the enterprise is start now; whether you’ve got one data scientist or a community of data scientists, there’s no reason to wait on AI.

Vellante: What are the key considerations for getting started?

Schwarz: I think understanding the business value creation problem is a really important step. And many people go through an early stage of experimentation — a prototyping stage before they go into a mass-production use case. It’s a very classic IT adoption curve. 

If you look forward over the next 15 to 20 years, there’s a massive amount of AI coming, and it is a new form of computing: GPU-driven computing. And the whole point of AIRI is getting the ingredients right so this new set of infrastructure has storage, network, compute, and the software stack.

Martin: For other customers in different industries, how do you help them even understand the AI pipeline?

Boyle: A lot of it is understanding your data, and that’s where Pure and the [AI] data hub come in. And then formulate a question like, what could I do if I knew this thing? Because that’s what AI and deep learning are all about. It’s coming up with insights that aren’t natural when you just stare at the data. How can the system understand what you want? And then what are the things that you didn’t expect to find that AI is showing you about your data? AI can unlock things that you may not have pondered yourself. 

And one of the biggest aha moments that I’ve seen in customers in the past year or so is just how quickly by using GPU computing they can actually look at their data, do something useful with it, and then move on to the next thing. So, that rapid experimentation is what AI is all about.

Vellante: You can’t help but run into [machine intelligence]; it’s going to be part of your everyday life. Your thoughts?

Boyle: We all use AI every day; you just don’t know it. It’s the voice recognition system getting your answer right the first time … all in less than a second. Before, you’d talk to an IVR system, wait, then go to an operator; now people are getting a much better user experience out of AI-backed systems.

Vellante: The AI leaders … are applying machine intelligence to that data. How has this modern storage that we heard about this morning affected that customer’s abilities to really put data at their core?

Schwarz: I think one of the real opportunities, particularly with flash, is to consolidate data into a smaller number of larger islands of data, because that’s where you can really drive the insights. Historically, in a disk-driven world, you would never try to consolidate your data, because there were too many bad performance implications of trying to do that. The difference with flash is there’s so much performance at the core of it, at the foundation of it.

Martin: I want to ask you about distributed environments. Customers have so much choice: on-prem, hosted, SaaS, public cloud. What trends are you seeing?

Schwarz: The first thing I always tell people is, where’s your data gravity? Moving very large sets of data is actually still a hard challenge today. So running your AI where your data is being generated is a good first principle. The second thing is about giving people flexibility. So trying to use a consistent set of infrastructure and software and tooling that allows people to migrate and change over time is an important strategy.

Vellante: So, ideally, on-prem versus cloud implementations shouldn’t be different … but are they today?

Boyle: At the lowest level, there are always technical differences, but at the layer that customers are using it, we run one software stack, no matter where you’re running. It’s the same in the Nvidia software stack. And it’s really [about] running AI where your data is. 

Vellante: Now that you’ve been in the market for a while, what are Pure’s competitive differentiators?

Schwarz: Why do we think [flash] is a great fit for an AI use case? One is the flexibility of the performance — we call it multi-dimensional performance. Small files, large files, metadata-intensive workloads: FlashBlade can do them all. It’s a ground-up design that’s super flexible on performance, but also, more importantly, I would argue simplicity is a real hallmark of who we are.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of the Pure//Accelerate event. (* Disclosure: TheCUBE is a paid media partner for the Pure//Accelerate event. Neither Pure Storage Inc., the sponsor for theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
