SNIA Experts on Data

How AI is Challenging Infrastructure

SNIA Episode 7



AI is upon us, and enterprises today are looking at leveraging its power while addressing the challenges it creates. In this episode, SNIA expert David McIntyre discusses how organizations are building out AI solutions from an infrastructure standpoint. He details real-world concerns around security, data privacy, and the balancing act of managing AI performance requirements with costly power consumption that may not align with organizations’ sustainability goals. Tune in for key considerations for a data center infrastructure optimized to support AI.

About SNIA:

SNIA is an industry organization that develops global standards and delivers vendor-neutral education on technologies related to data. In these interviews, SNIA experts on data cover a wide range of topics on both established and emerging technologies.

Speaker 1

Welcome to the SNIA Experts on Data podcast. Each episode highlights key technologies related to handling and optimizing data.

Speaker 2

Hello, I'm Mark Brown, your host. My guest is David McIntyre, a member of the SNIA organization who works at Samsung in planning and solutions enablement, I believe in the data sector. Is that correct, David?

Speaker 3

That's very correct, yep. So I have the exciting opportunity to focus on applications like AI, which we'll talk about here, and, coming from the software and hardware side, on how we build out solutions from an infrastructure standpoint that support these exciting AI applications. At SNIA, I'm on the board and I chair a number of groups where we dive into this topic of infrastructure management, compute, memory, storage, and networking resources, and also some of the real-world concerns, including security and sustainability. We can talk about some of those topics as well during our chat here.

Speaker 2

So, from a top-down standpoint, can you provide an overview of the current state of AI adoption within the IT sector? It seems new, and it's all abuzz in the media, but what's it like on the ground in terms of AI adoption for the business sector in IT?

Speaker 3

Yeah, so I think we're bridging over the hype curve. That was the only thing being talked about just a couple of months ago, and in my career I've never seen something grow so quickly. However, I do see now that companies are really saying, okay, we need to address this market; it's a huge serviceable available market, but how do we do so? Some of the practical challenges, especially in the enterprise, are around security implementation, meaning the protection and privacy of data, and then how to optimize our infrastructures from a data center standpoint. That's one consideration, and we'll start there. I've got others.

Speaker 2

Yeah, so what are some of the impactful things that are coming from a security standpoint with AI that you see happening for companies and businesses?

Speaker 3

From a security standpoint, it's ensuring that when you move data from storage to compute resources, and cache that data in memory for large language models, or even for the smaller language models we're now seeing a trend toward in more distributed architectures, that data remains readily protected.

Speaker 3

There are many open source tools supporting AI today, but to get a win in a data center running enterprise applications, those tools have to meet the stringent requirements of ISO standards and other regulations that best support enterprise security.

Speaker 3

So I think there are many, many POCs, and obviously deployments from a GPU standpoint in the data center now. But the security communities are very cognizant and cautious about protecting their clients' and customers' privacy and the security of the data being computed upon. In fact, at SNIA we have a security task force that focuses on just these types of challenges, and our volunteer security experts are some of the leading developers in security across the overall ecosystem. They're bringing in this caution that I'm referring to: if you're moving, storing, or computing upon the data, how does it remain secure, and private where required? AI poses increasing challenges there because of its performance demands, but that doesn't make it an exception to these requirements, and that's what SNIA and our security group are looking at today.

Speaker 2

Okay, so are there any examples we can share of AI-driven automation that has been streamlined? Or, for SNIA members, what would typically be the main concerns they should be addressing if they're moving into that AI realm for IT?

Speaker 3

So the data can be collected and computed upon in the data center, in the cloud, on-prem, or at the edge, and having specific architectures that address those scenarios, and diving into the security aspects of those architectures, is quite important. Computational storage is one example. SNIA released a Computational Storage 1.0 specification and has also provided a reference API, a platform that allows software developers to engage with the underlying hardware environment. From a security standpoint, as I mentioned, our leaders there are looking at real-world use cases, from data analytics to video and large language models, on how to best address security across both software and hardware, and they share those insights with the overall community.

Speaker 2

Okay, so as AI is evolving, do you see a steep learning curve for the IT sector in terms of adoption? From your perspective and experience, are we on top of it? Speaking to someone else regarding financial transactions, I heard that some of the IT infrastructure on board isn't quite up to par. Is that the case with AI, or are we in a better position?

Speaker 3

So having very strong expertise in our SNIA community on traditional compute, memory, and storage resources does give us that foundation.

Speaker 3

However, with AI there are aspects that are fundamental and need to be addressed, but it is also a new application layer; especially the applications that are part of the AI universe need to be considered.

Speaker 3

So one of the challenges from an IT perspective is that, whereas before we were focusing on centralized compute with CPUs, now we have highly tuned accelerator technologies, whether GPU-, TPU-, or FPGA-based, a number of options that each have their own onboarding challenges in the data center. There's also a significant requirement to balance the performance aspects against power consumption, which we could talk about here. I would say it's one of the sleeping elephants in the room: power consumption across hundreds and hundreds of kilowatts, adding up to megawatts, is a very significant problem that needs to be tackled by the IT community. SNIA is also looking at that, to the point of leading sustainability topics, where we need to consider performance per watt as a primary metric when deploying and recommending such new architectures.
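To make the performance-per-watt idea concrete, here is a minimal sketch of how an evaluator might rank accelerator options by that metric. The device names, throughput, and power figures are illustrative placeholders, not vendor specifications or SNIA recommendations.

```python
# Hypothetical sketch: ranking accelerator options by performance per watt.
# All names and numbers below are made-up placeholders for illustration.

accelerators = [
    # (name, throughput in TFLOPS, power draw in watts)
    ("gpu_a", 300.0, 700.0),
    ("tpu_b", 250.0, 450.0),
    ("fpga_c", 90.0, 150.0),
]

def perf_per_watt(throughput_tflops, power_watts):
    """Performance per watt: higher is better from a sustainability view."""
    return throughput_tflops / power_watts

# Sort best-first by the sustainability metric rather than raw throughput.
ranked = sorted(accelerators, key=lambda a: perf_per_watt(a[1], a[2]), reverse=True)

for name, tflops, watts in ranked:
    print(f"{name}: {perf_per_watt(tflops, watts):.3f} TFLOPS/W")
```

Note that ranking by raw throughput alone would pick a different winner than ranking by performance per watt, which is exactly the tension the speaker describes.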

Speaker 2

Okay, that's a big topic, especially in terms of Agenda 2030. There are a lot of moves by companies and governments in terms of policies toward environmental sustainability. Is that a big problem on the horizon, or something we're evolving toward? Should it be a primary concern for any member of SNIA, and if so, who should they be talking to within their own environments, and how would you advise them to do that?

Speaker 3

Yeah, so it's an addressable concern, but it's certainly prioritized in the industry. You mentioned 2030 and net-zero carbon, and it's quite ironic, I suppose, that sustainability has been getting more and more visibility over the last few years, and now AI comes in as perhaps the most challenging use case for it. I've had the opportunity to speak on panels on this topic. There's a carrot-and-stick situation going on, where governments are very adamant that data centers need to cool down their power consumption. Cooling down is partly a hardware deployment: you can use immersion cooling and other technologies to improve the efficiency of the infrastructure. But what they're really saying is that if you're not complying with certain sustainability standards in time, you'll actually be penalized.

Speaker 3

So that's the stick. The carrot side is embracing more optimal architectures that best support artificial intelligence, and that's where SNIA, through our member companies, is leading the way. We have a very open, safe collaboration environment for existing and new members who want to address these very important topics and, through reference architectures, discussions, or specifications, come up with innovations that maintain performance while providing sustainability. I would say that's the carrot.

Speaker 2

So in terms of that, would you say that AI adoption in the IT sector has workforce implications? Are new roles and responsibilities emerging, or do they have to be created and fought for? I'm just wondering what you see happening in the industry at the moment.

Speaker 3

Yeah, there are two things going on that I can think of. One is the overall surge and scale-out of data centers. Right now, from a macro market standpoint, budgets are still under caution, but there's certainly a strong forcing function in AI deployments saying it's time to scale out these data centers, with the companies supplying GPUs and TPUs, and the hyperscalers supporting their own accelerator technologies; that work is ongoing. So from a human resource standpoint, having expertise at both the hardware and the software level is a new, evolving requirement.

Speaker 2

So what resources are available for IT professionals looking to stay updated and get more au fait with AI adoption in their industry as it grows? It's a trend, but it's also an evolving technology in itself, so people will be implementing it while also staying on top of it to stay relevant. How do you stay updated? How do you connect? Obviously, being a member of SNIA is important, but what would you recommend they look at in terms of upcoming technologies or standards?

Speaker 3

Yeah. So IT managers at data centers are, rightly so, quite conservative in deploying new technologies. They'll put them in a corner and trial them, but they are new technologies, so the managers need training and a better understanding. This comes down to the partnership and collaboration between the companies providing these new technologies and the IT manager within the data center. So in that more advisory sales process, whether it's a GPU company, an FPGA company, or any card-based accelerator company, I would suggest the onus is on that company to provide the latest trends and application-delivered performance of the hardware they're proposing, so the IT manager can best utilize their very limited resources.

Speaker 3

And then, from the SNIA perspective, because there is that community sharing of these latest technologies, I would certainly recommend that IT managers embrace SNIA as a place to learn. In fact, one of SNIA's principal charters is to educate across the hardware infrastructure, the application developers, and the end users, who in this case can be represented by the IT managers. The other thing we did at SNIA just recently is rebrand the organization: we positioned SNIA as the experts on data and put ourselves at the hub, because we have a collective community of resources from different parts of the industry, from software to hardware, and we're able to provide a common view across all these technology options, whereas other communities may focus on just interface technology or just storage technology. At SNIA, we've consciously identified our organization as a hub and a very useful, relevant resource for both existing and new members to join us and learn about these challenges in AI, sustainability, and other compelling applications and challenges in the industry today.

Speaker 2

So that leads me to the question I wanted to ask: from your perspective, AI is exciting, so, to bring a little of that excitement in today, what do you see evolving over the next few years when it comes to AI and IT environments? What's your prediction, or what do you see as a trend for the overall industry?

Sustainability and Performance in Data Centers

Speaker 3

Sure. So our theme in this talk has been sustainability and performance married together, and I think that goes hand in hand with how these architectures are deployed. Today we're looking at centralized data center deployments, both in the cloud and now on-prem, but as we move into autonomous driving and other AI-driven applications that lend themselves to more distributed architectures, that's going to be a very significant trend.

Speaker 3

So while large language models certainly have their role, and they're out and deployed today, I think you'll see smaller-footprint language models that are very well tuned to the applications they support, and that provides some liberty in the selection of compute, memory, storage, and networking resources for those distributed applications. Think of it from a field perspective, whether it's doctors' offices, intelligent traffic management systems, fleet dispatch, or genomic sequencing, where you're capturing data in real time. The suggestion here is that you don't have to send all that data to the cloud. We can actually deploy data-centric resources, computational storage and intelligent, heterogeneous compute, where the data resides, in a distributed architecture versus a more centralized approach. So I do see that as an evolving demand and trend.
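The "don't send all the data to the cloud" pattern described above can be sketched in a few lines: summarize near where the data is captured and ship only the small result upstream. The function name, readings, and threshold below are hypothetical illustrations, not any specific product's API.

```python
# Hypothetical sketch: reducing data at the edge so only a small summary
# travels to the cloud instead of the raw stream. Names, sample values,
# and the threshold are illustrative assumptions.

def summarize_at_edge(readings, threshold):
    """Compute a local summary and flag anomalies; raw data stays on-site."""
    anomalies = [r for r in readings if r > threshold]
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings) if readings else 0.0,
        "anomaly_count": len(anomalies),
    }
    # Only this small dict would be transmitted upstream; the raw readings
    # remain on the local, data-centric resource (e.g., computational storage).
    return summary

# e.g., real-time traffic-sensor samples captured in the field
readings = [0.9, 1.1, 5.7, 1.0, 6.2]
print(summarize_at_edge(readings, threshold=5.0))
```

The design choice is the point: the bandwidth, power, and privacy cost of moving raw data is traded for a small amount of compute placed where the data resides.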

Speaker 2

Excellent. And from that perspective, is there anything, even from a SNIA membership point of view, that people could focus on to prepare for that? Even for someone who's just joined as a member and is stepping into the AI realm in the IT sector, especially project managers or people stepping into new projects, what should they be watching for, or what's the biggest thing they should focus on in order to be successful?

Speaker 3

So for SNIA members, new members, and those who are just considering SNIA and want to learn more about what it does: I think we would each gravitate to our own area of expertise as a default, and we have many, many technical working groups and special interest groups across SNIA that can accommodate each person's expertise. But, especially with a new-members and all-members meeting coming up in January, I would suggest you look at the buffet of what SNIA offers. Honestly, I only onboarded onto sustainability about a year ago, maybe less, and that was an opportunity I wasn't aware SNIA was supporting; and artificial intelligence is being embraced across many of the SNIA member groups from different perspectives. So I would say, go in with a very open mind and appreciate that, because we focus on education as well as delivering actual specifications and reference architectures, it's an opportunity to expand your knowledge base and also contribute in your own area of expertise.

Speaker 2

That's excellent advice and something to look forward to in January. I'd like to thank you, David, for that. Is there anything else that you would call out before we conclude this episode or any last points?

Speaker 3

I would say that with AI, everybody, as I mentioned at the beginning, jumped on the frenzy, and maybe now we're getting into a more realistic situation. Okay, we realize that AI is here to stay, so how does it apply to the end customers that we all serve? It's about really tuning into end-customer requirements, whether that's security, sustainability, or certainly performance-optimized infrastructures that best support cost; all of these things matter not only to the data center IT manager but to the end customers those data centers are serving. So something I always suggest is to focus on the applications and the use cases, and then see how all the wonderful things we've done in technology support the requirements of those use cases and applications, and I think SNIA is a great place to do so. With this refreshed look at embracing the end applications and use cases, it's an opportunity for our current members and new members.

Speaker 2

Absolutely. And David, just before you go, how do people follow you online? Do you have a blog, or your own podcast or website?

Speaker 3

So you can find me on LinkedIn. I don't have my own podcast, necessarily, but one thing I have started, since I attend and speak at many, many shows through the years, is trying to give a synopsis of each. In fact, I was at SC23, and SNIA had a wonderful booth there, sharing a partner pavilion with CXL and many of the other organizations that are very pertinent to our industry today. So I try to give what I call a "did I save you a plane ticket?" two-minute summary of what happened at the event. Just check out my LinkedIn posts there. And certainly SNIA continues to deliver webinars on an ongoing schedule on these sorts of topics, so I would check the SNIA calendar, because there's so much content coming out of our organization that can better serve and educate our communities.

Speaker 2

Super, David. I'd like to thank you very much for taking part in this interview. It's been exciting. I hope to have you on again, because this is very innovative and also foundational technology, so I'm sure there will be developments over time that you can bring to us. I really appreciate your time.

Speaker 3

Same here, Mark. Thank you very much.

Speaker 2

Take it easy.

Speaker 3

Okay, you too Bye-bye.

Speaker 1

Thank you for listening. For additional information on the material presented in this podcast, be sure to check out our educational library at snia.org/library.