AI Proving Ground Podcast: Exploring Artificial Intelligence & Enterprise AI with World Wide Technology

AI Doesn’t Break Where You Think: How Storage and Data Readiness Now Decide AI Success

World Wide Technology: Artificial Intelligence Experts Season 1 Episode 76


AI is working. Until it isn’t.

Not because the model failed, but because something behind it did.

In this episode, recorded live at NVIDIA GTC, WWT's Mike Trojecki and Pure Storage's Kaycee Lai talk through what starts to show up once AI moves into real use. Data that's harder to access than expected. Systems that slow things down. Security and governance questions that weren't part of the original plan.

It’s a different kind of conversation than most. Less about what AI can do, more about what it takes to actually run it inside a business.

There’s a shift happening underneath it all. AI is starting to behave like a core application, which means it inherits the same expectations around reliability, performance and control.

And that’s usually where things get complicated.

More about this week's guests:

Mike Trojecki brings 25+ years of experience across security, cloud, and AI, starting with his service in the U.S. Air Force supporting White House and Air Force One missions. He’s led emerging tech practices at top integrators and now leads the AI Practice at World Wide Technology, driving high-performance AI infrastructure and data strategies.

Kaycee Lai is a data and AI leader with 20+ years of experience driving innovation and enterprise transformation. At Pure Storage, he leads Enterprise AI and Analytics strategy. As founder and former CEO of Promethium, he pioneered Data Fabric, helping organizations unlock more accessible, usable data.

The AI Proving Ground Podcast leverages the deep AI technical and business expertise from within World Wide Technology's one-of-a-kind AI Proving Ground, which provides unrivaled access to the world's leading AI technologies. This unique lab environment accelerates your ability to learn about, test, train and implement AI solutions. 

Learn more about WWT's AI Proving Ground.

The AI Proving Ground is a composable lab environment that features the latest high-performance infrastructure and reference architectures from the world's leading AI companies, such as NVIDIA, Cisco, Dell, F5, AMD, Intel and others.

Developed within our Advanced Technology Center (ATC), this one-of-a-kind lab environment empowers IT teams to evaluate and test AI infrastructure, software and solutions for efficacy, scalability and flexibility — all under one roof. The AI Proving Ground provides visibility into data flows across the entire development pipeline, enabling more informed decision-making while safeguarding production environments. 

What GTC Didn’t Say Out Loud

SPEAKER_00

From World Wide Technology, this is the AI Proving Ground Podcast. Today's show was recorded live on the show floor at NVIDIA GTC, where we caught up with WWT VP of AI Mike Trojecki and Pure Storage VP of AI Kaycee Lai about what enterprise leaders need to do now: align agentic AI ambition with secure data access, storage performance, infrastructure readiness, and governance that can actually hold up in production. Mike and Kaycee are exactly the kind of voices we want in this conversation. Leaders who understand what happens when AI moves from concept to production and the real questions shift to data, infrastructure, scale, and trust. It's a conversation about the real bottleneck in enterprise AI and what it takes to break through. So let's jump in. Okay, well, Mike, Kaycee, welcome to the AI Proving Ground podcast here at NVIDIA GTC. How are the two of you doing? Fantastic. Thanks for having us. Doing awesome. A little tired, a lot of meetings, but still great. Oh, yeah, it's meeting central for sure. Well, I mean, GTC is about as good a time as any to take inventory, based on the conversations we're hearing from the keynote or just here on the floor: lots about AI factories, lots about inference, agents, infrastructure. The list goes on. Kaycee, we'll start with you. What are you seeing and hearing right now? What has caught your attention, and maybe, what is being misrepresented or under-focused?

SPEAKER_02

Yeah, good question. So I think the cool part is we've all gotten past the initial shock of what AI can do. Yeah. And so now everyone is really focused on putting it into production, right? And so that's a big theme: how do you actually get this into production? And that's actually where I think a lot of the underrepresentation, I should say, or under-focus comes in. Because once you start dealing with production, especially in the enterprise, things like security, access control, stability, and low latency start coming into play. And I think this is where enterprise storage, which initially didn't get a lot of focus, is now actually getting a lot of focus, because AI is no longer an afterthought. If you're going to put it into production, you have to treat it the same way you do your tier one applications, right? So I think this is where you're starting to see a lot more conversations now. I just came from a meeting with the NVIDIA folks all around CMX and STX. That's really NVIDIA's recognition that, hey, we need to make sure that the storage piece is fully integrated and it's not an afterthought.

SPEAKER_00

Yeah. I mean, if it's no longer an afterthought, Mike, what do leaders need to take away from the signal here at GTC to put into action today or, you know, next week?

SPEAKER_01

Yeah, and I'm still amazed by what all this stuff is doing. Our jobs change every three weeks. Two, two. I don't know about you, but it's two for me. Well, people are still doing the storage thing, so it's not three weeks. So look, really, the thing that I'm finding here, especially going back to the storage piece, is that it's now becoming a data conversation instead of a storage conversation. And to me, that's going to accelerate everything that we do in AI. There are two critical components. I mean, everybody knows the GPUs and the processing piece, but the storage piece and the security piece are the two critical components, and I'm seeing a lot of focus here. So if I was a leader coming out of GTC, I'd think: how do I make sure that whatever I'm deploying from an AI standpoint is secure? And how do I actually make sure that the data is in the right condition and doing what you need it to do? That, to me, is what people should be thinking about.

unknown

Yeah.

AI Is Now About Outcomes

SPEAKER_00

Kaycee, build on what Mike said there. You know, he's talking about how it's no longer just about storage. It's about where the data is, how we have access to it. Maybe just a little bit about why that's important for the enterprise.

SPEAKER_02

It's important because, I'll take what Mike said and go one step further. I think with AI, it's become so easy and so fast to build that the conversation has now shifted more toward outcomes. Yeah. More about business outcomes, right? What you can do. So it's gone from storage to data to now outcomes. And that's actually the right way to think about it, right? Because we don't do AI because it's fun. We're doing it because we want to drive some change and drive some outcomes. So when you start thinking about outcomes, what do you start thinking about? You start thinking about predictability. Yeah. You start thinking about, hey, is this being used by the right person? You start thinking about customer experience. And this is where the storage piece actually becomes super critical. For example, while anyone can deploy ChatGPT in some application, it's actually pretty hard if you say, I want all 300,000 users to be able to concurrently do prompts in production. That's actually not as easy as you think, right? All the security access controls, and then just the latency you have to deal with. And that's where considerations around storage for access actually become a pretty big deal, right? People weren't really thinking about that. Today, as they're moving into production, they're getting hit hard with a lot of failed copilot projects. Right. Now people are really thinking about it.

SPEAKER_00

Yeah. I mean, Mike, you're having conversations all the time, many of which are happening here at GTC, but you're out meeting with clients and organizations of all kinds. Is the message Kaycee's talking about resonating with them, or are they still playing a little bit of catch-up?

SPEAKER_01

I don't think they're playing catch-up. I think they understand organizationally that they need to focus on outcomes too. And it's not necessarily about the technology itself. So for all of these organizations, whether it's in manufacturing or in retail, you've got to actually think about how AI is going to impact the business. And I don't want to say the technology is secondary, because the technology is extremely important, but focus on what you want to get out of it. What do you want it to do? How do you want it to impact the organization? Is it going to be scalable for you? Is it going to be reliable? Is it manageable in your environment? If you focus on those things, then exactly as Kaycee said, the outcome piece is the conversation. And we've been talking about outcomes in this industry for 20 years. I think now is really the time where it's top of mind for everybody, because AI is not a technology. AI is a revolution of things that we're doing right now, and there's a number of technologies supporting it.

SPEAKER_00

Yeah. Kaycee, I mean, in your experience, are organizations over-pivoting on the compute and infrastructure side and not paying as much attention to storage, or is that starting to even itself out? And what is the right balance?

Most Infrastructure Isn’t Ready

SPEAKER_02

Yeah. So I think in the beginning, it's very easy because, you know, a big consulting company might talk to the CEO, the chairman: you need AI. Yeah. And they come back to their team: we need AI, right? So who do we talk to? Well, let's go talk to the NVIDIA guys. And once you start there, it's, oh, well, who's my preferred server vendor? And so the conversation really starts all around compute, right? And GPUs. That's where the focus really is. And quite honestly, the reason why they don't think about the infrastructure layer, the networking and storage, is that most organizations actually don't know what AI use cases they want. That's from an enterprise perspective. I'm not talking about AI labs, I'm not talking about the new neoclouds, but we talk to a lot of enterprise customers and we say, okay, what are the outcomes you want to drive? They actually have to pause a second and say, that's a good question. We know we need AI. And so, back to our point, if you don't know what outcomes you're actually going to drive, you're not thinking about things like, gosh, I have 56,000 employees. If they all start doing prompt engineering and vibe coding and building their apps at the same time, accessing production data, can I actually sustain that? Is my infrastructure actually ready for AI? So a lot of organizations think they're AI ready, but they don't know that their infrastructure is actually not AI ready. And with the speed and scale at which AI shines a light on any cracks you have, it gets exposed really, really fast. So I think we've gotten past that part. And that's why, in the meeting I just came from with the NVIDIA team, I was really happy to hear that even NVIDIA recognizes this. They said, look, storage and infrastructure cannot be an afterthought anymore. It's not good enough to say, here's your reference architecture, choose your own adventure, go ahead.
They've heard enough from customers: great copilot, great experiment, but the minute we go into production, it's gotta be solid, right? So now's the time. It's a great signal from NVIDIA saying, no, no, no, this has to go out together. We can't do it unless we go together, right, with our tier one storage partners. So I think this is where people are now getting to that point. Now, there are always different AI maturity curves. I'm not saying everyone is there; some folks are still just trying out ChatGPT. Sure. And until you get to that point where you've gone from experiment and you're ready to go to production, you're not gonna see these pain points. But the good news is enough customers have done that. They know they need to make a change.

SPEAKER_01

And I just want to ask you a question. So you mentioned looking at the enterprise side of things. For the last couple of years it's been, the enterprise is ready for AI, the enterprise is ready for AI. It actually seems like that's now the case. Yes. And as we build these architectures and we look at AIDP and things like Data Stream, I mean, the enterprise is poised for success. Big time. And going back to what you were saying about use cases and outcomes, if we collectively can help them figure those out, we can help enterprises advance what they're trying to do, get a competitive edge, reduce costs, right? Whatever they're trying to do.

SPEAKER_02

Absolutely. So one of the insights that I had talking to a lot of customers is that they've been struggling with thinking that AI must be something new, that they have to come up with something new. And the aha moment for them was when I said, you already have existing business challenges, right? You're a retail company, right? You deal with e-commerce, you deal with retail, you deal with supply chain, you deal with demand forecasting. And they're like, yeah, I do. Well, what if AI can make that faster, more accurate, easier, lower cost? So it's just a different way of thinking about it. They're all thinking there must be some radical new use case. A lot of times there isn't, but AI can make what you're doing a heck of a lot better, a heck of a lot faster. And this is where I think things like AIDP and Data Stream are actually well poised, because you don't have to start with the massive, massive footprint purchase that the neoclouds and sovereign clouds actually make, right? You just start with a small single appliance and prove out the business case. It has all the NVIDIA models, it has the GPUs, a fully integrated appliance. Great for an enterprise, and it's easy. It's all about easy: time to value, prove out the business case, and then grow from there. So I think the technology, the form factor, the AI maturity is now starting to catch up. And the minute enterprises realize, oh my gosh, I already have the existing use cases, I think that's where we're gonna see it really start to cook.

SPEAKER_01

Yeah. One of the things, you know, we are WWT, so I'll give ourselves a pitch. We're using our AI Proving Ground to let people go in, test out, and prove out those use cases without having to actually buy anything up front. So you look at that, and I love that, depending on the organization, you've got the option to start small with that appliance. You also have the ability to go put that in the AIPG and do it there.

SPEAKER_02

Absolutely. I think that's a fantastic thing for customers, and we always recommend our customers check that out. Look, AI should be about removing friction, right? AI should be about, how do I see the impact as quickly as possible? So the more partners like WWT, together with Pure, can actually do that, that's just fantastic. It helps the adoption for everybody.

SPEAKER_01

Yeah. So last night I was asked, you know, we won a couple of NVIDIA awards, and I was asked, why do you keep winning the same awards year after year? And, you know, our CEO does a lot of podcasts with Brian, and he told us we were going to be an AI-first company, and we live it. So those use cases that you're talking about, those internal use cases, we weren't trying to solve for anything new. We were trying to fix problems that exist in any organization. You know, when you look at supply chain stuff, and you look at just communication internally, how we interact with each other. That, to me, is what we need to get our joint customers thinking about. And exactly what you said, I'm gonna steal that from you, by the way. It's CASE: copy and steal everything. So yeah, I'm gonna steal that. That couldn't be more on point.

SPEAKER_02

Yeah, and I think, you know, working with WWT, it's really our job to go educate and evangelize, because again, it's not readily obvious, but once you hear it, once you see it, it becomes readily obvious. For example, every company has a sales team, right? Every company has a sales force. What does the sales force have to do? They have to find prospects. What is it that salespeople hate doing? Finding prospects, figuring out why their product is valuable to the prospect's pain point, being able to justify that, and then creating collateral. That's an AI use case. This is something you can actually solve with AI and streamline in minutes, right? And you can standardize it. And now, from a management perspective, you actually have much better visibility than the old traditional CRM. So even that in itself is a very simple use case, but it's practical for just about every organization.

SPEAKER_00

Yeah, I mean, if you're talking about tackling use cases that are current business challenges, do you think that makes it easier to know where the data should be, or what data you're looking for? And therefore you can start to implement a more mature data strategy?

SPEAKER_02

Absolutely, because those use cases already have applications that exist. Those applications are already tied to their existing data. They already have policies that exist. So the great news is you've actually done some of that work already, right? In terms of access control, in terms of what the policies are: retention policies, protection policies. So instead of starting from scratch, which I actually think is a bad idea, because a lot of organizations put in infrastructure for AI that's not enterprise ready. They try to jam something in there, and then the minute they try to go to production, IT is burdened with this extra silo project they have to deal with, shadow IT. Why do that? If AI is supposed to be core, part of your business, make it core. Apply the same tier one governance, apply the same tier one data management services to it. So this is where I think you should focus on the existing use cases and applications you have and just make them better. Let's start there.

SPEAKER_00

Instant ROI, very easy to justify. Yeah, it's building momentum. I mean, at what point then do you feel comfortable enough to take a big swing or two? At what point do you start to see data maturity? How do you start to assess that? How would we advise a client to build on successful projects?

SPEAKER_01

You keep that flywheel going. Yeah. That's absolutely it. And we keep coming back to the use case and the outcome piece. And to keep that flywheel going, these guys at Pure have created an app that basically allows you, per industry, based on who you're talking to, to have that customer identify the use cases that are most common in that industry. And again, not something that has to be revolutionary, but it's a use case that gets mapped out. You gave us a demo of that yesterday. It's phenomenal. It takes some of the pressure off of an organization trying to figure out where to start. Exactly. And so that's it.

SPEAKER_02

That's a very basic thing. It's not revolutionary from an AI perspective, but the impact that it has. I showed it to our sales team at our sales kickoff, and literally overnight I had hundreds of people already on it. I gotta give a plug. The tool is called Abby. My wife is Abigail. It stands for AI Blueprints Built for You, a little play on words, but you know, gotta give props to the wife who puts up with my crazy schedule, right?

SPEAKER_00

So it's an acronym, but that's not what you say at home. You say we named it after you.

SPEAKER_02

I try to say whatever I can to get myself out of trouble, man.

SPEAKER_00

Well, I mean, earlier we talked about how the nature of storage is shifting from just location to more of a platform. What are you seeing here at GTC that's giving you confidence that it's moving in a direction we feel good about?

SPEAKER_02

Yeah, so look at the evolution, right? NVIDIA always started out with their certifications, right? SuperPOD, BasePOD, DGX. And for the most part, those were all reference architectures and designs. With CMX and also with STX, it's really locked down. I mean, NVIDIA is saying, no, no, no, this can't be this choose-your-own-adventure thing anymore. And the reason why, we talk to their executives, they've heard from their customers: this was great, but then there's the complexity of actually implementing this in production. And it always comes down to the storage layer. It always comes down to the data layer. Customers can't get access to it, customers don't have security, it's not fast enough, it's crapping out in production. If you can't feed from the storage into the GPUs quickly enough, you're gonna get an awful experience. If you can't process the data quickly enough, you're gonna get an awful experience. If you can't do it in a way that's governed, no one can use it in production. So what I'm seeing now, this push towards STX, is actually a great, great signal, right? A public signal from NVIDIA saying, look, we recognize this, and guys, we all need to work very, very closely and collaboratively, because we're not gonna do this piecemeal thing anymore, because we want success, right? The whole theme at NVIDIA GTC is putting AI into production, right? Yeah, it's not about the cool ideas anymore. The cool ideas don't matter if no one actually uses them and gets the success. And you can't take 18 months to get to success. That's the thing about AI, and I'm sure you've seen it, Mike: people want fast success, right? And you can actually achieve it. So get the use cases, but make our customers win quickly.

SPEAKER_01

And enabling the data, but also enabling the developers and your citizen developers within the organization. Give them the latitude they need to experiment. Yes, give them a playground to go develop, try things, and come up with these use cases. Again, it doesn't need to be the absolute coolest, hey, I've built something new, but give them the ability to ask, how would you actually change what you do in your job every day by using AI? Yeah. And that's a question I'm seeing here. A lot of the conversation is around, how would I do this differently? How would I do my job differently with AI? And like you were saying, now that the architectures are pretty rigid from an architecture standpoint, we know it's going to work. So there isn't going to be this situation of, all right, we want to do this, but on the infrastructure side, we're not sure what's going to happen. We're pretty sure we know what's going to happen at this point. Yeah. Yeah.

SPEAKER_02

You saw the agentic guardrails, right? Jensen talks about that, and NeMo Guardrails. A big part of that is actually privacy, security, and governance. So, I mean, that's a big, big nod that, look, this is not about experimentation. We're past that. We know we can experiment, we know we can iterate quickly. It doesn't matter if that thing never makes it into production, right? I actually know customers who spend tens of millions of dollars a year on AI and can't use any of it in production, right? That's a fun experiment. It's an expensive experiment, right? It's for innovation, but at some point we gotta stop experimenting. And it's always these things that hold people up in production. So I'm excited to see NVIDIA pushing guardrails, excited to see the STX thing. All of that signals we're ready for a change.

Speed Is the New Bottleneck

SPEAKER_00

Yeah, I mean, whether it's agentic guardrails, NeMo Guardrails, or, you know, these coding assistants that are coming out, how does that change the equation from a storage perspective, if at all? Or is it just, hey, remember these basics we've been talking about?

SPEAKER_02

Yeah, so remember my example, right? One person doing a prompt, not a big deal. A hundred thousand people doing a prompt, pretty big deal. So in an AI context, memory matters, right? If 100,000 people ask the same question over and over, you don't want the AI to keep recalculating and regenerating that answer. That's a lot of tokens you're gonna be spending and burning up. So you actually need a persistent KV cache, right? A long, consistent KV cache. That's where the storage comes in, right? Because you can't store all of that in the GPU forever. At some point, it's got to flush down. But when it flushes down, you need some persistent layer. That's where we're seeing the storage layer, the KV cache innovation, coming in. That's where some of our product lines guarantee that ultra, ultra low latency. That's the difference for a chief AI officer or chief data officer between failure and production. That's it, right? Anyone can succeed with 10 or 20 people. But to get the entire finance department, the entire sales department, the entire marketing department using it concurrently, which is really the goal of AI, right? You want that collaboration to go. So this is just one very specific example of why storage actually matters.

SPEAKER_00

I mean, to get to where he's talking about, what other architectural shifts need to take place with the customers that we talk to?

SPEAKER_01

On the infrastructure side, the networking, getting to time to first token. I mean, that's gonna play a big piece in all of this as well. So when you look at it, you've got to feed that GPU. And every time you're missing a bit or have to restart a training process, you're not only using double or triple the tokens, you're also using power, cooling, whatever. And the costs there are rising exponentially as well. So I think that's one of the other shifts we've seen. And that's part of that new paradigm, that new architecture: the infrastructure, security, storage, data, and obviously the GPU side of things.

Agents Are Becoming Employees

SPEAKER_00

Yeah. I mean, so much is happening all the time. You're talking about your jobs changing every two weeks. I'm sure it's actually shorter than that. So you have to keep your eye on the ball all the time, but I'm sure you're also keeping your eye on the future. Where do you expect this conversation to progress over the course of 2026, or if we were at this table in 2027?

SPEAKER_02

Yeah, I mean, I think we're starting to see it now, right? We're seeing our jobs changing. And when I say us, I mean humans, yeah, right. But we're at the point where agentic is getting so darn good, right? Code can write itself, right? Applications can build themselves. Virtual employees can clone themselves and build virtual employees. So I think how we think about traditionally working together with human beings is something we're taking for granted right now. I do see that happening faster and faster. Like, I actually have a few virtual employees that I've built to simplify basic tasks. But as the AI has gotten so much better, like in application development, it's not that far for me to foresee I've got a bunch of virtual employees who are app developers, SDRs, right? Even go-to-market folks. So how does that change how we deal with security? There's no employment contract, right? I can't fire them. I can't dock their pay, right? Well, maybe their tokens.

SPEAKER_01

Yeah, how does that change? If you try and decommission them, they'll fight back. They'll fight back. That's when Skynet starts.

SPEAKER_02

Yes, they'll have a revolution, they'll get all the other bots to go against me, right? No, but in all seriousness, think about the scale that happens, right? Before, you gotta open up a req, you gotta go get the budget, you gotta go hire someone, you gotta train them. Imagine bringing on 30 virtual employees at the same time. How does that change HR policies? How does that change what we control? What if they all create their own LinkedIn profiles? I mean, how do you control how they interact with people? I think that's actually not that far away, right? I'm actually seeing this, and it may not be 2026, but mark my words, that's not that far away. And then how do you control what they access internally?

SPEAKER_01

Yeah. And then what do you control when they become a physical device, right? So we're talking from a robotics standpoint. Getting that information and getting those agents into a physical device to actually go physically do something, yeah. I'm looking at that in the next couple of years and saying that's kind of the next wave: taking the agentic model and moving it into the physical world.

SPEAKER_02

That's when Skynet builds the T-800, yeah?

If You’re Not Learning, You’re Behind

SPEAKER_00

I mean, it can go in so many different directions, right? Yeah, obviously, worst case scenario, Skynet. But maybe the better question is, and we can end on this because we're running short on time, and Mike, we can start with you; Kaycee, you can close us out. Maybe the better question is, what can we do now so that we're prepared to handle whatever comes next? In other words, what are a couple of action items we can take away from this conversation, or GTC in general, to best position the enterprise for success?

SPEAKER_01

Learn, learn, learn. Experiment, experiment, experiment. That has to be the case. If you are not familiar with AI and you are not using AI, you're gonna lose the battle. You are potentially not going to be the person who is picked to do a certain job. You've got to get to that point, because virtual employees are real. They are real now. So continue to learn, learn, learn, and experiment. Yeah.

SPEAKER_02

Look, I think it's going to permeate from professional life into personal life, right? So not only should you learn and experiment at work, you should be using AI in your personal life at home. And there's really no reason, if you're living in America, there's no reason why you can't. We have access to fast, cheap internet, right? There are so many things we can do. We have robotaxis everywhere, right? Everyone's using ChatGPT. I really believe, and I can't underscore enough what Mike is saying: you have to get yourself to that mindset, because AI does change your mindset. The minute you start using AI, the way I work is very different from the way I used to work: where I spend my time, how I process things. There are certain things I don't think about myself anymore, because I know I can use AI to do that. I focus my mind on something else. That's going to permeate into your personal life. And the reality is, you're probably already interacting with an AI agent in your personal life, whether you know it or not. When you call a bank or you call someone, and again, once robots start coming into play, you're interacting with AI there too. So if you don't interact with AI in your personal life and your professional life, I think you will be falling behind big time.

The Real Takeaway

SPEAKER_00

Yeah, yeah. Well, Kaycee, Mike, thank you for joining us here at GTC. I know your time is absolutely scarce, so I very much appreciate the partnership and you being on today. Thanks again. We'll see you hopefully next year. All right, thank you guys. Thanks, Brian. Thanks, Kaycee. Okay, thanks to Mike and Kaycee for joining us today. The takeaway: AI strategy moves fast, but outcomes only scale when the foundation is built to hold them. Production AI is not a model problem alone; it's an execution problem across data, storage, security, and scale. And your AI ambition means very little without operational readiness across each of those areas. This episode of the AI Proving Ground Podcast was co-produced by Nas Baker, Kara Kuhn, Sarah Chiadini, and Addison Ingler. Our audio and video engineer is John Knoblock. My name is Brian Felt. Thanks for listening. See you next time.
