AI Proving Ground Podcast: Exploring Artificial Intelligence & Enterprise AI with World Wide Technology

NVIDIA Leaders on Building AI Factories and Driving AI Innovation

World Wide Technology: Artificial Intelligence Experts Season 1 Episode 49

Artificial intelligence is rewriting the rules of the global economy. Powered by leaps in accelerated computing, digital twins and autonomous systems, a new industrial era is emerging in real time. In this episode of the AI Proving Ground Podcast, NVIDIA leaders Jay Puri, Charlie Wuischpard and Craig Weinstein break down how the entire AI stack — from the silicon driving breakthroughs to the software shaping intelligent experiences — is redefining how we build, operate and compete. This conversation gives leaders a clear view of what's coming next — and what it takes to align their organizations, scale AI innovation, and stay ahead in a race where velocity, architecture, and ambition now determine advantage.

Editor's Note: This special episode of the AI Proving Ground Podcast was recorded during WWT's Business Innovation Summit, which took place at the PGA TOUR's World Wide Technology Championship in November 2025. 

The AI Proving Ground Podcast leverages the deep AI technical and business expertise from within World Wide Technology's one-of-a-kind AI Proving Ground, which provides unrivaled access to the world's leading AI technologies. This unique lab environment accelerates your ability to learn about, test, train and implement AI solutions.

Learn more about WWT's AI Proving Ground.

The AI Proving Ground is a composable lab environment that features the latest high-performance infrastructure and reference architectures from the world's leading AI companies, such as NVIDIA, Cisco, Dell, F5, AMD, Intel and others.

Developed within our Advanced Technology Center (ATC), this one-of-a-kind lab environment empowers IT teams to evaluate and test AI infrastructure, software and solutions for efficacy, scalability and flexibility — all under one roof. The AI Proving Ground provides visibility into data flows across the entire development pipeline, enabling more informed decision-making while safeguarding production environments.

SPEAKER_00:

From World Wide Technology, this is the AI Proving Ground Podcast. The race to build AI factories is on. And if you listen closely to the people building this future, a pattern emerges. These aren't factories in the old sense; they're national assets. So today, we're sharing an executive-level conversation between WWT Executive Vice President Matt Horner and a trio of executives from NVIDIA, Jay Puri, Charlie Wuischpard, and Craig Weinstein, who make it clear that decisions about where data lives, who trains the models, and how quickly you can move from idea to first train aren't just IT conversations. They're economic strategy and a matter of competitive survival. This conversation took place just a few weeks ago at WWT's Business Innovation Summit, which curated some of the most engaging and influential minds in business and technology today, and equipped leaders with the clarity, conviction, and playbook to compete and win in an AI-defined future. This discussion is an opportunity to learn what the people at the center of this revolution see coming next: the geopolitical tensions, the exploding demand for compute and energy, the rise of sovereign AI, and the new architecture required to build and run AI at scale. It's a window into how nations, industries, and enterprises will be shaped in the decade ahead. So let's jump in.

SPEAKER_03:

Yeah, Jay.

unknown:

Yes.

SPEAKER_03:

There is a little murmur across the property here because this is back-to-back for Jay. So we're gonna really keep an eye on him next year. But great to have you back. Thank you. And really appreciate you being here. Craig, also appreciate you being here, and it's great to work with you from a partnership perspective. And Charlie, you know, we've had a lot of time together as we grow our business. Just an interesting fun fact from a couple different angles. In 2024, WWT was about a billion-dollar partner with NVIDIA. In 2025, we'll be over a $4 billion partner with NVIDIA. In some circles, you'd say impressive. Jim says it's only 2.5% of their market share, so we've got more work to do. But we are doing a lot with NVIDIA, and we're excited to have them as a partner. Last year at this time, we did have Jensen as our guest at the Business Innovation Summit. And really, over the last year, I'm sure many of us have seen him with presidents and world leaders and innovators and heads of state and heads of businesses. And I'm just curious: how often does he reflect on the WWT championship and Cabo being the springboard for his success and visibility around the globe? You talk about that a lot.

SPEAKER_01:

Absolutely. I think that was the crucial differentiator in our success.

SPEAKER_03:

Yeah. No, it must be talked about a lot. And, you know, we humbly and gratefully appreciate that he's been able to do that over the last year.

SPEAKER_04:

Well, Matt, I just want to apologize. I was on his foursome, and I think they would have won if it wasn't for me, because they had Luke Donald.

unknown:

So yeah.

SPEAKER_03:

You broke a couple clubs, maybe accidentally, not intentionally.

SPEAKER_04:

Sorry about that. Yeah.

SPEAKER_01:

By the way, winning was not easy. Did you notice I was limping as I came up here? I did notice. And so we had to give it our all. Remember when Tiger Woods played through? Yeah, you left it all on the field. Yeah, absolutely.

SPEAKER_03:

I love it. I love it. Well, in all seriousness, you think about your organization, and, you know, we had named this the $4 trillion engine driving the future of industry. Last week it eclipsed five trillion and has come back a bit; it's only at 4.6 trillion in market cap now. But the way in which NVIDIA is getting visibility and attention, how does that impact your day-in and day-out jobs? And does the business have any level of unsettling because Jensen is such a public figure now? How do you all need to operate internally? Because again, there's such a draw on NVIDIA externally, but you've got to keep this engine going. So how are the dynamics within NVIDIA?

SPEAKER_01:

Yeah, it's been quite a year since Jensen was here and you kicked it all off. So yeah, very busy. And I would say there are three areas that we've been focused on. One is, of course, as you mentioned, all governments around the world want to understand what their AI strategy should be. And they're very interested in discussing that with us, because at this point they all understand that building AI factories and AI infrastructure, that's essential infrastructure, just like energy or transportation or the internet. And it's going to make a big impact on their local economy, depending not only on how they set up these AI factories for security reasons, sovereignty reasons, and so on, but on setting them up properly so that the whole local ecosystem of startups, the research community, and so on, that sort of economic engine, is happening within the country. It doesn't mean they're not going to partner globally; that's obviously necessary. But that's been a big focus, and we've been working on it. The second area that we've been quite focused on is the whole geopolitics of it all. I mean, you've heard about all the concerns with China and everything else. And all of that is real; this is a very complicated situation. But one thing that we have to remember through all of this is that the US has been the leader in every technology disruption that has happened to date. And it is extremely important that the American platform continues to be the leading platform for AI.

SPEAKER_03:

And speaking of that, Jensen just said this week that he's concerned about the competitiveness with China and how fast they're mobilizing their AI competency and LLMs and so forth. What do we have to do as a country then to keep that leading edge? Because they can mobilize people different than us and maybe more effectively than us based on how they're organized. But at the same time, we need to keep that innovative edge.

SPEAKER_01:

Yeah, so, you know, I think fundamental to having the leading platform is that you have to have the most developers on it, right? Developers basically make the platform. And more than 50% of the AI developers are in China. Right. So we have to make sure that we continue to protect our technology, but at the same time, we have to make sure that our platform is accessible all around the world, not just in China, so that all the innovation is done on it, and that will drive the U.S. advantage in economic growth. So I think that's the most important part. And we don't want to get that mixed up. We have to have the right balance.

SPEAKER_04:

Go ahead, Charlie. Well, no, I was just gonna add, I think you see the administration in the U.S. moving much more quickly lately. I mean, this has already been happening at the state level with public-private partnerships to drive economic development and local venture formation and so forth. You know, I've been in the supercomputing industry for probably 20 years, and there was a 20-year moonshot, if you were in that industry, to build an exaflop supercomputer in a 20-megawatt envelope, and now we're building these gigawatt data factories. If you saw the announcement that the Department of Energy made with NVIDIA and Oracle last week, the investment there is 2,200 times as large as the exascale system that went into Argonne National Labs. And so I think there's going to be an explosion. And it's probably limited by energy, as was discussed earlier. And I think the administration is working on relieving those constraints as well, to allow the flowers to bloom.

SPEAKER_03:

We've used the concept of an AI factory a couple of times, and again, to many people a factory means car manufacturing, a brewery, building products. So what are the ingredients of an AI factory, just to level-set our audience and be on the same page? What's really involved in an AI factory, maybe, Craig?

SPEAKER_02:

Yeah, I mean, I think the simplest framing is electricity in, tokens out. The token is the currency of the global economy of the future. And with that comes an enormous amount of complexity. Building an AI factory isn't for the faint of heart. It starts with energy. You just talked about that. Energy will be the foundation for what we build. Then there is an entire compute layer. This is a new compute architecture; it's fairly new to most enterprise customers, it's new to the world. I think we showed up roughly 10 years ago, Jim, and introduced you to what GPUs were. And you returned the ask by asking why I'd left my old company to go work for what was, at the time, a gaming company. But there is a compute architecture that's incredibly important, and then you layer on top of that all of the software that the ecosystem and the open source community and the world are building. And it's one of the reasons why China is so important. If these developers are building open source AI models, we as a world want to take advantage of that. And that all lives on top of that factory. And then the most important piece: you close with enterprise data. At the end of the day, these enterprises have been sitting on a heritage of enterprise data for decades, and that's what's unleashing all of the opportunity in the enterprise. And that's not for the faint of heart. There's an enormous amount of work to do that, and to do it with skill and credibility, which is why Worldwide has been such an amazing partner for us and all the customers here in the room. That's, at the end of the day, what the AI factory provides.

SPEAKER_03:

So the AI factory can be an entire data center with full-stack applications, a fractional data center, a fabric of multiple data centers; conceptually, the AI factory is whatever's running certain workloads for a business. Is that fair? Absolutely. Yeah, go ahead, Charlie.

SPEAKER_04:

Well, I was gonna say the AI factory concept can exist in the cloud with the hyperscalers, can exist at what we call a neocloud, and obviously in the enterprise. At their heart, these are supercomputers, which usually took months to bring up in the old world. Super complicated, a high level of skill. And I think what we've been able to do here, and what sort of resonates, is create these reference architectures for how to deliver these things, which WWT does, that simplify them and make them easy to bring up. We have this measurement called time to first train. We measure total availability, we measure benchmark performance. The idea is that every one of our OEMs, every one of our partners, can deliver these factories, which really are complicated supercomputers, make them easier to use, and deliver the benefits that Craig was just mentioning.

SPEAKER_03:

So to geek out just one step further: token. Everybody's using the word token. How do we best conceptualize what it means? Why is it important?

SPEAKER_02:

Yeah, I mean, I think about it through the lens of my children: it's words, it's information, it's knowledge. I was sharing last night at dinner, I have three children, including twins, a boy and a girl. And my daughter walked in the other day asking us to increase her level of ChatGPT usage because she was running out of tokens. And my mind was kind of blown. But basically, what that said is she's using it as a copilot. She's using access to tools and information that are helping her learn, and she was reaching a capacity on these words, the output of the knowledge she was trying to learn from ChatGPT. And it showcased that, at the end of the day, we are all human creatures of learning. And I think for this next generation, it's incredibly important that we all keep learning. But the great news is we now have these tools at our fingertips.
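To make the "running out of tokens" idea concrete, here's a toy sketch. The tokenizer below is purely illustrative (real models use learned subword vocabularies such as BPE), but the metering concept is the same: text in and out is counted in tokens, not characters or words.

```python
# Toy illustration of tokenization. Real models use learned subword
# vocabularies (e.g. BPE); this hypothetical scheme just chops each
# lowercased word into pieces of at most 4 characters.

def toy_tokenize(text):
    """Split text into crude subword 'tokens'."""
    tokens = []
    for word in text.lower().split():
        while word:
            tokens.append(word[:4])
            word = word[4:]
    return tokens

prompt = "Tokens are the currency of the AI economy"
tokens = toy_tokenize(prompt)
print(tokens)
print(f"{len(tokens)} tokens for {len(prompt)} characters")
```

A usage quota like the one described is simply a cap on the total token count across prompts and responses; once the sum exceeds the allowance, the service stops answering.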

SPEAKER_03:

So when Jensen says each country should have its own sovereign AI and create its own tokens, that that's really their intellectual property, and that it's really about creating data management that is sovereign to their nation, something they can leverage for the improvement of their citizens or public safety: give us a sense of what Jensen thinks about when he talks about sovereign AI for countries.

SPEAKER_01:

Yeah, so I think the key thing to realize, as we're talking about AI factories, is that this is just like manufacturing, right? You're manufacturing intelligence, and tokens are the currency of that intelligence. And so if you want to have a vibrant economy going forward, you cannot outsource all of your manufacturing of intelligence. At a very fundamental level, that's why we call them factories, and we measure them in all the ways Charlie just described: time to first train, the throughput out of the factory, and so on. Very different from a data center, where you're trying to reduce costs, right? So I think countries are realizing that. And then on top of that, you have this issue of wanting your own data and your culture and your heritage, et cetera, reflected in the models and the intelligence that these factories are producing. That is also extremely important, especially when it comes to building citizen services on top, and the security implications and so forth. Yeah.

SPEAKER_04:

I have an interesting little factoid for you in terms of tokens. I mean, it is everything: tokens in, tokens out. Back in the early days, which was maybe a year ago, we used to think that 50 tokens a second per user was a good thing, because that's how fast we read. But now, with agentic computing and reasoning, it's agents talking to agents, and the throughput has to be not just 50 tokens, but hundreds, thousands. And so we kind of live on the roadmap in terms of driving token performance through the roof and driving concurrency: how many concurrent users can you have at any one time? How many tokens per second per user? Those metrics drive our whole roadmap at a full platform level. And it's fantastic. And it's going to continue to explode, from everything we've seen.
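The arithmetic behind those metrics is simple: aggregate factory throughput is tokens per second per user times concurrent users. A quick sketch with hypothetical numbers shows why agentic workloads change the requirement so dramatically:

```python
# Back-of-the-envelope factory sizing. The user counts and per-user
# rates below are hypothetical, chosen only to contrast human-paced
# reading with agent-to-agent traffic.

def factory_throughput(tokens_per_sec_per_user, concurrent_users):
    """Aggregate tokens/second the factory must sustain."""
    return tokens_per_sec_per_user * concurrent_users

# A human reader is satisfied at ~50 tokens/s per user...
human = factory_throughput(50, 10_000)
# ...but agents talking to agents may need thousands per "user".
agentic = factory_throughput(2_000, 10_000)

print(f"human-paced: {human:,} tokens/s")   # 500,000 tokens/s
print(f"agentic:     {agentic:,} tokens/s") # 20,000,000 tokens/s
```

Holding users constant, a 40x jump in per-user rate means a 40x jump in what the factory must deliver, which is why per-user rate and concurrency together drive the platform roadmap.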

SPEAKER_03:

Yeah, and speaking of that, you know, we had a moment earlier this year when DeepSeek came out and the market went into a bit of a trough. Anything else you all are seeing that's a looming potential threat to AI adoption moving at the pace that it should, in your internal dialogue or in what you're seeing externally?

SPEAKER_01:

Yeah, I mean, there's a lot of talk, sometimes on these financial shows, about an AI bubble and so forth. But frankly, I think those of us who are in this business, and who see what the technology is capable of and how fast it is progressing, know that we are at such an early stage in this whole game. Again, AI is going to transform the entire economy of the world, and that is more than a hundred trillion dollars in terms of GDP today. We are already seeing the productivity improvements that AI can have. And if it's only, say, 10% or so, and I think that's a very, very low estimate that we will get to very quickly, that's like $10 trillion, which requires four or five trillion dollars of infrastructure having to be built every year. And we are just talking about a few hundred billion at this point. So I really think we have a long way to go. And if you believe in AI, and at this point I think you would be silly not to, then I frankly think there is no bubble here.

SPEAKER_03:

Yeah, so the 80 billion from Brookfield or the 25 billion from Apollo or the 40 billion from Blackstone: they're daunting numbers, but candidly, when it all adds up, it's a fraction of what the trillions are.

SPEAKER_01:

The TAM, right? It's the overall GDP that we're talking about transforming here. It's no longer just the one trillion dollar IT industry that we are changing. We are changing the overall economy.

SPEAKER_04:

Maybe to make it a little more concrete: Craig and I spend most of our time dealing with enterprises and governments in the Americas. And another indicator is that it's still early days in the enterprise, and yet you can already see the explosive growth in, let's just call it, tokens. And so for us, all of that infrastructure build-out is fantastic, but it's serving demand that's being driven by consumers and enterprises. And when you look at their usage, I think it was said before: what's preventing 1.6 billion people from using it, versus the current number with OpenAI?

SPEAKER_03:

I mean, I think there's still a long way to go. A good example on TAM, too, is the JP Morgan example: their IT budget is, what, 14 billion or so, but their operating budget is 97 billion. And the reality is the TAM for AI maps more to the 97 billion, in terms of how their operating expense is going to be run, than to traditional IT.

SPEAKER_02:

I think there's also still a human capital component of this that is challenging for many of us, which is the developer ecosystem. We always say that to do this, you need three things: you need the talent to do the work, you need the system to do the work on, and you need the executive support, the leadership. These systems aren't for the faint of heart; they're very expensive. But a lot of the developers come through the educational system that we've built here in the Americas. You know, they come through the colleges and the universities, they're trying to get trained on the latest techniques, and they're asking for access to supercomputers. They're accessing their own AI factories inside of higher education and research, and then those people are graduating and going to work for the world's enterprises. And so we have to make sure that we provide that continuous learning environment for all of these new tools we're talking about. I think the message was that we learn new tools every day now. But we've got to make sure we have a continuous supply of developers in the world who can help us move it along.

SPEAKER_03:

So back to Jim's point earlier, developers aren't gonna go away. It's gonna be a transition in what the developer is focused on, the skills they need, and so forth. So it can be a manpower boom in terms of the skills transition that's gonna be necessary in the space.

SPEAKER_04:

By the way, we're all in on using the technology ourselves, and we track metrics like number of bugs fixed per person and number of lines of code submitted. You know, if you've not been in the semiconductor industry, moving to a one-year cadence across the whole technology stack is an amazing, never-before-done feat. It couldn't be done without the use of our own technology in our own development and R&D cycle. And you see that playing out across industries elsewhere, using it to increase the level of innovation and speed.

SPEAKER_03:

And speaking of that, maybe we can go to the video. A little bit of an audible: play the AI apps video, if you can, just to get a sense of use cases that are coming to life.

SPEAKER_00:

Every day, enterprise networks carry the weight of the digital economy. Billions of transactions, healthcare systems, supply chains, national security. And yet, the teams who keep those networks running are under immense strain. This isn't just a hiring problem, it's a business risk. Because when outages hit, there aren't enough skilled hands to respond, and innovation slows to a crawl. If organizations can't find enough engineers to run their networks today, who's going to run the networks of tomorrow? At World Wide Technology, we believe the answer isn't just finding more people, it's equipping more people to succeed. Introducing the Agentic Network Assistant, an AI-powered coworker designed to close the talent gap and transform how teams manage networks. This isn't just about automation, it's about confidence. The Agentic Network Assistant changes that equation. It gives new hires the confidence to act, to learn, and to grow, all while freeing senior engineers to focus on automation, innovation, and strategy. Here's how it works. An engineer types, what interfaces are currently up? Behind the scenes, the assistant translates that into the right command. It runs across devices, collects the results, and presents a clear, human-readable summary. It's almost like a ChatGPT for your network devices, but with guardrails, logging, and security built in. No memorizing commands, no paging through endless logs. Instead, the assistant highlights what matters in a fraction of the time. Now, let's simulate an outage. A switch interface suddenly goes down. The assistant immediately flags the issue, reviews logs, and generates multiple AI-driven remediation steps, each executable with one click, each auditable, and each safe. Let's look at a real world example. At a major telecommunications provider managing more than 10,000 devices, the results were dramatic. 52% reduction in events, 88% faster resolution times, two and a half full-time engineers freed to focus on strategy.
In other words, junior hires were more productive within days, and senior engineers finally had time to innovate. Networks are the lifeblood of the modern economy, but there aren't enough skilled engineers to run them. With the WWT Agentic Network Assistant, that story changes. It's more than automation, it's about building the future workforce, and it's here today.
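The loop the video describes (natural-language question, command translation, guarded execution, human-readable summary) can be sketched in a few lines. Everything here is hypothetical: the command mapping, the allowlist, and the stubbed device output stand in for the assistant's real translation model and device integrations.

```python
# Hypothetical sketch of the agentic-assistant loop: question ->
# device command -> allowlisted, audited execution -> summary.

ALLOWED_COMMANDS = {"show interfaces status"}  # guardrail: allowlist

def translate(question):
    """Map a natural-language question to a device command (stub)."""
    if "interfaces" in question and "up" in question:
        return "show interfaces status"
    raise ValueError("no safe translation found")

def run_with_guardrails(command, device):
    """Execute only allowlisted commands; log every action (stub)."""
    if command not in ALLOWED_COMMANDS:
        raise PermissionError(f"blocked: {command}")
    print(f"[audit] {device}: {command}")
    return [("Gi0/1", "up"), ("Gi0/2", "down")]  # stubbed device output

def summarize(results):
    """Condense raw interface states into a readable one-liner."""
    down = [port for port, state in results if state == "down"]
    return f"{len(results)} interfaces checked; down: {', '.join(down) or 'none'}"

cmd = translate("what interfaces are currently up?")
print(summarize(run_with_guardrails(cmd, "switch-01")))
```

The allowlist plus audit log is what separates this pattern from handing an LLM raw shell access: the model can only propose actions, and only pre-approved, logged commands ever reach a device.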

SPEAKER_03:

Tell me about the 5% that's working. That's an example there. What are some other use cases across industries that have blossomed, the trailblazers that are really bringing this to life to impact their business and position themselves for a competitive advantage?

SPEAKER_04:

Well, by the way, we host about 300 executive visits a quarter just for the Americas to get alignment between the C-suite and the boards and bring it back down. Our heritage has usually been sort of bottoms-up, from the developer level, and we've got all this acceleration, all these use cases that people can take advantage of. But first off, I've kind of questioned that MIT report. We didn't say anything publicly, but it was not peer-reviewed, so we question the data. We think the success rate is a lot higher than that, actually. Hopefully you, and some of you here, would agree. In the end, what excites me is that the people who seem to be leading are the ones who leaned in early, just knowing it was going to be transformative, and did it from a top-down perspective. There's a sort of loose-tight aspect to it: you let it bloom in some areas, and then you try to control it and put frameworks around it in other areas. There are a lot of money-saving applications, because saving money is a big deal: customer service, factory floor automation in some cases, and code assist is a big one. I like the ones that are in place to change the business, to be more competitive and innovative and increase revenue. There's one, a semiconductor example, and I can mention what they're doing, that I always reference. ASML builds these extreme ultraviolet machines. And it turns out there's a tin plasma problem that's very tricky; it's all based on physics. And they want to use AI-enhanced physics to build the next generation of EUV machines.
So they chose to tackle the hardest problem, the one with the highest payback for them. And of course, we're super interested, because you want to see the technology keep evolving. The ones I see leading are the ones who are picking the big challenges to solve and getting in early to start to learn, knowing that it's changing so fast, as some of the others said. I'll say the other thing. Often when we are in an advisory role, and oftentimes we're in there together with WWT, we matrix the big-payback items against the degree of difficulty of doing them, and you try to pick the ones that are easier to do with the biggest payback as the starting point, rather than picking off the low-hanging fruit, or the ones that have big payback but are also very hard. You kind of leave those for a little later, once you get going.

SPEAKER_03:

Any others that come to mind?

SPEAKER_02:

Yeah, I mean, the most horizontal one is the classic customer experience, the chatbot. We're all using it today. The question is, how far are we going to take it? And what we're seeing is it's going across multiple industries. Retail is one of the best examples, obviously an omnichannel environment: you're not sure how you're going to interact with the brand. It may be through a physical interaction in the store, maybe online, and then you're starting to bring in recommendation systems to provide recommendations based upon who we are and what we do and what we love. And so that experience in the chatbot functionality is super important, because you're creating a new architecture. You're creating a RAG-based architecture: you're taking a whole bunch of intellectual capital and combining it with a human-based experience. And then the question becomes, how do you take it one step further? I think that's where the concept of a digital human or a digital concierge comes in, where we're interacting not with chatbots but with what feel like actual human beings in digital form, a true agent, and you're getting an experience as if you were interacting with a human. That is the holy grail. And we're starting to see that come across a whole bunch of different industries.
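The RAG (retrieval-augmented generation) pattern mentioned above can be sketched minimally: retrieve the most relevant pieces of a company's intellectual capital, then hand them to a model as context. The corpus, the word-overlap scorer, and the `generate()` stub below are all illustrative assumptions; production systems use vector embeddings and a real LLM call.

```python
# Toy RAG sketch: retrieve relevant knowledge, then ground the answer
# in it. All content and helpers here are hypothetical.

CORPUS = [
    "Returns are accepted within 30 days with a receipt.",
    "Store hours are 9am to 9pm, seven days a week.",
    "Loyalty members earn 2 points per dollar spent.",
]

def score(question, doc):
    """Crude relevance score: count of shared lowercase words."""
    return len(set(question.lower().split()) & set(doc.lower().split()))

def retrieve(question, corpus, k=1):
    """Return the k most relevant documents."""
    return sorted(corpus, key=lambda d: score(question, d), reverse=True)[:k]

def generate(question, context):
    """Stand-in for an LLM call: just template the grounded prompt."""
    return f"Answer '{question}' using: {' '.join(context)}"

ctx = retrieve("what are your store hours", CORPUS)
print(generate("what are your store hours", ctx))
```

The design point is that the model never has to memorize the company's knowledge; retrieval injects the right slice of it at question time, which is what makes the chatbot's answers specific to the brand.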

SPEAKER_03:

Yeah, and we talked a little bit last night about digital twins and Omniverse. Where do you see that going, Jay?

SPEAKER_01:

Yeah, I mean, in the future, everything that is in the physical world will have a digital twin, whether you're talking about factories or warehouses or whatever. And then you will perfect the operating model in the digital world, and then you can put that into the physical world, and it'll be a model, right? So the Omniverse platform from NVIDIA is perfect for building a digital twin of all of these.

SPEAKER_03:

You're just gonna drive efficiencies and improve your operating model. And so, if I'm an enterprise customer and I see some of the spend that's going on, I may not have 250 million, 500 million, or a billion to get started. What I find really impressive about NVIDIA is that it's building this ecosystem of neoclouds and NVIDIA cloud partners. Talk a little bit about that, because I think there's an easier onboarding than building your entire factory, which will take time; that will be a multi-month process. So tell us about that network of neoclouds and NVIDIA cloud partners and what that could do to get an enterprise business off the starting blocks.

SPEAKER_02:

Yeah, I mean, I almost think back to the deregulation of the telecom industry. We had a handful of very large telecommunication providers, then we broke it up, and we got a whole bunch of regional providers delivering value and service to their local communities. The same thing is happening in the compute space. We're seeing these neoclouds, or cloud providers, build custom stacks and infrastructure using reference architectures from NVIDIA so we can guarantee the performance of the AI systems that are running. And they're using the intellectual capital that NVIDIA has built in partnership with our ecosystem, so they can get guaranteed performance on tokenomics. I mean, we are talking about high-powered systems obviously trying to reduce the cost of tokens. So we're seeing neoclouds pop up all over the world, 18 or so of them, maybe 16.

SPEAKER_03:

In the US, I mean, there's CoreWeave, which obviously gets a lot of attention because it's gone public, but there's Bulcher and there's Lambda and there's Scott. And all of these have capacity already to start training your own internal models, right?

SPEAKER_02:

Yeah, absolutely. What we're seeing is that enterprise customers want to get started, and sometimes it's hard to get started. It's hard to build a data center or an AI factory. So they want to start small, they want to iterate, they want to get ideation up and running. And neoclouds are a wonderful place to go. Many of them also enjoy wonderful relationships with the large cloud providers. So they're running a hybrid architecture between a large cloud provider and maybe a neocloud provider, with the end state potentially being a multi-cloud hybrid architecture, because data gravity is very real. If certain customers are running, let's say, a large data lake or a data warehouse that's on-prem, it's hard to forklift-upgrade that entire thing to the cloud. So we're seeing compute come as close as we can to where the data lives. And we'll probably live in that hybrid world for the foreseeable future.

SPEAKER_04:

There are probably two things. One is that there's an incredible amount of innovation and entrepreneurship happening in the cloud space and the data center build-out space. We talk about land, power, and shell, which is the data center, and a lot of matchmaking is going on, matching demand and supply; there are deals happening all over the place. I think, Jeff, as you said, capital is not the problem right now; it's other things. If you're an enterprise, though, I think it's daunting to think, God, I've got to spend all this money to build a gigawatt data center because that's what Elon's doing. We try to make sure you can have a small AI factory, one that's in your car, one that's in a retail establishment doing great things. We now have solutions at all levels, in grades and power consumption, to provide what you need when you need it.
We just had one major bank as a good example, because they want to consume in all sorts of different environments. They just put in their first air-cooled factory. It was going to be liquid-cooled, but that would cost a lot to retrofit their existing data centers, and they just wanted to get started and not wait a year, year and a half, to get all the equipment in to build that data center. Meanwhile, they're working with a neocloud, running a set of workloads with them long-term, but they also have a big CSP arrangement as well. So they're using this as a way to judge where they're going to run certain things most efficiently, where their data gravity is, and so forth. I thought it was actually a smart strategy, because it wasn't one or the other, and this is what we're seeing play out across enterprises.
It's a more hybrid environment. And so our goal, our collective goal, is to make sure that as an enterprise you can consume it where you want, how you want, with the resources you have.

SPEAKER_01:

Yeah, and Matt, this is one of the advantages of working with Worldwide and the NVIDIA platform, because the NVIDIA platform is available everywhere, right? It's available with all the large cloud service providers, it's available with the neoclouds, and obviously it's available if you want to do it on-prem. And ultimately, people are going to have a hybrid environment. The only way you can do that and not be locked into one particular silo is to use the NVIDIA platform, and not just at the infrastructure level. As you know, we are doing a lot to innovate all the way up the stack and provide the software platform, NVIDIA AI Enterprise and Omniverse, and then the blueprints on top, to allow companies like Worldwide to actually pull solutions together for enterprises and other customers.

SPEAKER_03:

Yeah, there's a whole library of blueprints that are tailored to certain industries, enabling you to mobilize use cases rapidly based on best practices we've already created together. That's a really effective tool to ignite your launch. Most organizations, again, have some kernel of value that they're seeing; now it's, how do we get it out pervasively and how do we secure it? So, a couple of other questions, and then I want to open it up to a Q&A. Just as a student of business success: since we started this tournament, you've gone from 16 billion to 160 billion over the last five years. ChatGPT tells me you're going to be at 360 billion in the next five years. So good luck with that; you're responsible for that growth. It's going to be an interesting ride. But how do you scale the business to produce that type of chipset platform and then extend it? That is fascinating. I mean, going 10x from where you were in five years, and actually having the manufacturing capability to put this into the market, is unprecedented. What does that look like in terms of the operating model?

SPEAKER_01:

Well, I think one of the first things is that you shouldn't get psyched out about numbers, right? Everything is relative. If the opportunity is hundreds of trillions of dollars, then the numbers that we're talking about, a couple of hundred billion dollars going forward, are small. Yeah. So rather than looking at absolute numbers, you always have to look at the total opportunity. Now, of course, it's not easy, because it's a very complicated supply chain, and you have to make sure you're getting capacity in advance. Our operations team does an amazing job. But the most important thing to be able to continue to scale as a company is to not get too hung up on the absolute numbers, but to look at the overall relative opportunity.

SPEAKER_03:

So you've been with the company, what, 20 years now? Yeah. Did you see a pivot point, an aha moment, throughout those 20 years? A point at which you thought, wow, this is going to have a much broader impact than what you started with 20 years ago, and it's going to impact every citizen and organization around the globe?

SPEAKER_01:

Well, you know, I'd be lying if I said I saw that all the way through, because AI obviously has completely changed the game. But Jensen always says you've got to believe in what you believe in and then just invest. We've believed in accelerated computing from the day the company got started 30 years ago, much before I got there. It originally started with accelerated computing being essential for video games; that was the only way you could simulate these virtual environments. Then about 20 years ago, when I joined, Jensen was convinced that Moore's Law was tapering off and that we had to go and accelerate other domains. The only way to do that would be to use GPUs and then do full-stack acceleration all the way to the application level. So we've been very focused on that from the very beginning. And then when AI had its renaissance, with machine learning and so on, we just completely pivoted to that. I believe the reason the company is as valuable as it is, is because we accelerated computing by a million times in the ten years after the AlexNet moment happened. None of this would be possible without getting computing to be that much faster, which is exactly the same thing as getting it to be that much cheaper, because they're two sides of the same coin. Yeah. And at this point, we've got to continue to drive the cost of tokens down so that, with agentic AI, reasoning, et cetera, we can use more and more intelligence, more and more tokens, to solve really complex problems.

SPEAKER_03:

Well, there have been few things in history as significant as what you and your colleagues at NVIDIA have been a part of over the last 20 years. It is not only an impressive technology, but an equally impressive leadership team and culture. We couldn't be more proud to be your partners, and we look forward to the next decade or two together.

SPEAKER_01:

So yeah, thanks, Matt. But I've got to say something, which is that Worldwide is one of our best partners. I don't know exactly how many years ago it was, but it was only a few years ago that Jim came to NVIDIA, met with Jensen and our management team, and brought all of you guys together. We talked about what it would take for Worldwide to play an important role in this new transformation that was happening. Jim had this vision of how he was going to transform Worldwide into a company that would really be able to deliver solutions, not just at the infrastructure level, but at the level of solving real business problems. And you guys made the investment, with your Advanced Technology Center, the AI Proving Ground, hiring all these people who are experts in being able to deliver that. We saw some of that in the video there. Not too many companies made the investment to transform themselves into real players in this world. So I tip my hat to you guys, and we love working with you. All the feedback from our customers is that you're such a great partner for all of us.

SPEAKER_04:

Thank you. Thank you. Yeah, I just wanted to mirror that and express my appreciation as well. You may or may not know this, but we don't actually deliver any of our platform directly to enterprises; it's always through the ecosystem, through our partners. We're organized into 17 different industry verticals so we can understand the industry problems. There are horizontal ones, but there are a lot of very vertical applications and use cases. WWT is organized in much the same way, so there's a great connection there. But in the end, you need a partner like WWT to deliver the technology, make it successful, make it work in the enterprise. We provide the platform, but it has to be turned into a solution for our customers and for your customers, for you all.

SPEAKER_02:

So maybe I can close out the thank-yous. It's been almost a decade now, and I'll tell you, we use a benchmark of quality of talent, and we always look at our own house at NVIDIA as that amazing technical benchmark of delivering value to customers. I'd say there are very few partners in the world that can go toe-to-toe with us, and WWT is, I think, the number one there. Your technical talent globally is tremendous. Building what we've discussed for the last 40 minutes isn't for the faint of heart. And to be able to lean into a relationship like Worldwide at global scale: this isn't just about here in the US, this is about the world at large and where you guys are playing. I just want to tip my cap to you guys, to use the golf metaphor. It's been an amazing journey. And Jim and Matt, your relationship with Jensen is a big piece of that. He's looking to you guys and your leadership to help take us through the next decade.

SPEAKER_03:

Well, I'm grateful to be the executive sponsor for NVIDIA. It's been really fulfilling over the last couple of years or more. And candidly, I think we're all in a great spot: there are few organizations in the tech space that have leaders like Jensen, Jim, or Michael Dell, with maybe 35 years at the helm. I mean, that's unprecedented. So again, I have great respect and great appreciation for where I'm at. It's a special company, and I see the same in NVIDIA. Again, the sky's the limit for both of us.

SPEAKER_00:

Okay, a key lesson: AI is no longer defined by algorithms or hype cycles; it's defined by infrastructure that can turn data into intelligence. NVIDIA's leaders remind us that this moment won't wait for anyone. The organizations that lean in, experiment early, and build deliberately will shape the future. Those that hesitate will inherit whatever's left. This episode of the AI Proving Ground Podcast was co-produced by Nas Baker, Kara Kuhn, and Diane Swank. Our audio and video engineer is John Knoblock. My name is Brian Felt. Thanks for listening. We'll see you next time.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

WWT Research & Insights
World Wide Technology

WWT Partner Spotlight
World Wide Technology

WWT Experts
World Wide Technology

Meet the Chief
World Wide Technology