Denoised
When it comes to AI and the film industry, noise is everywhere. We cut through it.
Denoised is your twice-weekly deep dive into the most interesting and relevant topics in media, entertainment, and creative technology.
Hosted by Addy Ghani (Media Industry Analyst) and Joey Daoud (media producer and founder of VP Land), this podcast unpacks the latest trends shaping the industry, from Generative AI and Virtual Production to hardware and software innovations, cloud workflows, filmmaking, TV, and Hollywood industry news.
Each episode delivers a fast-paced, no-BS breakdown of the biggest developments, featuring insightful analysis, under-the-radar insights, and practical takeaways for filmmakers, content creators, and M&E professionals. Whether you’re pushing pixels in post, managing a production pipeline, or just trying to keep up with the future of storytelling, Denoised keeps you ahead of the curve.
New episodes every Tuesday and Friday.
Listen in, stay informed, and cut through the noise.
Produced by VP Land. Get the free VP Land newsletter in your inbox to stay on top of the latest news and tools in creative technology: https://ntm.link/l45xWQ
How AMD is Blending Tech & Art - James Knight
James Knight, AMD's Global Director of Media and Entertainment, shares some behind-the-scenes insight into his vast history of working on some of the biggest films in Hollywood, from the original Avatar to Gareth Edwards' The Creator to Pixar's Elemental.
We also dive into:
‣ How AMD works with filmmakers and studios to push filmmaking technology forward
‣ Why virtual production is not just about GPUs
‣ Where AI tools fit into creative workflows
And a whole lot more!
📧 GET THE VP LAND NEWSLETTER
Subscribe for free for the latest news and BTS insights on video creation 2-3x a week:
https://ntm.link/vp_land
📺 MORE VP LAND EPISODES
Inside Coffeezilla's '$10,000,000' Virtual Production Studio
https://youtu.be/FjAkmqJCbJY
Fully Remote: Exploring PostHero's Blackmagic Cloud Editing Workflow
https://youtu.be/L0S9sewH61E
Imaginario: Using AI to speed up video search and editing
https://youtu.be/4WOb5Y1Qcp0
Connect with James and AMD:
AMD - https://www.amd.com
YouTube - https://www.youtube.com/@amd
Twitter/X - https://twitter.com/AMD
Instagram - https://www.instagram.com/amd
Facebook - https://www.facebook.com/AMD
TikTok - https://www.tiktok.com/@amd
James @ LinkedIn - https://www.linkedin.com/in/james-knight-9a576
James @ IMDB - https://www.imdb.com/name/nm2595922
James Knight is the Global Director of Media and Entertainment at AMD (Advanced Micro Devices). He leads a team that oversees alliances across visual effects, virtual production, video streaming, and post-production computing. Knight and his group also work directly with the world's most prominent studios, media brands, and sports and entertainment platforms. His work spans over two decades in production, post-production, visual effects, and virtual production. He is a member of the Visual Effects Society (VES), of BAFTA Los Angeles, where he served on the board from 2010 to 2016, and of the Academy of Motion Picture Arts and Sciences (AMPAS).
#############
📝 SHOW NOTES & SOURCES
Check out vp-land.com for all the show notes for this episode.
#############
⏱ CHAPTERS
00:00 Intro
01:30 James Knight's Journey in Filmmaking
03:45 The Impact of Avatar on Virtual Production
07:40 The Virtual Production Umbrella
12:25 Flipping Post-Production and Previs
17:19 Evolution of Virtual Production
21:52 Technical Aspects of Virtual Production
25:00 CPU and GPU Working in Tandem
31:24 AMD's Role in The Creator
35:06 The Impact of AI on Film Production
40:01 AMD's Collaboration with Pixar for Elemental
48:12 The Role of AMD in Camera Technology
52:46 James' Thoughts on Approaching AI
53:45 Outro
We're not just selling CPUs and GPUs into Hollywood. We're figuring out with the studios what works, what doesn't work, because without technology, films are just plays, right?
That's James Knight, Global Director of Media and Entertainment at AMD. James has a long history of working at the cutting edge of filmmaking, going all the way back to the first Avatar.
That was a light bulb moment for me. It was a perfect marriage of technology and art and storytelling coming together.
In this episode of VP Land, we talk about how James and AMD worked with Gareth Edwards on The Creator.
He came to AMD to get his AI education, so to speak.
How workflows have been changing and speeding up production.
And I remember him saying that he was able to do more setups than he's ever done in that short amount of time.
Why running volumes for virtual production is not just about GPUs.
The importance of compute has never been greater than it is now. If you don't have a particular CPU, then it's going to be a lot slower. And that's time and money. And a whole bunch more. That's a good question. I've not really been asked that before, but I should have been.
Links for everything we talk about are available in the description. And for more content like this, be sure to subscribe to the VP Land Newsletter at vp-land.com. And now enjoy my conversation with AMD's James Knight.
Well, James, nice to meet you. Thanks a lot. I appreciate you joining. Why don't we just start off a bit?
Just tell me a bit about your background and how you ended up at your current role and what you're doing at AMD.
A bit about my background: I started out in visual effects. I've been in film post-production and visual effects for about 24 years now, as of 2024. I moved to Los Angeles in late 2006 to be the motion capture and performance capture project manager on Avatar. I have two words for that: an education, right? So that was great, but anyway. A lot was learned, a lot of connections made, a lot of friendships made. I ended up having some great relationships with people in Los Angeles, and I decided to stay.
I was going to leave after Avatar, but I stuck around. My family's based here now and I've been here ever since, so that's now 18 years. And I've shifted slightly from visual effects production to the technology side of visual effects, because without CPUs and GPUs, without technology, films are just plays, right?
So that's what I'm doing: I work with the studios and I work with creatives, directly with directors. It's a really fun, nuanced role, but I got into it because the area of visual effects that I moved to Los Angeles for was pretty technical. Then I got approached by AMD in 2015-2016 to start working with them, and that's how I've gotten even more ingrained in the visual effects pipelines. That's anything that's created and is essentially fake on the screen, right? Set extension, CG characters, rendering, packaging content, delivering it, everything. So it's been a wild ride. But in a nutshell, that's kind of my journey and how I got to where I am.
Nice. And actually, I did want to ask you about Avatar, because I saw in another interview you mentioned that Avatar, looking back, felt like one of the first instances of some form of virtual production. Can you talk a bit more about your impressions of that, and why you say that?
Well, the software that was used on that film was created by a company called Giant Studios, which is no longer around, but those brilliant guys were out of Atlanta, Georgia. And it was used on Lord of the Rings and a few video games before Avatar.
Avatar was when everything was brought together. At the time, I would describe it to my parents, and I use my parents as an example because they're essentially neophytes and wouldn't understand what I was doing, so I'd try to tell them: imagine if you could see a PlayStation 2 game in real time, and you were filming that, and it was later going to be rendered to look photographically real. Jon Landau, the producer, would have multiple people come by and visit the set, and so would Jim, and it was really interesting and bizarre to see people's reactions.
They couldn't quite get it initially; a road hadn't been forged to that area of the brain for these people. So they would look through the virtual camera and say, I don't understand. These people are standing right here, but in the virtual camera he's in Pandora, and there's a river running, and I don't quite get it.
So we had to dissect it a little bit and show them. And we really thought, it was Jon Landau's idea actually, but we were on board, that if we brought enough people by and showed them, virtual production, those two words together, would become more of a mainstay in production. And we really thought everybody would be doing this a lot sooner than they have. Of course, it didn't happen quite as quickly, because I think people thought virtual production was only for James Cameron. Well, now we know it isn't, right? You don't have to have a James Cameron budget to do it. But it was a great way to make a film.
And I remember the last day that Jim, he was called Jim, shot something with a virtual camera, we all stood around and spoke about the experience of it. And I remember somebody said, I don't think audiences are going to quite know what they're watching, and hopefully they'll have an emotional connection to it, because of the camera movements: going through a jungle (you can't have a dolly in the jungle, right?) and following around CG characters as if they really existed.
Those organic camera movements, I believe, are what drew people in. So I think it was a perfect marriage of technology and art and storytelling coming together, and he was the best director to do that. That was a light bulb moment for me, and I think that's what I referred to in previous interviews: art and technology are overtly related.
They're like fraternal twins, you know? They're not identical, but they were born at the same time. Because without technology, there is no art. And I really thought Avatar showed that in an amazing way.
Yeah, it was definitely groundbreaking. So when you say you thought it would catch on sooner but it didn't, was it just a matter of the technology becoming more affordable? What led to the next steps where it did eventually catch on? And I guess what we're talking about here, the Avatar style, is a very specific type of production. How would we classify it now? Maybe previs in some aspects, or shooting with a virtual camera on a virtual set with motion-tracked actors?
I'm trying to think how we would best categorize this now, and how this technology has been adapted into more everyday production.
Yeah, virtual production still is a kind of a big umbrella. There are subsets underneath it, right? So previs is considered virtual production because it's all CG and it is part of pre-production, but it also can be part of production.
So if you've got a busy street corner, you could have somebody either create that street corner or you could go and LiDAR-scan it in, and then you can figure out where and when you want to point your camera once you've paid to have that street corner shut down, so you're not discovering on the day, because that's going to burn through cash.
You can also figure out what lenses you want. So for planning purposes, I think the first issue was probably the wrongly perceived cost: this is amazing, but this is Jim Cameron, so I don't have that kind of money. And secondly, it just took time.
Sometimes, ironically, Hollywood can be slow to adopt really helpful new technology. There were a couple of great societies formed out of the guys at Weta and ILM and Autodesk,
the company that makes MotionBuilder, the application that was the backbone of Avatar: it was what we streamed the data into, and then we would deliver the files in it. David Morin, who was kind of a galvanizing person, created the Virtual Production Society, and we would meet maybe six times a year and try to educate Hollywood.
We would have events and show people virtual cameras. But it really is tough to push that forward and to have more productions use a method, because virtual production is a methodology, not just one thing; there are many ways to do it. If I had a time machine and wanted it to happen quicker,
I think what we could have done is educate not just the production people, but also the people in accounting and procurement. Those guys would have gone: oh, wait a minute, so you could save us money by discovering in advance where we should point the camera? We could actually figure out what story beats work and what story beats don't.
There were a few companies trying to figure this out, like The Third Floor for previs. Glenn Derry started Video Hawks and Technoprops, and Technoprops ended up becoming Fox VFX Lab. Those guys were virtually rapidly prototyping films. So, you know, when you have something new and you're a disruptive entity, the virtual production arm of Hollywood, it takes a minute.
That kind of begins to describe it; I don't think I can describe everything, but it gives you a hint as to why it was a bit difficult to bring this new way of making pieces of a film, or a whole film, to Hollywood. Also, I'll say one more thing on this subject.
Virtual production is not just for a CG film. You mentioned previs. Previs, under the umbrella of virtual production, can be used, and is used now, on most Hollywood films to figure out what goes where, what order to shoot things in, where to point the camera, the sun's going to be here, that kind of thing.
So it is used more and more. It just took a minute.
Yeah, and I feel like there's this perception that virtual production requires a lot more: you're flipping the VFX from post to pre, building out your virtual sets beforehand, and you might be more locked into specific shots. But you've said it actually unlocks more creativity. Can you expand on how it opens up flexibility to discover more shots, or new shots?
It does, yeah, it opens up more. It allows you more time, because it's a lot less expensive to have a guy in a motion capture volume; that's what we call a motion capture area, a volume.
With a couple of guys in suits in a CG environment and a virtual camera, you can make some mistakes, and I saw them being made on Avatar. Out of mistakes, you can have brilliant pieces of art. I remember on Avatar, somebody screwed up with the virtual camera, they hit the joystick, and all of a sudden Jake and Neytiri were followed up the tree from above. And obviously it was deemed acceptable, because that shot is in the film. But I don't think that would have happened if we were shooting it practically; I don't think the time would have been allowed to let those mistakes happen. Virtual production also allows for better art, for better storytelling,
because you're eliminating guesswork. Ordinarily, without it, if you're shooting a CG film, you're not using real-time animation; you're shooting to a film plate. Think Jurassic Park. I use this in other talks when I do presentations on virtual production: if you
go back and look at Jurassic Park, released in the 90s, there's one lock-off shot about two-thirds of the way through the film where the T-Rex comes through while they're hiding behind a big log and he eats a few velociraptors. The camera really doesn't move. If you'd had virtual production back then, you'd be able to follow the T-Rex around as if it really existed, but it was much safer back then for the animation people to just animate to that film plate. The thing about virtual production, too, is that it's not just better for streamlining and for story beats; it's better for the actors, because it's not just a tennis ball they're acting to, they can see off-set what they're reacting to: this dinosaur, this alien, this building, or a set extension, as in Boardwalk Empire.
HBO used virtual production for set extensions way back then, so you had a frame of reference. It's also better for the audience, because it's more organic and you have those organically driven camera movements. So I'm doing this because that's how you hold, I guess, a camera. Moving the camera, yeah. Yeah, yeah, yeah.
Also, I didn't work on Tintin, which was a fully virtual production film directed by Spielberg, but I did eat the food on it. It was shot in the building where we were finishing Avatar, so we got to eat catering, right? And I watched Steven do it during breaks, and I remember him saying when the film wrapped that he was able to do more setups than he'd ever done in his career in that short amount of time.
So he could do multiple different scenes and walk things through. So it was not only a cost savings to him but a time savings, which equaled better story, better performance, that kind of thing. So I think, in life, not just in Hollywood, we're always looking for better, faster, cheaper. Though I don't like the word cheap: better, faster, less expensive.
Less expensive, yeah. Essentially, because cheap is pejorative. Those three things together, better, faster, less expensive, are really what, from a bird's-eye view, virtual production brings to filmmaking, and television as well.
So how did we go from Avatar and virtual cameras to StageCraft, virtual production on an LED volume, the kind we've now come to associate with virtual production today?
So I think, yeah, the LED volume is almost like the reverse, right? So, virtual production and performance capture. Let's take performance capture. You've got Sam Worthington in a spandex suit with markers on him, playing Jake Sully, and he's not standing in front of an LED screen.
He's on a gray stage, and 100% of everything filmed with video cameras on that stage is just reference for the artists to do the motion edit. Sam's face is never going to be seen in the final product; what the audience sees is all CG. And this is why I'm saying the umbrella of virtual production is a bit wider now, because of the way it's perceived now.
And I should say the most famous part of virtual production now is what was promoted cleverly, and rightly so, by the production team behind StageCraft on The Mandalorian. It's the reverse: now you're taking live-action actors, very well costume-designed, with fantastic props, in the foreground, and the background is, in some cases, Unreal Engine on micro-LEDs, or ILM's proprietary real-time renderer.
But The Mandalorian came out right during the pandemic, right? So it was almost the perfect storm that fanned the flames of virtual production. I feel like virtual production became a bit of a celebrity because of the pandemic: we could have smaller crews, and we could shoot things that look like they're in all these different locations when they're really on a soundstage.
And I think it was a great handoff, so to speak; that was the next chapter in virtual production. And that's not to say the way Avatar was done isn't still being used. Those scenes in the jungle or underwater in The Way of Water are all done CG as well,
and the premise of how the first one was done was used in the second Avatar. But it's taking what the community pioneered in virtual production 1.0, if you will. And now StageCraft and those guys, Pixomondo, Sony, all different studios are experimenting with their own versions of what was shown worldwide on The Mandalorian.
I mean, it looked fantastic. Family members are a great way, because I work in film and TV, right, to have those layman conversations: so what did you think of the backgrounds there? Where do you think that was shot? And one person might say, oh, it was done in Jordan, or Indonesia. No, that was shot on a soundstage.
Everything in the background was CG, and they thought that was fantastic. Look, there weren't a lot of great things that happened out of the pandemic, you know? Billionaires getting richer, great for them. But one of the great things that happened to us was that it fanned the flames of virtual production.
The pandemic, ironically, really helped virtual production become more of a celebrity. And now I believe it's more widely used, and those two words together aren't something where people go, I'm embarrassed to say I don't really know what that means.
Was there also a tipping point just in the technology and the processing getting fast enough to run this in real time?
Would this have been possible five years prior to Mandalorian?
It was possible, but the fidelity wasn't nearly as good as it is now. Back on Avatar, there was real-time facial capture. We had it, but it wasn't amazing back then. I mean, that was a long time ago, what is that, 18 years ago or so, when we first started.
Now there are a bunch of companies, like Faceware and multiple others, that can do real-time facial, and you can do real-time body and facial in tandem at the same time. But yeah, five years ago it wasn't where it is now, and it keeps getting better. The backgrounds, and some of the characters, work well if you have hard surfaces; biological faces in real time are harder to do. But we are getting closer and closer to crossing the uncanny valley, which, I shouldn't even have said that, that's something else to talk about. But if you're doing a CG robot with a hard helmet, let's imagine a stormtrooper, or pick another.
Everybody always goes for a Star Wars reference, but something from Battlestar Galactica, or...
Or a fighter helmet or something.
Yeah, yeah. That is easier to do in real time. You still wouldn't have final pixels right there and then on the stage, but we are getting dangerously closer and closer to the final fidelity that your audience would see when they go to the cinema or watch it on TV at home.
Yeah, it is getting better. And compute has never been as important as it is now. If you open up a session in Unreal Engine, or you open up a particular scene that you want to shoot on a virtual stage with those micro-LED screens, it matters what kind of CPU you have in there.
It's got to be high core count and fairly high frequency, so it can render out the scene quickly and you can shoot on it. If you don't have the right CPU in the system, or you have somebody building systems who doesn't really know what they're doing, then it's going to be a lot slower. And that's time and money, because you're going to have a crew, not a big crew, but a crew, standing around waiting for setups, and the time between setups is what costs the most money, because that's waiting time.
But yeah, the technology is getting better and better. It's the CPU and... I don't know how technical your audience is.
Feel free to explain because I'm also just very curious as well.
So, feel free to dive into it. How about this: if you think of a computer, I was about to hold one up, but they're all plugged in.
It's like the Batcave in here; I've got like five screens. If you compare a computer to a human head, the brain would be the CPU, the central processing unit, and your eyes would be the GPU, the graphics processing unit. The real-time process is mostly GPU, mostly the eyes, because you need to be able to see it on the set.
During setups, when you're building up a scene and you want to open another project to shoot this part of the movie, it has to render those out, and that's why it matters what kind of CPU you have. You can't have the one in your grandmother's laptop, because that's not going to work. Even if your grandmother's laptop had the most high-powered GPU, it wouldn't matter.
You need the two in tandem.
It's not that complicated, but it's not as straightforward as going to Best Buy and picking up a workstation; you've got to have it properly configured. So technology does enter into it, to sort of answer your question. Technology sometimes is an inhibitor to certain things, but less and less so as time goes by in virtual production.
It's constantly getting better and better, and it's at a point now where anybody who's never seen a virtual production set before, if they show up on one, is going to be awfully impressed by what's possible in 2024.
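James's point about CPU core count and frame budgets can be sketched numerically. This is a purely illustrative toy model, not AMD's methodology: the function names, per-object costs, and thresholds are all invented, but it shows why a high-core-count CPU matters even when the GPU is fast, since CPU-side scene preparation has to fit inside the same per-frame budget.

```python
import os

# Toy model: real-time playback on an LED volume has a fixed per-frame
# budget, and CPU-side scene preparation eats into it before the GPU
# ever draws a pixel. All numbers below are made up for illustration.

FRAME_BUDGET_MS = 1000 / 24  # a 24 fps frame budget (~41.7 ms)

def cpu_prep_time_ms(scene_objects: int, cores: int, ghz: float) -> float:
    """Hypothetical per-object CPU work, parallelized across cores."""
    work_ms_per_object = 0.02 / ghz  # invented cost, scaled by clock speed
    return scene_objects * work_ms_per_object / cores

def fits_frame_budget(scene_objects: int, cores: int, ghz: float,
                      gpu_draw_ms: float = 20.0) -> bool:
    """Does CPU prep plus GPU draw time fit in one frame?"""
    prep = cpu_prep_time_ms(scene_objects, cores, ghz)
    return prep + gpu_draw_ms <= FRAME_BUDGET_MS

# A high-core-count workstation vs. "grandmother's laptop" with a fast GPU:
print(fits_frame_budget(scene_objects=50_000, cores=64, ghz=3.7))  # workstation
print(fits_frame_budget(scene_objects=50_000, cores=4, ghz=2.0))   # laptop
print(os.cpu_count())  # cores available on this machine
```

With these invented numbers, the 64-core machine fits the budget and the 4-core one does not, even though both were given the same GPU draw time: exactly the "you need the two in tandem" point.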
All right, real quick: if you're enjoying this conversation with James Knight about GPUs, CPUs, AI processors, all that stuff, then you will like the VP Land newsletter, where twice a week we send a bunch of links, articles, and resources about all of this type of stuff.
You can subscribe to that newsletter over at vp-land.com. It is 100% free. Just pop your email in and you'll get the newsletter twice a week. Stay updated on the latest trends and tech in virtual production, AI, filmmaking, and everything that is changing the way we make movies. All right, now back to the interview.
Does the CPU and GPU processing power you need go in tandem with your virtual art department? How complicated your virtual sets are, and what you can run on the stage, depending on how fast and powerful your CPUs and GPUs are?
That's a good question. I've not really been asked that before, but I should have been. So, I don't know if I can talk about certain things publicly or not, but there are some interesting real-time-rendered venues you can go to, right? There are some concerts, and some bands are starting to use game engines in real time.
And in working with those companies, they describe it kind of like you did: one project's not going to be that complicated, another one maybe. So as a technology provider, it's not just as simple as: here's a couple of CPUs and here's a system, good luck. We become advisors as well, right? That's why I'm saying art and technology really do come together; it makes sense for the technologists and the artists to be talking, because we need an idea of what it is they want to accomplish.
And because it doesn't make economic sense otherwise, right? A company like AMD wants to sell its technology, but it also has to understand what the artist wants to do with it. Say there are seven subsets of how real-time virtual production is done; it doesn't make economic sense for any company to make seven different CPUs, one for each.
So what we end up doing is working with the OEMs, original equipment manufacturers like Dell or HP or Lenovo or Supermicro, to come up with systems that will more than cover subsets one, two, and three, and then maybe a system that covers four, five, and six. So we know that if you're going to be doing anything that falls under this umbrella, you should get this system.
The configuration matters, but it doesn't have to matter that much to the customer. One of the other things that's important, since we're talking about it here, is not having it be as complicated as I'm describing it. When we have a studio we're talking to: you want to accomplish this and this? Great, you should have this.
We don't need to get into the pedantics of why you're going to have this system and what the GPU is going to do. That's my American accent, my scientific-American accent. You want to make it simple for the person using the technology. Being disarming, and making them feel like what you've advised them to get and what you've sold them is going to be the best, builds trust, and that trust is verified by the people before them who have used the same configuration. So yeah, it does matter, and that's why understanding the vertical you're selling into matters as well. In this case it's film, but virtual production is used in architecture, in car design, in medical science as well: pioneered here in Hollywood and then used in other verticals.
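The tiering approach James describes, a few OEM configurations each covering a band of workload subsets rather than one bespoke machine per use case, can be sketched as a simple lookup. Everything here (tier names, core counts, GPU counts, which subsets map where) is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class SystemConfig:
    """One pre-validated OEM configuration covering a band of workloads."""
    name: str
    cores: int
    gpus: int
    covers: range  # which workload subsets (1-7) this config covers

# Hypothetical tiers: a few configs cover all seven workload subsets.
CONFIGS = [
    SystemConfig("entry", cores=16, gpus=1, covers=range(1, 4)),     # subsets 1-3
    SystemConfig("mid", cores=32, gpus=2, covers=range(4, 7)),       # subsets 4-6
    SystemConfig("flagship", cores=64, gpus=4, covers=range(7, 8)),  # subset 7
]

def recommend(subset: int) -> SystemConfig:
    """Pick the first configuration whose band covers the workload subset."""
    for cfg in CONFIGS:
        if subset in cfg.covers:
            return cfg
    raise ValueError(f"no configuration covers subset {subset}")

print(recommend(2).name)  # a light workload lands on the entry tier
print(recommend(5).name)  # a heavier one lands on the mid tier
```

The design point is the same one made in the interview: a handful of well-understood configurations is cheaper to build and easier to recommend than a bespoke machine per workload.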
Let's talk about something a bit more recent. Can you talk about AMD's role and work in The Creator? Oh, yeah.
So we've been working closely with Pixar for a while now, for multiple years.
Pixar created their renderer, RenderMan; they've had it for a while. Pretty much everything that comes out of Industrial Light & Magic, which was born out of Star Wars back in the 70s, is rendered on RenderMan. They're cousins under the Disney umbrella: Pixar is owned by Disney, and so are Industrial Light & Magic and Lucasfilm.
And my friend Gareth Edwards directed the film and wrote it. So I'm going to describe a couple of different things coming together congruently. Yeah, he reached out towards the end of the production and, since his film is about AI, wanted our help in marketing the film, because moviemaking is all about suspension of disbelief.
If you can have a company that is deeply involved in AI be part of the marketing of a film about AI, it's easier for audiences to suspend their disbelief; it gives it a more authentic feel. Congruently, Industrial Light & Magic did a lot of the visual effects and the virtual production on the film, at Pinewood in the UK, which is where I grew up actually, right by Pinewood Studios.
That's for another time. But yeah, Gareth texted me and asked me if we wanted to help market the film. And of course, we said yes. So we we've been working with the agency and with him and with Disney to help market the film. And that was- it was really challenging, I think, to do that during the strike. So the writers' strike and the actors' strike.
So you had to get and you couldn't get gaming influences to the to the premiere because, you know, that might violate SAG and they might not be able to get into SAG. So there were all these obstacles and promoting a film I think was quite hard. So we worked- we worked promoting the film and then it was also rendered entirely on AMD as well.
So the visual effects were rendered and the film was delivered via Industrial Light and Magic. It was done on AMD technology and on Pixar's RenderMan, which is optimized for AMD. That describes it without getting too nitty-gritty and pedantic about the details, but I think it might surprise people how many people are involved in marketing a film.
Before it comes out, a lot of meetings are had about, you know, what makes sense and what would make sense to a consumer. And so having a chip company that works in AI helping to market a film about AI kind of makes sense. But if it was The Sound of Music 3, you know, AMD being part of the marketing, like, what is that?
That wouldn't make sense. So it was kind of like the planets lining up a little bit, particularly in light of the fact that ILM used our technology and the visual effects were done there.
Yeah, okay. That was interesting. Gareth has talked a lot about AI in general, just his own interest or creative uses of AI.
Where has that been on your radar, with any of the tools or uses that are out there?
Yeah, Gareth actually, I don't know if you knew this, but he came to AMD to get his AI education, so to speak. And he also asked our president, Victor, and our CTO, Mark, multiple probing, almost philosophical questions about AI.
That was kind of fun. So AI is still an up-and-comer within media and entertainment, right? Remember how we were talking about how virtual production was adopted slower than we'd have liked, though still pretty quickly by Hollywood standards? I think AI is going to be adopted more quickly.
Automatic rotoscoping, automatic wire removal when you have stunts in a shot, frame recognition, frame cataloging. There's all sorts of ways in which AI is starting to be used in film, even the resolution of something. So, automatic formatting: a content delivery network that's streaming a film to a device, or is about to, will be able to say, oh, you know what, this is going to be 720p.
I've got to quickly render this out in 720p or 1080p. So even once something's made, AI will have a bearing on it. It definitely has a bearing on my day to day. But a lot of it is in the form of studios wondering what we are doing as a technology arm of the film business. What are we doing? And that answer would be: a lot.
And we're working with the software vendors too. The Adobes, the Blackmagics of the world, Autodesk, those guys; the ones that create the applications that run on the silicon that is, of course, made by us and Intel and Nvidia. So we have to work with the software vendors.
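That just-in-time delivery decision can be sketched as a simple rendition picker. This is purely illustrative: the ladder heights and bitrate thresholds below are made-up numbers for the sketch, not anything AMD or any real CDN actually ships.

```python
# Hypothetical rendition ladder: (output height, minimum sustained bitrate).
# Real CDNs use measured throughput and far richer ladders.
LADDER = [
    (480, 1_500_000),    # ~1.5 Mbps
    (720, 4_000_000),    # ~4 Mbps
    (1080, 8_000_000),   # ~8 Mbps
]

def pick_rendition(estimated_bps):
    """Pick the highest rendition the estimated bandwidth can sustain."""
    best = LADDER[0][0]  # fall back to the lowest rung
    for height, min_bps in LADDER:
        if estimated_bps >= min_bps:
            best = height
    return best

print(pick_rendition(5_000_000))  # 720
```

The point of the transcript's example is that this decision, which today mostly selects among pre-encoded files, could instead trigger a quick render at the chosen resolution.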
I just want to go back real quick to what you mentioned with Gareth.
So what is on the AMD AI curriculum that you went over, that you taught Gareth Edwards?
One of the things we spoke about, and it was recorded, wasn't just straight technology; it was the ethics behind it, and some of the great use cases that aren't just straight technology. One of them was combating depression. Do you remember the movie Her with Joaquin Phoenix, and how you never saw- actually, I don't remember.
I've only seen the film once, but I don't remember seeing the woman, the voice, the female voice. I think it was Scarlett Johansson.
Scarlett Johansson, yeah. But no, you never saw her. She was voice-over. Yeah. Yes.
But yeah, it was about combating loneliness as well. So we spoke about that, how AI can make someone feel less lonely.
So it wasn't just the economic efficiencies or the artistic efficiencies; it was also psychological and emotional, and we were talking about that as well. And Gareth hit that in his film. If you notice, towards the end of the film, in that big battle sequence when they finally discover where they are and they're running across the bridge (I don't know if you remember that sequence), they have multiple robots that are part of the family, right? It's because you don't see what somebody looks like or what form they take; it's how you feel about your interactions, what they say, and how they say it. So we spoke a bit about that as well.
And I think that's great. I mean, we're thinking about everything, not just the inherent, overt benefits of AI. Scratching below that veneer, there are the emotional benefits of AI as well, and that's what we spoke about. By the way, his film was nominated for Best Visual Effects at the Visual Effects Society Awards and the BAFTAs, which I'm particularly happy about.
I saw that. Yeah, well deserved. I know I'm bouncing around a bit, but since we started on Pixar and RenderMan, there was also an AMD connection with one of the more recent Pixar films, Elemental. Do you want to talk about that as well?
Yeah, yeah, yeah.
So we've been working with Pixar, like I said, for multiple years, I want to say four or five, something like that. And we've retooled the render farm working closely with them. We kind of machine-shop with Pixar: try this; hey, this works great, but what about the CPU? So we go back and forth, and we figured out what the perfect system was for them, balancing their different needs.
And they had told us about Elemental years before they announced it. They told us it was going to be their most ambitious film to date, because each character was volumetric. It had ridiculous amounts of polygons and geometry within each character. I mean, you've seen the film; they're all different elements.
So fire, water,
Fire or something.
Yeah. Yeah. And that's a significant amount of compute to be able to render those and re-render them, you know, because you're going to get notes, you're going to render the film and then you're going to get notes from the director and the producers, more of this, less of that, cut this scene, put this there.
And that takes a lot of compute. They were able to speak publicly about AMD's role in it last September at SIGGRAPH in Los Angeles. The film was rendered on 156,000 cores of AMD EPYC CPUs, which, I mean, how do you gauge that? Is that good? Is that a lot? It is and it isn't, because it was all in their own data center.
It was more cores and threads than they'd ever had in a fairly small place. So they were able to render and re-render and iterate on that big piece of art without bursting to the cloud. That was a never-been-done-before type of thing, which was great to be a part of. And I love, too, that it's nominated for Best Animated Feature, which is fantastic as well.
I think also, we showed something at that event, the Pixar RenderMan Arts and Science Fair. Each year at SIGGRAPH, Pixar hosts an evening where they do some fun lectures for the visual effects community to show them what they've done and where things are headed. And what they showed, too, goes back to when they started Elemental.
I might get this wrong, but they started it three or four years ago. When you start a film, it's locked into the technology purchased at that time, and it's locked into the version of the software at that time. You can understand why they would do that, because it mitigates risk: if you introduce another piece of equipment or another version of a piece of software, it could derail the production, so they lock it in.
So they showed a side-by-side: a frame of Elemental, their most ambitious film ever, next to that same frame rendered with the newest version of RenderMan and the newest generation of AMD CPUs. What took three hours to render one frame with the previous generation of AMD tech and the two previous versions of RenderMan now took three minutes. Three hours down to three minutes.
Isn't that insane? That's an insane leap in performance. But it was also because they had rewritten, or augmented, their software to scale with the cores and threads. You can think of cores and threads as mini computers inside of a CPU. If I was smart, I would have brought one with me to show you.
You know, just think of a metal plate that looks like this. And you can find an image. Yeah. But now their RenderMan software scales, as we say, linearly with whatever cores and threads are in that system. So the leap was largely due to that as well.
But that gives you a good example of the evolution of technology and how quickly we are iterating. And I think, too, it's great for the artist. Imagine if Leonardo da Vinci could have spent more time on the Sistine Chapel. You know, if you have more time with the art, it's going to equal better art.
And I think that we've become a big part of helping artists create better art, because we give them more time with it.
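The three-hours-to-three-minutes jump combines faster silicon with software rewritten to scale across cores. A toy way to reason about that scaling is Amdahl's law; the core count, base render time, and parallel fractions below are illustrative assumptions, not Pixar's actual numbers.

```python
def render_time(single_core_seconds, cores, parallel_fraction):
    # Amdahl's law: only the parallelizable share of the work speeds up
    # as cores are added; the serial share stays fixed.
    serial = single_core_seconds * (1.0 - parallel_fraction)
    parallel = single_core_seconds * parallel_fraction / cores
    return serial + parallel

base = 3 * 60 * 60  # a frame that takes 3 hours on a single core (illustrative)

# Near-perfect linear scaling, as described for the retooled RenderMan:
print(render_time(base, 64, 1.00) / 60)  # ~2.8 minutes on 64 cores

# With even 5% serial work, the same 64 cores buy far less:
print(render_time(base, 64, 0.95) / 60)  # ~11.7 minutes
```

Even a small serial fraction caps the speedup, which is why restructuring the renderer to scale "linearly with whatever cores and threads are in that system" mattered as much as the new CPUs did.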
Yeah. I actually had a question similar to this. With how rapidly technology keeps getting faster and better, and Pixar always pushing the envelope with each film of what's technologically possible:
if you were to buy an off-the-shelf computer, or build your own, you know, with an AMD Threadripper or something, what Pixar film from the past do you feel like some person in their bedroom could make today? Not having a render farm, just having a regular PC.
I reckon you could do Toy Story. You could do Toy Story on it. If you get a- Toy Story for sure, yeah. Yeah, Toy Story, maybe Toy Story 2, I don't know. Sorry, Ash Brannon, if you're listening. Ash Brannon is an acquaintance of mine; he directed Toy Story 2. We've often thought about that, and in some of our conversations, maybe even publicly, I think Pixar might've said something about how you could render Toy Story 1 or 2
on just one of these systems now. It would also be interesting to go back and watch Toy Story 1, because it might look like previs now. I don't know.
I remember when 4 came out and all those comparison images of an animal were coming out: they were showing a cat from 1, and it was just a very hard-surfaced kind of cat,
and then the cat from 4 that looked very realistic, with the light and the fur and everything.
Look at the fur in the first ones. If there were any fur or characters with hair, it wouldn't have moved. It was like a helmet. Like my hair, like a helmet.
Yeah, like a
Matted, matted fur. My head doesn't move. Yeah.
Yeah, yeah. But yeah, now it flows, because it would have been very, very expensive back then. I actually have a Toy Story poster. Where we're based in L.A., RFX, is where everybody comes to discover new technology. Ray Feeney, he's the person that brought Silicon Graphics into film back in the 80s.
So this is where most of Hollywood comes to discover new technology, and if I turn my camera around, you can see there's a Toy Story poster in the middle of the frame, but you can also see Star Wars over there, Apocalypse Now, and they're all signed by the original actors. Yeah. And you can also see down there, we have all these different systems. This is where we're figuring things out.
This is what I mean: we're not just selling CPUs and GPUs into Hollywood. We're figuring out with the studios what works and what doesn't work. So we're not just a tech provider; we're also tech advisors, which I think is what the industry needs.
Last one. I saw this in an article, and I don't really know what it means, so I want you to explain it.
You mentioned that AMD is now in the FPGA space: field programmable gate arrays are in all RED cameras and all Arri Alexa cameras, doing processing in camera to help speed things up and take work away from post. Can you explain what is going on here and what exactly this is?
So yeah, an FPGA is a chip that has a single purpose, and it's bespoke for the customer. For Arri and RED, for example, it's the brain in the camera, and it has one purpose: in that case, processing the imagery. I'm not an engineer, full disclosure, but from the bird's-eye view, that's what an FPGA is: it's about efficiency.
It has a single purpose. In our Alveo cards, it does up-resing and down-resing in real time and has AI right on the chip. And so we work with the camera companies like RED, who's just down the road actually (RED are two blocks over), and we're particularly close with them. Because it's our technology in those cameras, it makes it endlessly easier to do things in real time and also in post-production. So as something's shooting, it can also be uploading that footage in real time to the cloud, onto AMD CPUs, and we can start post-processing, because, as you know, something might be edited all over the world. It also opens up capabilities: if it's our technology inside the camera, think of the virtual production possibilities of
pulling something out of a frame, putting something into it; those are things you can play with in virtual production if you're shooting with a RED or an Arri. So maybe, and this is just hypothetical, but these are things you could do with a green screen: say the camera is running at 48 frames a second or 120 frames a second (these are just arbitrary numbers), and every third frame
could just be green. Therefore you could do different things in the frame, or edit in post-production. On one hard drive, you could have a green screen pass and an in-front-of-the-background pass without reshooting anything. So there are all sorts of different capabilities from us being the brains of those cameras. I kind of answered your question, right?
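The hypothetical James describes, where every few captured frames one is a green/matte frame, amounts to demultiplexing one high-frame-rate stream into two passes. A minimal sketch, with frames as placeholder strings and the every-third-frame pattern taken as an assumption from his arbitrary numbers:

```python
def split_interleaved(frames, cycle=3, matte_slot=2):
    """Demultiplex an interleaved capture: the frame at position
    `matte_slot` within each `cycle`-frame group is the green/matte
    frame; the rest form the beauty pass."""
    beauty, matte = [], []
    for i, frame in enumerate(frames):
        (matte if i % cycle == matte_slot else beauty).append(frame)
    return beauty, matte

# Nine captured frames; f2, f5, f8 are the green frames.
capture = [f"f{i}" for i in range(9)]
beauty, matte = split_interleaved(capture)
print(beauty)  # ['f0', 'f1', 'f3', 'f4', 'f6', 'f7']
print(matte)   # ['f2', 'f5', 'f8']
```

Both passes land on the same drive from a single take, which is the "green screen pass and background pass without reshooting anything" idea.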
Yeah, and I think that's just something that wasn't quite on my radar. I mean, A, I did not realize that there were AMD chips in the cameras, and B, that we have this possibility of doing processing, like comping or post work, as we're filming, which obviously speeds up a lot of things and opens up a lot of possibilities.
It's a field programmable gate array; that's what FPGA stands for. It's basically a chip that can be programmed for a purpose, and it's typically done bespoke for a customer; it's uniquely done. But yeah, I think that's part of the reason why we acquired the company: all these use cases. Media is used all over the world.
It's not just media; it's media and entertainment, yes. But media: you've got Kaiser Permanente, you've got BP Oil, you've got, you know, Burberry. I'm just picking random brands; they all use media in some form or fashion. Accident scene re-creation, I should say: re-creation, not recreation. Nighthawk or something.
You know what I'm saying? Yeah, but take media out of media and entertainment: it has much wider use than just entertaining people; it has practical use cases.
Yeah. Imaging and imagery have wide use outside of just Hollywood. Yeah, yeah. Well, I appreciate the time.
Was there anything else we didn't cover that was worth mentioning, on AMD's radar or AI or anything in general?
Well, watch this space. But I'd love to say: don't just get your opinion on AI and where it's headed from listening to someone else's opinion, right? Go and read about it on multiple different sites, magazines, whatever.
Don't just get your view of AI from one source. Get it from multiple, because it may not be what you think it is. You know? And it's going to help you to understand. Everybody should seek to understand when it comes to AI, and not do that from just one source.
Yeah. I think the best way is to just get your hands in it and mess around with it a little bit.
Yes. Exactly. Yeah. It's lovely to talk to you, Joey. Thank you so much for having me.
Yeah. Likewise, James. I appreciate it. Thanks a lot. All right. Cheers. And that is it. Thanks a lot for watching, and thanks again to James for coming on the podcast.
If you enjoyed this episode, please give it a thumbs up. And for more stuff like this, be sure to subscribe both to the YouTube channel and to the VP Land newsletter, vp-land.com, to get our newsletter twice a week in your inbox with all sorts of stuff. Thanks for watching. I will catch you in the next episode.