VP Land

Breaking Things to Learn: Inside RIT's VP Research & Education

April 11, 2024 New Territory Media Season 3 Episode 13

In this episode of VP Land, Aaron Gordon from Synapse VP and David Long from RIT dive into the current state and future of virtual production. Learn about the challenges of LED stage design, how they're teaching virtual production, and how emerging tech like AI will impact filmmaking.

RIT | Certified
Get Certified in Virtual Production 
http://ntm.link/RITcertified

📧 GET THE VP LAND NEWSLETTER 
Subscribe for free for the latest news and BTS insights on video creation 2-3x a week: 
https://ntm.link/vp_land

📺 MORE VP LAND EPISODES

This indie sci-fi film used miniatures
https://youtu.be/n9FnPxlra-I

Inside the ZEISS CinCraft Scenario & Nano Primes
https://youtu.be/4nnkkR34WwY

This music video tested out every new piece of filmmaking gear
https://youtu.be/vFFqiyU2wUY


Connect with David @ RIT:
Website - https://www.rit.edu/magic
Twitter - https://twitter.com/RITMAGIC
YouTube - https://www.youtube.com/@ritmagiccenter
Facebook - https://www.facebook.com/ritmagic
David @ LinkedIn - https://www.linkedin.com/in/davidllong

Connect with Aaron @ Synapse VP:
Website - https://www.synapsevp.com
LinkedIn - https://www.linkedin.com/company/synapse-vp
Aaron @ LinkedIn - https://www.linkedin.com/in/aaron-gordon-os


David Long is Director of the RIT Center for Media, Arts, Games, Interaction & Creativity (MAGIC) and MAGIC Spell Studios. David joined the faculty of the School of Film and Animation at Rochester Institute of Technology in 2007, where he also serves as an Associate Professor in the motion picture science program. His research interests at RIT include engineering multispectral video capture and display systems, studying variability in human color vision for artistic applications, and modeling sensory perception of motion media.

Aaron Gordon is the founder and CEO of Optic Sky Productions, a forward-looking technology innovation firm. Aaron built a global team of award-winning directors, digital artists, storytellers, producers, and creative technologists that has achieved double- and triple-digit growth each year. He has also dedicated much of his focus to education, helping nurture the next generation of filmmakers by positioning them in the bleeding-edge technologies our industry is undoubtedly moving toward.

Toward that end, Optic Sky partnered with Rochester Institute of Technology to create one of the first virtual production curricula in the country.

#############

📝 SHOW NOTES & SOURCES

RIT MAGIC Center
https://www.rit.edu/magic/

Synapse VP
https://www.synapsevp.com/

RIT MAGIC Spell Studios
https://vimeo.com/user172407267

College of Art and Design at RIT
https://www.youtube.com/@CollegeofArtandDesignatRIT

Visit vp-land.com for the complete show notes.

#############

00:00 Intro 
01:48 David Long's backstory and RIT
06:25 Aaron Gordon's backstory
09:07 Designing RIT's VP curriculum
12:35 The Synapse VP stage
17:59 Transferable skills in VP
19:07 RIT's immersion course and curriculum
24:46 Technical skills for cinematographers
28:38 Scanning and photogrammetry
31:25 Color challenges in VP
36:12 AR/VR in virtual production 
41:20 Photogrammetry, NeRF, and 3D scanning
58:35 AI in virtual production
01:05:25 Outro

Transcript


David Long:

This goes back to the very beginnings of color imaging. The minute we were able to get away from monochromatic, get away from black and white, we've done this dance between color accurate, color preferred, and color embellished. When the physics of VP intrude upon your creativity and bound you a bit, it's frustrating.

Joey Daoud:

Welcome to VP Land, the podcast where we dive inside the tools, projects, and people that are changing the way we are making movies. I am Joey Daoud, your host. In this episode, we chat with Aaron Gordon from Synapse VP and David Long from the Rochester Institute of Technology, or RIT.

Aaron Gordon:

We reached a point where improving one is not necessarily the sacrifice of the other. You can get a stage where, if the pixel pitch is extremely small, the processing power has to be way, way higher, exponentially higher. But if the color science gets better, it doesn't mean the processing power has to get exponentially higher.

Joey Daoud:

We dive into the current state of virtual production, from the evolution of LED panels to the challenges of balancing pixel pitch, color output, and processing power.

David Long:

Panels are more and more being designed to be photographed directly. But remember, this tech was supposed to be outdoor billboards.

Joey Daoud:

We also explore the future of virtual production assets and pipelines, and how they intersect with other mediums.

Aaron Gordon:

Today we have stock footage; tomorrow we're gonna have stock 3D assets. I always tell people I think we're in a 2D asset pipeline world right now, and it's transitioning to a 3D asset pipeline.

Joey Daoud:

Plus we talk about how they're training the future creators in virtual production, and a whole lot more. Links for everything we talk about are available in the YouTube description or in the show notes. And be sure to subscribe to the VP Land newsletter to stay ahead of the latest tech changing the way we're making movies. Just go to vp-land.com. And now let's dive into my chat with Aaron and David. David, Aaron, thanks for joining. Appreciate it. I would love to get a bit of backstory for each of you and how you came into virtual production. So let's start with David, if you want to give your story first, and then we'll go to Aaron.

David Long:

Yeah, sure. So, long, long ago, I was a research engineer at Eastman Kodak, so I've been in the motion picture biz on the tech side for over 25 years now. I spent 10 years working on camera origination films, digital intermediate systems, and scanning technologies, and parlayed that into an interest in getting back to academia and starting an engineering program at the School of Film and Animation at RIT, which is where I met Aaron. He was one of the students early on in my career. So that was 2007, and our program really focuses on preparing technicians and engineers, the technology and services side of production and post-production, and we've had great success in our graduates going out and contributing in that space alongside our fabulous creative students who go out and get to contribute their talents. The affinity to virtual production for me really began as a research pursuit. I've been inspired by Rob Legato and others who were pioneering performance capture systems and virtual camera systems and using real-time rendering as previsualization. But you could see that it was very, very quickly going to end up in live in-camera visual effects. And as an engineer, of course, I'm hugely intrigued by that. My background is in imaging science, image systems engineering. So not so much the graphics layer traditionally, not so much the render layer, but absolutely how the camera interacts with that imagery, and all of the attributes therein in creating a beautiful cinematic image. So there are two ways we do virtual production at a university like RIT: we're either looking to prepare our students to go out and be competent and capable in the field when they graduate, or we're looking to contribute to fundamental technical problems from a research perspective. So we jumped headfirst into both.
We have attempted to find great partnerships with groups that have interesting technical challenges, but also very intentionally worked with Aaron and his team back in 2021 to stand up curriculum in virtual production, so the students in the School of Film and Animation at RIT, both the engineering and tech students as well as our fine art filmmaking students, could be completely capable, jumping into the industry, contributing immediately, and hopefully actually entering at the forefront of expertise and knowledge. Aaron and his team were instrumental in that. So that's my long backstory into VP, where we are now.

Joey Daoud:

I feel like that might've been a short version of everything you've been in. Yeah. I kind of want to dive into the curriculum a little bit more, but I'm curious, high level: are you running the virtual production program as a separate program, or is this part of, like, if someone wanted to go to "film school" and also learn about cinematography and stuff like that? Is it one program? Separate programs? How are you handling that?

David Long:

Yeah. So there's actually three prongs; there's three answers to that. It all began immediately with preparing coursework that was accessible to enrolled students at RIT. And those students come from across backgrounds, again, from engineering and imaging science up through cinematography and filmmaking, but also 3D designers, graphic artists, and others who would play in the virtual art department space. And that coursework is of course a focus track or an elective track for students here, so it would be part of the aptitude they could gain while earning their degree. However, that curriculum was developed in support with Epic Games. We partnered with Aaron and some other alums of RIT, notably The Third Floor, with Chris Edwards, The Third Floor's founder, who's an alum, and we took the experience of that curriculum and we actually published publicly the lessons of that curriculum's development. That was Epic's interest: that we as a proper film school could begin educating other film schools in how to deploy this curriculum. So the second opportunity is through other partners who've latched on to that dissemination, that information we've contributed in that capacity. But the third, which is the next phase of excitement here, with Aaron joining us as well, is taking that education out to the professional industry, taking it out to folks who are current participants: upskill services, certification services, bringing that curriculum and our experience with it out to allow practitioners in the industry to gain some skill and thus gain some access into those workflows, into those sets that are adopting this technology.

Joey Daoud:

Nice. And yeah, we'll talk about the immersion course in a sec, but Aaron, you want to jump off from there? So you were a student at RIT, and then what happened?

Aaron Gordon:

Man, so many things, you know. But luckily, and I say this all the time, I think RIT has kind of been there pretty much every step of my journey as an entrepreneur, even since school. I left school and started a live-action production company really focused on advertising called Optic Sky Productions. And for its first few years, it was very much traditional live-action production: broadcast commercials, online commercials. And, you know, we got the bug to try some new stuff out. We were trying to even out the offerings. We had gotten into post, but we really wanted to figure out different offerings that were going to keep us on our toes. And early on we got into augmented reality and virtual reality. And when people talk about luck moments in their life, I think the number one luck moment was, I was actually at the creative fair at RIT, had this idea in my head, knew what we wanted to do, and we ended up hiring pretty much on the spot two students that had done their thesis film. It was a VR film, and it was really gorgeous, and they did it in Unreal Engine. At the time, what people don't recognize is that wasn't necessarily an obvious choice, because a lot of people were going head over heels for Unity at the time. Unity was just going public; there was a lot of buzz around Unity. But we were really lucky, in retrospect, that we made that choice with those students, who happened to be comfortable with that software. So as a company, we were like, well, if these are the students we're hiring, this is the pipeline that we'll do. This is what they're familiar with. And we really pursued a lot of VR and augmented reality early in the space with Unreal Engine. And then what ended up happening, of course, starting out as a live-action production company...
We started to hear some of the buzz that was happening around the potential of virtual production, and we were really lucky that we got this call from David. He's like, I got this Epic MegaGrant, we want to do this incredible thing. And we really had kind of dove into it a little, ideated, right? But like anything else, it's best when there's an application, and there was a metric of success. There was an application, and it was kind of like, dive in early. And, you know, as David's crew likes to say, break things to learn things and fix things, right? It was basically ground zero, R&D, all of us really enjoying doing that. And it culminated, of course, in doing that, then figuring out some best practices, then teaching it. This was before there were any standards; this was before there were any guides that existed.

Joey Daoud:

Do you remember roughly what year this was?

Aaron Gordon:

Was this 2020, 2021?

David Long:

This was prep in 2020. We were notified by Epic, about a month into everyone getting sent home for the pandemic, that we had earned the MegaGrant. And we somehow still returned to campus, students masked up, and delivered the very first course in the spring semester of '21. So we're coming up on the three-year anniversary of doing this live with students.

Joey Daoud:

Okay, and figuring it out as you're going along. This is 2020, around the time The Mandalorian comes out and virtual production starts getting on everyone's radar.

David Long:

Absolutely.

Aaron Gordon:

Yeah, exactly. And what's crazy about the time too, if you think about it this way, is there was no maturity level to it yet, right? I mean, that was such R&D. You even look in retrospect and hear all the stories about The Mandalorian. Like, everyone who breaks ground, you hear about the ground they broke; you don't hear about the tools that got broken. So it was a really interesting time, because they did such a fantastic thing for the whole market, but there were no standards, because, if you hear interviews with the people involved, they were figuring it out as they went, right? There was no established scenario there. So the kind of insane challenge that David had brought to his team and to our team was to actually establish some standard via a curriculum at a time when there really wasn't one. And while live broadcast has long had a lot of standards that are very helpful in this technology, in a lot of ways live broadcast deployment is a lot more complex than what cinema needs for this, for ICVFX. But it definitely was one of those big moments where you go, wow, we're really lucky that we know the engine really well, because we've been developing in it for two years now, and this is just a new application of that. And on the other hand of it, we're going, we're even luckier, because this is a tool that's going to help the live-action side of the company. And what's really crazy about the evolution from there to today was, because of all that, we got this random call one day from a bunch of guys in California, through a mutual contact in Rochester that is now one of my business partners, Geoff Knight. And they were like, hey, we want to build one of these things in the heart of Hollywood. We were like, oh, cool, that sounds awesome.
We had no idea of the scope of the project at the time. And then we get over there, and it was, at the time, really a large stage unlike a lot of the design elements that people had seen, mostly because a lot of carbon copies of the Mandalorian-style stage were going around. So we got a really cool chance to be a part of that. But then what ended up happening was it just became a bigger and bigger vision of what that could be, rather than just a single stage, rather than just this or that. There was kind of this holistic approach to it, and it just had to become its own thing. It was such a big idea, and so I was really lucky, and still grateful to this day, that I met the right people at the right time, and six partners came together, all from really awesome different backgrounds, to form what is now Synapse Virtual Production. And even now, the best part is, again, full circle: we have Synapse that's doing really, really well in LA, we're starting to plan some of our build-outs in other locations as well, and we get a call from RIT and they're like, hey, we want to now do this certification program. And we were like, yes, absolutely. So it really does come full circle with the whole RIT relationship.

Joey Daoud:

Yeah. That's awesome. What, uh, so when you were designing, you were thinking of the Synapse stage, what were some learnings or what were some things that you differently in that stage or like, what did you. Or like different scenarios, like what were some of the things that you kind of thought about when, from what you learned into building that stage?

Aaron Gordon:

So there are so many that it's hard to do the short list, but I'll try, because it really was to the benefit of all the partners. One of the partners is Christopher Probst, ASC, who's an incredible cinematographer; Rich Lee, the director; Justin Diner, an incredible producer and EP who had just produced This Is Me Now, which just came out on Prime, with a very VFX-heavy background, and who had a post house for a while as well. I come, obviously, from advertising, Geoff comes from live broadcast, huge e-sports live broadcast, and Duke came from hydraulics. So there was just a crazy room of people thinking about all these things. And I think the biggest conclusion we came to was: okay, no one's asking why The Mandalorian built out what they built out. They're just saying this is a cool new technology. But the hardware setup is a totally different application of the software pipeline, right? The software pipeline with the hardware enabling it was amazing, but the actual design of the stage itself was created for one purpose and one purpose only: a guy was wearing a chrome helmet, there were reflections in every direction, and they needed to cover as many of those reflections as practically possible. And so when we were thinking about the stage, there were a couple of other things that you don't think about in that show. The easiest one is sound. That stage is an echo chamber, but everything on that show is ADR, because he's in a chrome helmet and you can't see his face. And a lot of carbon copies popping up around that time were echo chambers too, but no filmmaker wants to walk in and be told, you can't do sound in here. And I think that became kind of our thing. Yeah, the MOS stage, exactly. The VP, ICVFX, MOS, you know, however many letters you can do.
But we were basically like, okay, how do you build a stage where a filmmaker isn't being told they can't do something? And I think that was the premise of asking all the right questions. So we changed the shape of the stage. And unlike what a lot of people were doing at the time, we were at the right time of technological development; three years made a big difference. There were new LED products coming out into the market that had better viewing angles than the products used on those seasons of the shows, that had better color science and color pipeline capabilities, that had better energy input and output, way more green, and a lot fewer heat problems. And so not only did we change the design of the stage, we went against the grain of the market, which was carbon copying a lot of what it saw, in our choice of product. And the biggest thing that we became known for on the beta stage at Sunset Las Palmas was our seamless ceiling-to-wall transition. A lot of people at the time were using outside-in camera tracking, from companies like OptiTrack and others. They're incredible systems, but the problem is you don't really achieve final pixel on stage when you have to roto out every single one of those cameras every time you want to look up. And when you're talking about film and TV, versus maybe commercial work not as much, especially cinema, a lot of people love their wide-angle anamorphic lenses, right? And so the second you want to look up, it's a hero shot; you're seeing a huge background behind them, which means you're going to see that seam.
And so what we actually ended up doing was we did the same product on the ceiling as we did on the wall, and we created a seamless gap, so that through the camera, when you actually look up, you can shoot up at final pixel, and you're not worried about having to roto things out the whole time, all the pains that come in post. So those are some of the design aspects that changed a lot of what people were thinking at the time. The other aspect of it, though, was that it was a really good time because standards were coming together, considering efforts that were happening at institutions like RIT and some other professional efforts in the market. There was the realization that what was 30 people maybe isn't 30 people anymore, except on very specific applications. And so it was also: how is the pipeline built to staff up in a way that isn't quite the same as it was before? Because if you think about the engine, not to trail off too much, but the amount of software updates that have happened since The Mandalorian started, and they weren't even on Unreal Engine when it started, by the way; they were on proprietary software. But the engine itself: when we first worked at RIT on this stuff, the stuff we had to manually code at the time is now a button. So if you think about it that way, all the things that were like, oh, I have to code a huge thing to make this one thing happen, that's now just a button that you click. Those are people's jobs, all those tools that weren't standardized, that weren't in there. And so we came in, I think, right around the right time, where we go: we can build it for what it's going to be, not for what it was.
That also allows a startup to build faster, rather than having to staff up 30 people just to do one production well. So that was another thing that we had to rethink in terms of pipelines, while still making sure that we're servicing cinema quality, because that's where a lot of the clientele for us has been. So yeah, those are, I think, just a few of the things, and they've been really, really fun. And then we have our newest flagship stage over at Los Angeles Center Studios; we learned a lot of lessons from the beta stage and we just keep improving.

Joey Daoud:

Yeah, that's awesome. So you were saying with the teams and stuff, something that maybe a few years ago might have taken a team of 30 people to run Unreal or to run the volume takes fewer people today, just because workflows have been worked out, software's improved, stuff like that.

Aaron Gordon:

Don't get me wrong, the show is going to dictate the actual crew size. But in terms of what the general trend has been: the crew can shrink. And I think part of that promise of flexibility and control for the director or DP in virtual production isn't just about what they can do, it's about how fast they can do it. What did take a lot more people to do fast now takes a lot fewer people to do. In fact, sometimes that redundancy of having more people is not helpful, and you're seeing a lot of movement toward fewer people being able to operate this stuff a lot more seamlessly.

Joey Daoud:

Let's jump into the immersion course you've got going on. I'm kind of curious what the curriculum is, and that's a good branching-off point for more questions. But yeah, tell me about the immersion course, what the curriculum is, and how you whittled down what you should focus on if you want to get into virtual production.

David Long:

Sure. Yeah, scienceLove We've obviously not been doing it for a decade. The workflow has not existed for a decade. So we've been proving here on campus the touch points that are most important to make the most capable, immediately capable graduates to jump out and be able to contribute to these workflows. So for us, we break down education and virtual production into its three core components. I think others in the industry think of these three core as well. You've got the virtual art department, your creatives who are responsible for authoring an engine, creating the environment, mapping the environment, considering how that is going to convey in context of the real life. Action, props, uh, set decoration in front. Next you've got volume control. This is engineering heavy, right? This, this is my bread and butter. This is stuff I love as an imaging scientist and color scientist. I'm thinking about color management. I'm thinking about even some of the stuff Aaron was alluding to. Uh, panels are more and more being designed to be photographed directly. But remember this tech was supposed to be outdoor billboards, right? That's where it started. And then of course there's the great projector, uh, supplanting theme we all see, eventually these direct view LEDs are going to be movie theaters. They're not, we're not going to have projectors any longer. So they're getting better, but they were never meant to be photographed. And so you've got all kinds of angular dependencies and weirdness and you volume control people have to know. How to operate kind of core theory, but they also have to be disappointed on the edges of capability and be intrigued to investigate and engineer new solutions and go work at vendors who are engineering new solutions. So we've been intentional to try to train people with that mentality as well. Uh, not just the operating perspective, but the engineering that the operators rely on and where the strengths and weaknesses there lie. 
And then of course the last of the three segments is creative. You sit down with storytellers. I think VP has matured to the point that creatives absolutely understand it's a fabulous tool in some cases and it's absolutely atrocious and inappropriate in others. And it's not different from the dawn of chroma key VFX, um, you know, CG heavy films where You had the early adopters. You wanted to make the entire thing through that workflow. And then you realize this isn't serving storytelling, right? This is, this is not as photographic as I want. This is not, uh, laid out in a, um, in a, in a temporal sequence like I want. The, the environments aren't what I want. And so. We spend a lot of time with our cinematography students, with lighting students, with art designers, really thinking through, okay, what is a VP shot and what is an on location shot? Or what is even a traditional soundstage shot? And, uh, thus communicating that, that creative element. So those three pieces are the core of what we've been exploring and trying to balance properly. Now we, we throw in, uh, auxiliary workflow elements into that. Our curriculum is really heavy on pre visualization, specifically because there is no engine learning like learning pre visualization, right? If you can craft a cinematic shot in Unreal Engine in a pre vis session, you're going to have most of the basics down for then operating that with a virtual production. and driving properly through volume control. Obviously, when you're on a stage and there's a big lit up wall and you have to geometrically know where it is and you have to track the camera and that stuff, it's a little different. But considering the shot, pre visualizing the shot really exposes you to engine and that's an important element. And then in volume control, rounding out with a lot of intentional attention to Engineering and science topics. 
So we teach film students and engineering students alike how to properly color characterize walls, and we talk about optical distortions that are common in VP, right? We all know that if you're focused directly on the wall and you have a lower pitch or a lower resolution on that wall, you're going to get tremendous aliasing and moiré artifacts, so you have to be conscientious about that. The color topic and the moiré topic both feed into creative, right? Okay, there are going to be limitations in what and how you can shoot, and if you don't expose the students to the bounds of those operating parameters, they may not be properly prepared to be efficient when they go out there. Aaron alluded to it earlier: we have a big slogan in our studio research lab here on campus. It says we learn by making things. But as an engineer, I've replaced the making with breaking, because I learn a heck of a lot more when we break things. So this curriculum that we designed forces the creative students to actually pick up things like colorimeters and goniometers, which basically determine how the light falls off as a function of viewing angle. If you're moving a camera around a set, that doesn't happen in real life, and it's pretty annoying when it happens on a VP wall, so you have to anticipate it. They're forced to do that. Similarly, the engineering students and the tech students are watching the creatives really try to push and understand what's going to work and what's not going to work. So that intentional intersection is what's key to us. Another big theme here: RIT is extremely proud of what we call TAD, which is Technology, Art, and Design, the intersection of those, and I think you see that pervasive in the curriculum. All that said, that character, that personality, is exactly what we're bringing into the immersive course out in Los Angeles with Aaron and the Synapse crew. 
Taking those lessons of the three years here, we're teaching it intentionally out to that audience.
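The goniometer measurement David mentions, light falloff as a function of viewing angle, can be sketched numerically. A minimal illustration, assuming a simple cos^n falloff model with a made-up exponent; real panels only approximate this, and the exponent would be fit from measured goniometer data:

```python
import math

def relative_luminance(view_angle_deg: float, n: float = 2.0) -> float:
    """Relative LED panel luminance versus off-axis viewing angle.

    Models falloff as cos(theta)^n; n is a hypothetical fit
    parameter a goniometer sweep would provide.
    """
    theta = math.radians(view_angle_deg)
    return math.cos(theta) ** n

# A camera dollying from head-on (0 deg) to 60 deg off-axis sees the
# same wall content dim, even though nothing changed in engine:
for angle in (0, 30, 60):
    print(f"{angle:2d} deg -> {relative_luminance(angle):.2f}x brightness")
```

This is exactly the "angular dependency" that doesn't exist when shooting a real location: a practical set doesn't dim as the dolly arcs around it, so volume control has to anticipate and compensate for it.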

Joey Daoud:

Going down the technical route, it makes me wonder. I'm thinking back to film school, where you're exposed to every role, and a lot of that is not to be an expert in every role at the end of the day, but at least to be aware of everything that's happening so you can communicate better. Is that sort of the case with this? I guess it depends on whether you have a dream job where it's like, yes, I want to engineer and manufacture the stages or operate them. But if you're like, okay, I'm a cinematographer, what's the amount of additional technical knowledge you need to be a cinematographer, versus just being aware of what's happening so you can communicate with the person who is actually doing that?

David Long:

Yeah. I mean, it's a continuum, right? And I'm curious to have Aaron weigh in; he's got proper training in cinematography himself. But the cinematographers, the DPs, to someone like me who's an engineer, are definitely our partner on the creative team who knows the most, and is required to know the most, across that spectrum. They come the closest to us, and so we look forward to that overlap in, say, how we teach it or how we present it. The beautiful part is the DP or the cinematographer is conversant with creatives who are much less technically astute, right? You've got people who have pure creative responsibility and are fabulous contributors in their domain, but if you try to teach them color science, or if you try to explain, hey, do you see why that zebra-ish moiré pattern is showing up on the screen right now? That's bad, and here are the equations which define why. The DPs in our business say, no, no, no, I want to see those equations. I want to understand it so I can operate up to that limit but not approach it. The other people are like, can you just make that stop? It looks gross. What do you have to do? So, being in a domain where that spectrum is represented and there's so much overlap, I think people naturally gravitate toward their capability and their intrigue, right? They're going to engage with a side of that spectrum that is smarter than them in a certain topic and really latch on to that. And when we teach it, and that's absolutely going to be the feel of the course with Synapse when we go out to LA and offer it, we're not going to send away people who aren't technically trained enough to understand things, right? We're instead going to pair them up. 
That's been another tactic here in our instruction: to always pair up a creative student with an engineering student, so that they can work on their communication, on the limits of their mutual understanding, and on how they have to converse with one another. I mean, we all see that in actual production, right? So many departments don't understand: why are we waiting? We're ready to go. Why does that department have the power to say we can't go, right? The actor wants to come out, we're ready, and I don't know what you do, but why are you holding me up? If there's a little more appreciation for the input that department is representing, and the correction they're trying to make or the quality they're trying to ensure is present in that shot, then there's a little more respect across the full team. But you can't get that unless you're living it in simulation, if you will, and that's a great thing about learning in classroom settings. This isn't for real, right? This is simulation. This is your chance to ask stupid questions, to look silly, and to right away reveal that you're not an engineer. But it's okay, because you're going to have a lot of people around you who are intrigued to try to teach it to you in a way that you can fathom and understand and take away the lesson in it. By not segmenting people, by not keeping VAD off in their little lab and keeping the DPs off on the side and having the engineers off in their corner, they're all overlapping intentionally and communicating with one another.

Aaron Gordon:

Yeah, David, I can hit on that point about pairing up. This is really personal to me, because I'm a prime example of someone who was at RIT and was not trained to be technical. I didn't even go to college to be behind the camera; I thought I'd be an animator, and like many kids that go to college, I totally changed my mind five times. But part of what happened to me there was I ended up pairing up with a bunch of the motion picture science students, because I just wanted to learn. This was when the D-21 era was kind of coming into play; it was a research camera that RIT had one of, and it wasn't even an Arri Alexa yet. So it was early digital cinema hitting its stride. At the time, I couldn't tell you the first thing about how that sensor worked, couldn't tell you the first thing about why it was making a prettier picture than other things, until I took a class with David, actually, funny story. But the students coming out of that program could, and so you would pair up. It was a really great partnership, and I got really close with a couple of the students from that program at the time, who I took on every set with me. I was busy just trying to be creative and shoot, and I would actually call myself a less technical DP and more of a gut-instinct cinematographer at the time. Then, as I went into the professional world, because I'd been around technical people so long, I made the switch and became a DIT for a long time, and then worked my way back into cinematography in the real world. I think that type of fostering, that pairing up, is in a lot of ways what makes RIT, RIT, versus other institutions. It's the type of 
learning, to David's point, that we want to foster for the workshop here. These are working professionals that are going to be doing this, so it's very likely that some people are coming in from a previous background, some people have never touched the thing, other people are pure artists, and other people are technicians. It's not for any of us, I think, to say, here's what you're going to do with this when you walk out of here. It's just to give them enough context to know why they're doing this new thing. And hopefully there's a really cool mix here between the technical aspects that David's bringing into this, versus some of the workshops that I've seen, which are a little more high level and don't necessarily get into some of the why. There's a lot of why in this, and by doing that, it makes the exercises we're actually doing a lot more powerful. That excites me a lot, because on a personal level, that's what helped me in my career: having that duality and that partnership between technical people and artists who know nothing better, in the same room. Hopefully we get to foster that together.

Joey Daoud:

Yeah, and I think that's a good explanation. Kind of going back to breaking things or making things: how much of the course is hands on, experimenting in a volume, and how much is, I don't want to say classroom, but theoretical or lecture-style learning?

David Long:

No, no, and I'm glad you asked. It is almost entirely experiential, right? I mean, there's some stuff you have to communicate in a classroom setting. It's a bit cliché at this point, but it is effective: we do embrace the flipped classroom model. So if you're literally learning how to open up a piece of software, or being shown how to walk up to a camera, turn it on, and gain experience controlling it, you're going to do some of that before you're in the workshops with us live, in studio or on stage, so you can have a bit of proficiency. But then we're also going to have a lot of studio creative time and critique time. So, for example, an early exercise to help you master the cinematic interface in something like Unreal Engine is to do a previsualization sequence. You've got to figure out how to control that virtual camera. You've got to learn the filmback. You've got to learn the lens parameters. You've got to learn the different things in the render that help convey photographic sensibility through the eyes of a cinematographer. Similarly, you've got to place lights, and those lights have different geometric behavior, different radiometric behavior. And then, oh, by the way, you're going to have some talent, right? You're going to either go to the marketplaces and get a fabulous environment and maybe a bit of a performance capture sequence, or, if you're really bold and have the background, you might try your hand at animating that talent in your previsualization scene. So you're literally going to get exposed to those basic tenets of previsualization, and that translates to mastering your expertise with the tool itself. And you're going to do so in the context of a creative prompt. It's like, we need you to make a previsualization of this, and by the way, we're going to bound this a little bit. 
We have some overview creative that will dictate that everyone's interpreting the same theme creatively, so we're not going to get a horror movie on one part and a preschool animation on the other. We're going to have intentional bounding, but we're going to let people play: with camera angle, edit, timing, movement, lighting variation, and things like that. Because those of us who practice VP know that previsualization is not just so the execs can see the film before the film's even done. There's a lot of tech viz in there too, right? There's a lot of production problem solving happening in that. So by bounding the creative story that we're going to have them tell during the course of the workshop, we're allowing them to show us their creativity and their different perspectives on things, while also translating it directly to activity when we port over to stage. And when we go over to stage, they're operating, they're controlling. They're going to be monitored, right? I promised Aaron we're not going to let any students break anything. No one's going to back a forklift into the volume, but we're at least going to let them try. People have tried. Absolutely. Always, right? That's every sound stage. Always. So, we're going to let them produce a piece by the end. The mantra, again, is you learn by doing. You learn by making. You learn by breaking. 
And so the last couple of days are literally a pre-produced and then produced virtual production segment, so they're not just listening to people talk about it the entire time. They're going to be pointed at it: all right, you're doing art, you're doing camera, you're doing volume control, you're the VAD consult, right? Because when the DP says, oh, you know what, I really wish that building was over there, who do you go to? You go to VAD, because that's the beauty of VP: the building can go over there, and you can immediately, five minutes later, do a second take. So everyone's going to have their chance to practice that role and really produce a piece.
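The filmback and lens parameters David says students have to learn reduce to simple geometry. A hedged sketch: horizontal field of view from sensor width and focal length. The 24.89 mm figure is the classic Super 35 camera aperture width, used here purely as an example filmback:

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view implied by a filmback width and focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A Super 35-style filmback (~24.89 mm wide) with a 35 mm lens:
print(f"{horizontal_fov_deg(24.89, 35.0):.1f} degrees")  # roughly 39 degrees
```

This is why matching the virtual camera's filmback to the real camera matters: the same focal length on a different filmback gives a different field of view, and the previs framing won't translate to the stage.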

Joey Daoud:

Yeah, that's awesome, and thanks for painting that picture. This is a little bit higher level, but you mentioned previz not being just for execs, and earlier you mentioned that sometimes VP might not be the right tool for the shot: it could be green screen, it could be practical, it could be something else. This education seems focused more on crew, DPs, people who want to get into the nitty gritty of operating virtual production. But have either of you been educating, or encountering, more people at the producer or director level, people making the money choices, to educate more on that end? When they're taking a script and trying to figure out how to make it, how do they make those decisions? Should this be virtual production? How do we create this, and what's the best tool for that?

Aaron Gordon:

Yes, I can talk to that a little bit. I think the majority of conversations that we have on a weekly basis are actually more on that. The space is still early. Being in it every day, it's tough to remember that sometimes, but the space really is so early. We just had really incredible, very recognizable directors recently that you could not imagine haven't shot this way yet, but they haven't, because they have their way of filmmaking, and why change that when you have that level of success? And their producers are bringing them in. To David's earlier point, I think DPs get it the fastest. DPs tend to be very technology forward and are always looking for the next toy. I'm guilty myself, I love my toys. But for directors and producers, the part that we have to remember is that all they care about is that they achieve their vision, that they achieve it under budget, or at least on budget, and that the studio, or their team, is happy. And so the two major education points that go along with what you're saying, that are so important to keep happening and that we do all the time: we'll do workshops for entire agencies, or for studio partners, or even for huge teams at streamers that don't already have an internal team established for this, so that, A, their producers understand how to gut check some of these things. There are signs that tell you this is a good idea and signs that don't. There are the efficiencies that happen when 
you're back at a location ten times, versus shooting a one-off where you have to create a whole virtual world for a location you'll never return to, the background is completely out of focus, and it's two blocks away from where you're shooting the last scene. It may not make sense to go into the studio for that. These points sound obvious, but when you're dealing with a huge budget show or film, the intricacies of where you can save a couple million dollars, or lose that same amount of money, are in the details, right? So I think that's a huge education point. For directors there is, and I think for good reason, this hesitation about authenticity: well, you just can't beat shooting on location. Our answer, oftentimes, when we're trying to educate people, is: you're right, you can't, but that's your side of the story. Your producer might be going, we can't afford to shoot in that location. Or you can't necessarily beat the look of that location, but the weather there is going to screw up your entire schedule, or you're not going to get that sundown shot you want, because you're going to have the light for half an hour a day over the span of five weeks, and you're just gambling. And that's where you come here, right? For the control, so you can get that exact sundown shot you're looking for that would otherwise clearly take two weeks to shoot. So it's things like that where the education is going to have to keep happening. And I think the other thing about any new technological space is that every time something goes wrong, the press goes everywhere. When ten things go right, no one hears about it. 
Because when you do VP correctly, you shouldn't see it. That's the whole point. You should not notice it. If it's done very well-

Joey Daoud:

The majority of visual effects, yeah.

Aaron Gordon:

Exactly. It's the same thing visual effects has always suffered from: when they do the job really well, in a lot of ways it goes unrecognized. Not the people; credit-wise, they should definitely be recognized. But the product should go unrecognized as a visual effects sequence. It should just be a really great scene. So I think that is going to continue to be a huge education curve over the next three, maybe four years. We've already made leaps and bounds over the past two or three years in this space. A lot of people are standardizing it. A lot of the streamers are pushing it; we're talking directly with them, and they're literally pushing their producers on multiple slates to make sure this happens, to shoot these things this way, because they understand how to deal with it now. I think the other educational aspect, especially on what's coming: we talked a lot about ICVFX, and we're not covering this necessarily in the certification, but the engine is getting so good at real-time rendering, animation in engine, virtual production. The other educational aspect is getting people to understand what they can actually do in that engine, within those worlds, besides the static virtual world you have behind you: the digital twinning, the motion capture character that you can have live, running around in the world behind you, with someone in a suit performing elsewhere, now at high fidelity. It's things like that where today it is about getting people to understand the basics, and tomorrow it is going to be about getting people to understand just how much you can actually do, especially as the technology improves. 
So for me, that's really, really exciting, and I take a lot of pride in having gone to an institution that pushes the edge of that education, and in getting to do programs like this with RIT continuously, which I know will keep evolving. So yeah.

Joey Daoud:

Yeah. With the new tech you mentioned, and the other example of filming on location, quote unquote, versus, we could just bring the location into a volume: have you been experimenting with, or pushing, new tech like photogrammetry and NeRFs, scanning an actual place and bringing it into a volume or a set? I've interviewed some other people with examples where it's like, yeah, you don't have to do pickup shots; we can just scan the set when we built and filmed it, and then if we have to do pickups later, we can do them in the volume rather than rebuild the whole set. Has some of this newer photorealistic scanning technology been on your radar or been in use?

Aaron Gordon:

Yeah, that was on our radar very early on. Depending on how you scan it, you always have different advantages, right? NeRFs have their advantages in control; photogrammetry has its advantages in fidelity. There are just a lot of different ways to look at it. We've been exploring those a lot. I know David's team has been exploring that stuff a lot and, as always, keeping the R&D strong. I'm really looking forward to some of the maturity on the NeRF side that I'm seeing right now. Today we have stock footage; tomorrow we're going to have stock 3D assets. I always tell people, I think we're in a 2D asset pipeline world right now, and it's transitioning to a 3D asset pipeline. When people wrap their heads around what a 3D asset pipeline world is going to look like across deliverables, the idea of stock completely changes. So, to your point about scanning, yes, we're looking into it. A lot of other people are looking into it. There are going to be people whose niche that is, and it becomes their thing; they're already popping up, and the access that's going to give to a lot of other people around the world is something I'm really excited about. People are going to be able to be anywhere, at their fingertips, at a fidelity that right now we can only imagine, even as good as fidelity is today. And there's more coming. There are new ways of scanning coming now that have a lot of promise as well. I think they're a little experimental, but we're keeping up and seeing what's going to work.

Joey Daoud:

Yeah. David, has anything been on your radar? You do research and run a research laboratory. It could be about scanning or anything else, all of the above.

David Long:

That's the beauty of being a proper research university: we've got faculty and students who are really intrigued across so many of the technical elements of the virtual production pipeline. Just to carry on from the previous topic with Aaron, we are playing forever with advances in photogrammetry, the NeRFs, the Gaussian splatting. All of the different workflows that are coming out are super intriguing for interactivity, fidelity, efficiency of production, efficiency of storage, and actual deployment, because some of these models are massive, of course. There are lots and lots of opportunities in GIS and virtual scouting. We have partners around here who come to us from a scouting background and say, I just wish the entire city of Rochester had a digital twin. Well, that may not be as far off as you think, right? You deploy enough drones, enough photogrammetry, and enough of these sparse sampling, scanning technologies, and you may get reasonable enough fidelity to do that. Now, can it actually be ported in at a fidelity to be a plate for a VP shot? Maybe that's a few years further down, but those things are absolutely consistently on our radar. At the other end of the spectrum, as I was mentioning, my background is a lot more in imaging physics and color physics, and so really understanding the complexities, from an aesthetic perspective, of shooting VP. 
I alluded to it earlier, but I don't think we respect enough the quandary that is using an LED emitter to all of a sudden not just light up the pixels the camera sees as environment, but also light up the things in the foreground, in a way that is so distinct from proper cinema lights, and with such distinct color rendering that it throws things off. It throws off makeup, it throws off wardrobe, it throws off the practical lighting that the DP has worked so hard to establish. To Aaron's point, we love the fact that we were able to use VP and image-based lighting to get shiny things, to showcase fantastical environment scapes, and long before Mandalorian; I mean, Gravity was doing it, right? We love that. The problem is, the minute you start really paying attention, now that you're immersed in that stuff, that light is not the same color quality of light as a proper cinema light. And that is changing wardrobe rendition; that's changing other elements. So we're tackling algorithms and image processing tactics that try to correct for that stuff. This goes back to the very beginnings of color imaging, right? The minute we were able to get away from monochromatic, get away from black and white, we've done this dance between color accurate and color preferred and color embellished. And when the physics of VP intrude upon your creativity and bound you a bit, it's frustrating. So I think you're going to see a lot of attention to that. You're seeing the panel manufacturers explore multi-primary, right? RGB is not going to be sufficient. It's just not; you've got to put in other colors as well. And that's a big image processing conundrum. It's going to put a lot of demand on that, and on volume control, and on data management, but it's necessary. It's necessary because these are the things that are taking some DPs out of the choice to shoot VP, right? 
It's not just the authenticity, to use Aaron's word from before. It's: I don't have the control, right? The way that wall lights up my actress's skin is not right, and there's nothing you can do to fix it unless I totally distance her from the wall and bring a proper cinema light right onto her. We have to improve that stuff. We don't want technical flubs to be the reason people won't shoot VP in the future, right? It needs to be for other reasons that they choose not to, not for a literal limitation of the tech. As an engineer, I don't want to hear that. That's the frustrating part. It's like, well, let's tackle that. We have to be able to do better than we're doing now on some of those things that are driving you crazy.
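The color rendering problem David describes, where narrowband LED light renders skin and wardrobe differently than a broadband cinema light, can be sketched with toy numbers. Everything below is illustrative: the spectra and the skin-like reflectance are invented stand-ins, not measured data, and a real characterization would use measured SPDs weighted by the CIE observer functions rather than raw energy:

```python
import math

WAVELENGTHS = range(400, 701, 5)  # visible band, 5 nm steps

def led_wall_spd(wl: float) -> float:
    """Toy narrowband RGB LED white: three Gaussian peaks (hypothetical widths)."""
    return sum(math.exp(-((wl - peak) / 12.0) ** 2) for peak in (450, 530, 625))

def broadband_spd(wl: float) -> float:
    """Toy smooth broadband source, crudely rising toward red like tungsten."""
    return 0.3 + 0.7 * (wl - 400) / 300

def skin_like_reflectance(wl: float) -> float:
    """Invented reflectance rising through orange/red, loosely skin-shaped."""
    return 0.25 + 0.5 / (1 + math.exp(-(wl - 580) / 20))

def reflected_fraction(spd) -> float:
    """Fraction of the source's energy this surface reflects."""
    total = sum(spd(wl) for wl in WAVELENGTHS)
    return sum(spd(wl) * skin_like_reflectance(wl) for wl in WAVELENGTHS) / total

led = reflected_fraction(led_wall_spd)
broad = reflected_fraction(broadband_spd)
print(f"LED wall renders this surface at {led / broad:.2f}x the broadband level")
```

The narrowband peaks sample the reflectance only where they happen to land, so the same surface comes back at a different level than under a smooth spectrum. Scale that across every pigment in makeup, wardrobe, and skin and you get the rendition shifts David is talking about.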

Joey Daoud:

Yeah, I feel like color and stuff isn't thought about as much. Did you have something, Aaron?

Aaron Gordon:

Well, I was just going to say to David's point. It's really interesting to see the kind of the point that we're at because. We've reached a point of diminishing returns for phase one, right? You know, you, you, uh, as you think about kind of the camera sensor size race, right? But, uh, things are still getting played back in 1080, but people are trying to get to 8K, you know, 4K, kind of, and the beta's nice balance, but like, you know, whichever company is like, we can shoot in 12K now. And it's like, Great, like it has some advantages, but really diminishing returns when your output is, is a, you know, UHC deliverable or a 1080 deliverable, right? Especially if there's no change in the color signs. But what's really interesting right now, you know, I think about phase one of LED technology is like, we can get the pixel pitch smaller, but the color sign stays the same, you know, for cinema at least, the assumption is you're not putting someone right against the wall most of the time, right? There's a kind of established distance that you're. I'm going to bring the mouse to the ball. Even past the, just the point of, uh, moire or aliasing, it's just because it is the right depth thing to do for the, for the shot from just the actual shooting standpoint when you have layers. But, uh, again, that's really just about pixel pitch, uh, and about kind of how the diodes are in, in, in the mesh you're creating right now. But to David's point, um, with like full spectrum lighting, I do think there's a phase two that's coming, uh, and I know David has noted for a long time, which is, okay, how do you get that same pixel pitch, but all of a sudden they can output a fuller spectrum light? You know, right now we're bringing one. 
For whereas everyone's backlit and surrounded at least, you know, a certain amount of degrees by current panels, a lot of VPs, their solvers, they bring in the Kinoflow Mimics, um, which are RGBW and have a lot fuller spectrum, you know, ability to, to get on people's skin. On the other hand, the pixel pitch is, is much higher for those right now. And, um, The output on the nits on those aren't, you know, they're not bad, but how far away do they have to be and you know at what point do you have diminishing returns on seeing, you know, patterns on the skin and so you know there's a lot of things to keep in mind there. What's really really cool about I guess that conundrum, uh, right now is that, that is being looked at and trying to get solved at the same time that processing power is getting so much, so much better. And right now, what's really interesting is we've reached a point where one, And improving is not the sacrifice of the other necessarily, like, and you can get a stage if the pixel pitch is extremely small, the processing power has to be way, way, way, way, way higher, exponentially higher, but if the color science gets better, it doesn't mean the processing power has to get exponentially higher, and on the flip side of that, if you have more processing power, this means you're going to have a better color science, um, and so there's a lot of, and what's really cool, I guess, all to say about VP that I love, and that I think this workshop is going to cover for a lot of people and help them understand, but also Um, is, is what David has, I think, helped so many people realize, I think, throughout his career, not to, you know, hump a brag for you, David, is that, um, the best technologies are some of a lot of different parts. 
And so, when we think about the convergence of technologies that leads to VP, we are covering one sliver of what that is in this workshop, and I see VFX as the huge aspect of what most people are thinking about right now. But those same technologies are powering so many other things happening in cinema, gaming, and advertising now. So for people getting this knowledge, what they may not realize is that having core knowledge of this doesn't just help them for TV or film. This convergence of technologies helps them have a much greater understanding of where most technologies are going, in interactive displays and interactive experiences. So it's a really translatable set of skills, which is the part that I love about it. So, yeah.

Joey Daoud:

Yeah, and that's a good point, too, because I think back to when I was in film school: I was the last class that shot film, and then the year after I graduated, I switched to Red Ones and stuff. Actually, I think, David, you developed the Vision2 500T?

David Long:

Yes, I did.

Joey Daoud:

Okay. That was, I think, my favorite stock, because it was so forgiving to shoot on. That's good.

David Long:

I appreciate that.

Joey Daoud:

Yeah, thank you. I loved that stock. But anyways, yeah. So, you know, celluloid film, we're not using it, but the techniques of lighting and imagery and lenses don't change. You learn that, and it still applies no matter what the medium is. And, speaking of transferable skills, and talking about something that's not really new technology but has buzz: the Vision Pro, AR, VR. It had a bit of a hype and then died down a little, but now it's kind of coming back. We have Apple Immersive Video and their other stereoscopic format that I can't remember the name of. How has this been on your radar, as far as either production or new types of content, or using it for pre-pro? I don't know, it's a broad question: where do you see AR/VR coming back and being a content platform, or just, where is this on your radar?

David Long:

I'll answer real quick for our side. Our side is probably quite a few years out with this generation of gear, but we're particularly intrigued at using all of the widgets and toys in these emerging platforms like the Apple Vision Pro to go out and just capture tons and tons of metadata from a production environment, right? Things that are useful to post, to color management, to optics, to reconstruction, what have you. I don't think your DP is going to say, "all right, hold for scan," and then walk around with their Vision Pro literally generating the 3D reconstruction of their set. When that's uttered on a film set, that'll be cool, but I don't think we're there for a bit. There are baby steps towards that, though, and there's the user experience in that platform, but really consider the benefit of the spatial awareness, the XR layer, the other technologies that permit, to use the term, simultaneous localization and mapping, right? You're using hardware that identifies the 3D world around you while you are wearing it, and providing that visualization immediately. So there's some fun stuff there that'll be gateway technologies to the true filmmaking metaverse, I guess. I don't know, what do we want to call it when it goes to that level? Whatever

Joey Daoud:

it is. But I think, too, of the VAD assets and the stuff you might be generating in Unreal for whatever your backdrop is. And now it's like, okay, you have a world you built that could then be turned into a game or a virtual experience, where you could walk around the city you built for the film.

Aaron Gordon:

That's, I think, hitting on exactly where my head was going before, which is, when we talk about what the world looks like with a 3D asset pipeline: the same engine is producing augmented reality experiences, virtual reality experiences, gaming, and now cinema experiences, right? And interactive experiences, installations, digital billboards, everything in between. You're talking about a central point of development for all these different things. So I think your head is in exactly the right place on what that world looks like. We're already seeing gaming and movie IP cross over a ton now, we're seeing a lot of gaming informing movie IP, and I think it's only going to keep getting better. I will also say that when you're talking about the Vision Pro specifically versus certain other headsets, kind of a mixed reality style, spatial computing, whatever buzzword of the day you want to use: having not just a single layer in front of you, but actually having depth perception, I think the viewership is also going to change. I mean, I've put one on, I've experienced what it's like, and it is awesome, really, really fun. So there's the 3D world aspect of it, and I think there's also the aspect of how things are shot. If you look all the way back to second screen experiences, where people really first started being on the phone while they're watching something else, the idea of simultaneous interaction is really, really important to people now. So a really good example of where that 3D and cinema experience could go is something shot a certain way to really have that depth shoot out at you, but while that's happening, you also have a world that might have been created in a VAD.
That is some Fortnite-branded experience that's right next to you, right? Or some kind of mini game experience that you could be doing while you're watching this thing with your friends on the side of the room. That's where attention spans are going to go. So either people are going to be totally in, all in on it, or they're going to be doing simultaneous things, and when you think about simultaneous things, you're talking about what the asset package is that creates a movie or a game, just IP in general. And I think you're going to be able to import all those types of things really quickly. The other thing, and this is going to be one of my favorite phrases, oh gosh, what's his name, a famous futurist: he says all real estate in our life is going to be sold twice, as physical and digital real estate. Kind of like the Blade Runner style of things, I do think we are entering the world where, as these headsets become thinner and more convenient, you're going to see a lot of that start translating into the physical world, and it's going to become much more commonplace. Those same assets are going to be used from an advertising standpoint to advertise whatever people want to advertise, wherever they want to advertise it, you know, geolocated. Yeah.

Joey Daoud:

Yeah. You know, your physical world comment reminds me of how Universal and now sort of Disney have tapped into, if we build the actual physical world from the movies, like Harry Potter land or Star Wars or a Nintendo world, people are into that. It brings people in, and it makes it interactive and physical. And then it's that digital real estate sold twice, because then you have the virtual companion. I mean, on the Quest, I played a game where you're exploring the Star Wars land that is the actual land at Disney, and it's a nice story tie-in to what you can experience at the parks. I feel like there will probably be more crossover like that in the future as this all combines and intertwines with all of these mediums.

Aaron Gordon:

Look at Disney's Epic Games investment. I mean, that says it all, right? It's the key performance indicator of where money's being spent: $1.5 billion into Epic Games to create their interactive worlds. When Disney is doing that, you know it's a thing, right?

Joey Daoud:

So, yeah, you can see where it's going. Last one, and this one's also going to be broad because it's such a huge topic: AI. Where is it on your radar? And I'm keeping that broad because you could take this as AI image generation, Sora, or, what kind of interests me more, the practical uses. I think it was called Beeple or something, I saw a demo where they're using machine learning for relighting scenes. More practical, immediate uses. It's a broad question, but where is this on your radar, in a broad scope?

David Long:

Yeah, you pegged it. I'm a humanist at heart. I'm much, much more intrigued by the tools enabling human storytelling, and, to reuse Aaron's word again, authenticity. There's intrigue, there's technical merit to AI as an origin piece for IP, but that's not at all what I'm interested in. We have spent quite a few years dabbling in AI, machine learning, and other elements in that spectrum here at RIT, and it's always with an eye towards practical applications. We've had students researching tactics for performance capture cleanup and rotoscoping cleanup. We even did things back in the day using large language models to help assemble, say from stock B-roll, a quick editorial for an evening news story, right? So a reporter comes in, they don't have a photojournalist with them, they need to assemble visuals for broadcast, and they've written the copy; boom, stick it into a large language model, and it's going to produce a brilliant bit of B-roll that could be ready for broadcast immediately. That kind of stuff we've played with, because I think it increases efficiency and it adds to storytelling and communication across nonfiction, fiction, the full spectrum.

Aaron Gordon:

Yeah, I couldn't agree more with your point about being a humanist at heart. I think everyone's excited right now, and it's very exciting. There are two opposing statistics that say where we are in the technology and human cycle right now: everyone's terrified they're going to lose their jobs, and yet the latest statistic is that next year alone will bring 96 million new jobs, right? If you really think about the polarity there, where my head goes is that there's going to be a major split between what we think of as the highest quality and okay quality. That gap has been widening more and more. The tolerance for low quality content, and I say low quality not offensively, I really mean that authentically: fast, dirty, it's about the content, less about the production value, has obviously gotten so democratized, and the tolerance band has gotten that much bigger. But it makes the high quality have to be that much better, right? It makes that gap so much tougher, to achieve amazing quality and impress people, because people have never been a smarter audience. You have this middle gray area where I think a lot of people struggle the most, and then up at the top you have a lot less struggle, and down at the bottom a lot less too. But I think we're going to experience an incredible boon once we get over this moment we're in right now, where everyone's afraid of how it's going to generate their job away. With the tools I've seen, yes, teams are going to get smaller, yes, VFX teams are going to get much smaller, and yes, pre-production is going to get smaller in terms of human capital, but obviously we've seen consistently that more and more content is needed. And it means that more great content can be created instead of just a flood of okay content.
It also means, and a really good example is, there's actually a really great piece of software being built right now where people can gut check their writing on their book: you create all these different writer agents that are based on real writers. Obviously we're all sensitive to copyrighted material, but these aren't publishing things, and they're not actually writing your story. What they're doing is giving you opinions based on the anthology of all of their work. They're saying, hey, in my opinion, this is what I might do here, this is what I might do there. And you're creating a writers' room. That's terrifying to writers, right? But my argument is it's not that terrifying, because there's a backlog of scripts; there need to be more great scripts to go along with them. It just means you're going to get to the product faster, and then you're going to take it to the people, and that's still going to be necessary. But instead of being on something for two years, they're going to be on it for a couple of months, and they're going to move on to the next script, which has been thought through a lot more by the time they see it. So the output does have to get higher, but we're also growing by billions in population, and our needs are going to get higher. So I think we're just meeting the demand of where we are, and I think the quality can only go up. Look at what's being done from a technical standpoint that never would have been possible years ago, like what Operator Expansion managed to do with the VFX pipeline. That team might shrink a little bit through some of these tools. They used AI tools to help color the eyes blue, but even they said, we still had to go in and fix it.
It did save a lot of time, though, and look at the quality of the movie overall. It was still human-centric, and there were some AI tools they used, but by utilizing those tools they achieved a much greater quality than they could have in the time they had. So I'm a huge believer that it's just going to push the bar higher, and those that don't want to push the bar higher are going to be able to create content faster and kind of self-proclaim themselves a creative director a lot sooner. And I say that with a lot of respect. A lot of people would doubt someone calling themselves a creative director or art director today, but the truth of the matter is, with a D2C brand or their own branding, it enables them to be an entrepreneur a lot more easily. And I think it was Sam Altman who said that in our lifetime, probably the next few years, we'll see the first billion dollar one-person company. Like, that's awesome. The people that do not want to embrace that are the people that are never going to build that billion dollar company, and that's fine, but it does give you the opportunity to push hard. So I'm a huge fan, to be honest about it.

Joey Daoud:

Yeah. I feel like if you tap into it, or you can figure out how to take advantage of it, it can be a multiplier in what you're able to do with limited resources. And we kind of saw a lot of that shift before. I know AI is a bit different, but even just when iMovie and video editing on a computer became more accessible, people were able to edit movies on the computer. And sure, 99 percent of those videos are unwatchable, but it enabled the 1 percent of people to actually make something really good. What was it, Tarnation? I think that film was edited on iMovie. 2007, 2008. And the same thing with the digital revolution, when you no longer had to pay to shoot film. But, all right. Well, yeah, I appreciate the time, Aaron, and David, you had to hop off. Thanks a lot.

Aaron Gordon:

Dude, thanks for having me, Joey. This is awesome. We're really excited about this workshop and really excited to be on your show, and can't wait to do more in the space.

Joey Daoud:

And that's it for this episode of VP Land. Links for everything we talked about are available in the show notes; just head over to vpland.com. If you found this episode useful or interesting, share your thoughts in the comments on YouTube, or leave a five star review in your podcast app of choice. And be sure to subscribe to the VP Land podcast wherever you listen to podcasts so you don't miss a future episode. Thanks for watching. I'll catch you in the next episode.