VP Land

How Miniatures and LED Screens Brought Sci-Fi World to Life in 'Molli and Max in the Future'

April 05, 2024 | New Territory Media | Season 3, Episode 12

In this episode of VP Land, we dive into the world of sci-fi and indie production with Steve Dabal, co-founder of the production house The Family. Steve shares his experience working on the indie sci-fi rom-com, Molli and Max in the Future, and how the filmmakers overcame budget constraints with creative solutions.

πŸ“§ GET THE VP LAND NEWSLETTER 
Subscribe for free for the latest news and BTS insights on video creation 2-3x a week: 
https://ntm.link/vp_land

πŸ“Ί MORE VP LAND EPISODES

Inside the ZEISS CinCraft Scenario & Nano Primes
https://youtu.be/4nnkkR34WwY

This music video tested out every new piece of filmmaking gear
https://youtu.be/vFFqiyU2wUY

The CPU Powering The Creator, Pixar, and Virtual Production [AMD's James Knight]
https://youtu.be/F-4bJaOaYlc

Connect with Steve @ The Family:
Website - https://www.stevedabal.com
Instagram - https://www.instagram.com/stevewithav
Twitter - https://twitter.com/SteveDabal
LinkedIn - https://www.linkedin.com/in/stevedabal
Facebook - https://www.facebook.com/stevedabal
Vimeo - https://vimeo.com/dabal
IMDb - https://www.imdb.com/name/nm4288461

The Family
Website - https://thefamily.tv
Facebook - https://www.facebook.com/thefamilybts
YouTube - https://www.youtube.com/@the_family
Instagram - https://www.instagram.com/family

Molli and Max in the Future
Instagram - https://www.instagram.com/molliandmax/


Steve is an independent film director with a background in visual effects, working with brands (Google, Adidas, PayPal, NFL, etc.) and artists (Joey Bada$$, T-Pain, Scarlett Johansson, Oliver Tree, etc.).

As the co-founder and creative director of The Family in New York City, he has grown a small art collective into a full-service production house creating digital and physical artworks. It is proudly artist-owned and operated.

His work has been recognized at festivals around the world (SXSW, Coachella, Lollapalooza, Clusterfest, New York Film Festival), and he has spoken about using technology for storytelling at SIGGRAPH, InfoComm, Dior, SVA, Villa Albertine, etc.

#############

πŸ“ SHOW NOTES & SOURCES

BTS - Molli and Max in the Future
https://www.instagram.com/molliandmax/

Molli and Max in the Future Trailer
https://www.youtube.com/watch?v=VhFmVkgxveA

Oculus Rift Development Kit 2 Hands-on
https://www.youtube.com/watch?v=lzJUcdYQBj8

Tilt Brush: Painting from a new perspective
https://www.youtube.com/watch?v=TckqNdrdbgk

CNN Tech For Good BRAVEMIND PTSD VR Exposure Therapy 2020
https://www.youtube.com/watch?v=44ymmQ1kdZw&t=2s

3D virtual reality device for stroke rehabilitation
https://www.youtube.com/watch?v=LPcpn665O_s

Hello Apple Vision Pro
https://www.youtube.com/watch?v=IY4x85zqoJM&t=1s

A Guided Tour of Apple Vision Pro
https://www.youtube.com/watch?v=Vb0dG-2huJE&t=203s

Cuebric
https://www.youtube.com/watch?v=XK6HndLX7wg

The Blackmagic Camera App Tutorial - iPhone Filmmaking
https://www.youtube.com/watch?v=Z9nf7oE75TM

stYpe RedSpy
https://stype.tv

Disguise
https://www.disguise.one/en/solutions/virtual-production

#############

00:00 Intro
01:45 Steve Dabal's Background and The Family
03:42 How Molli and Max in the Future Started
07:10 VR and Storytelling Research
10:45 Learnings from Early VR Experiments
13:20 About Molli and Max in the Future
15:00 Prep and Production Process Timeline
21:00 Generating Plates and 3D Models
26:32 Color LUT Process for Backgrounds
28:28 Stage Hardware and Tracking
32:15 Tips for Indie Virtual Production
37:30 How Molli and Max Would Differ Today
39:53 Photogrammetry and NeRFs
43:08 Mobile Virtual Production Projects
48:55 AI as an Alternative to Virtual Production

Transcript

Michael, who wanted to make a movie, an expensive movie, all on LED screen, and he couldn't find the money. He didn't have the resources. So he said, okay, I'm just going to write a lower-budget movie and just do it all ourselves. Welcome to VP Land, a podcast where we dive inside the tools, projects, and people that are changing the way we are making movies. I'm Joey Daoud, your host. In this episode, we dive into the world of sci-fi and indie production with Steve Dabal, co-founder of the production house, The Family. Steve shares his experience working on the indie sci-fi rom-com, Molli and Max in the Future, and how the filmmakers overcame budget constraints with creative solutions. I feel like if anybody tells you, you need a certain technology to do your movie, you have to look them in the face and tell them they are completely wrong. Because there is like no world where that's the truth. We discuss the benefits of virtual production, even on a tight budget, and how it enhances the experience of the cast and crew. Number one thing is always like the actors can see what's going on, and the crew can see what's going on. That will forever be cool about something like in-camera visual effects. I mean, even like puppetry and all that, like it's not a tennis ball. Like these things help the workflow on set. Steve also shares his insights on the evolving landscape of virtual production, from his early experiments with VR and photogrammetry to the potential impact of AI on indie filmmaking. I think the AI side is super interesting to be able to generate pieces that usually wouldn't be possible. I think everything AI-generated has a feel to it. Some people hate it, but it'll change and mature. Links for everything we talked about are available in the YouTube description or in the show notes. And be sure to subscribe to the VP Land newsletter to stay ahead of the latest tech changing the way we're making movies. Just go to vp-land.com. And now let's dive into the fascinating world of Molli and Max and virtual production with Steve Dabal.

Well, Steve, thanks for joining. I appreciate having you here. So, yeah, we'll talk a lot about Molli and Max and this fun sci-fi movie. But first off, you want to give me a little bit about your background and The Family, your production company?

Yeah, absolutely. And, Joey, thank you so much for having me. Like, truly, everything on the channel is incredible to watch and see. I've been to the bottom of Google searches trying to figure out technology. The fact that you're out here telling people about this stuff is so special. Yeah, trying to be that video you find at the end of that deep search for that really weird technical thing. Which, like, my background is I've done a lot of virtual reality, and I've been in the augmented reality world, but like back during the first Oculus coming out. And like you would Google stuff, and there would literally be no results. And so the fact that these things are starting to appear is something really cool to see. I started doing visual effects in- I went to school in California, Art Center College of Design, which is a very heavy design school, so people are doing concept art, they're doing transportation design. We had a small visual effects program, which was doing Flame, which not a lot of people are using anymore. But again, it was this big, bulky program that had kind of a gatekeeping amount of knowledge.
But it meant you got paid really well because there were very few people who were on the Flame box at the time. So I was able to get my feet wet in visual effects. And then of course started transitioning to After Effects and started realizing, oh, there's these other tools, which led to Unity and Unreal Engine. And then, I mean, we're getting to the point now where I don't know what program I'm using because every project is different. But that kind of has been this interesting journey of realizing you can use technology for storytelling, which has brought us to, of course, virtual production. My company, The Family, was started in 2015. We were just a bunch of kids who said, hey, this technology can actually be kind of useful for stuff. And we've kind of done every project under the sun.

Nice, nice. And how did Molli and Max come on your radar? And was that your first big virtual production project? Or what was your sort of experience with virtual production before that?

So virtual production started for us doing virtual reality. It started during school. I was doing a little bit of exploration with a neuroscientist about how virtual reality can affect storytelling. She was working with a team of people that were doing, to me, the coolest thing I'd ever seen, where they would put sensors on people's heads and have them watch an episode of Seinfeld and then have them watch a Hitchcock movie. And they would see how the brain would completely light up once there was something like a Hitchcock movie. I love Seinfeld, but to me, I was like, wow, that's really crazy. Like physically, we're altering people when we make movies. And they started introducing virtual reality when I started doing this independent study to see, hey, when someone plays a video game or someone watches something, when now you're using their vision without the screen having distance, it was really triggering the brain to a whole new level. So we started kind of making things in VR. Almost all of them made people sick. It was like the systems were so slow. We would have like an Oculus DK2 plugged into a Mac. And like it was just really bad, and everyone didn't like it. But for some reason, we just thought there was something there. And as we started getting more and more into it, we met someone at HTC Vive, which had this program called Tilt Brush, where you could draw. And the fact that you could draw in 3D space was, again, such a crazy concept, that you could draw and communicate in a way that I had never experienced before. And so we did a commercial for them where we put one of the controllers for the drawing game on an ALEXA camera and used it for tracking data. And again, it didn't really work. Luckily, no one was getting sick that time, but it was just a pain in the ass trying to figure out how to make this thing work. But it was like, oh, there's something to using tracking data with a camera. And then cut to a couple of years later, the pandemic happens and we say, oh, we can't travel. This thing is starting to get a little popular about doing LED screens. Broadcast is really doing this tracking thing well. We had some experience doing VR. We said, okay, let's figure it out. And so we spent the pandemic building up a stage, going to the bottom of Google, trying to figure stuff out. I got to work with some incredible people on the technology side. We had an integrations team, this company, Choerogrfx, that were in the city that came over and helped us in Brooklyn.
We had a bunch of producers in the film world who were just bringing us people to help figure out, how do we do DMX lighting? How do we install an LED screen? How do we do this and that? And then it kind of just started compounding of a lot of talented people putting a stage together. And one of those people was Michael, who wanted to make a movie, an expensive movie, all on LED screen, and he couldn't find the money. He didn't have the resources. So he said, OK, I'm just going to write a lower-budget movie and just do it all ourselves. So as we were building the stage, he was coming by. He was saying, hey, I filmed this miniature in my living room. Can I put it on the LED screen and do a camera test? He'd do a camera test. He said, hey, I got a DP. He's starting to do stuff. Can he come by with me? They'd come by, do another test. And then by the time the stage was up and running, he said, hey, I have the movie written. I have a lot of the backgrounds created. I want to shoot this movie. And then, yeah, we ended up shooting it on the stage.

Yeah, that's awesome. I'm curious about going back a little bit with your story, the study with the brain lighting up in the Hitchcock film versus Seinfeld. So you're saying just because the Hitchcock film was more engaging and more of an engaging story, that was making the brain light up even more?

Yeah, exactly. It's like using these things that were taught in film school of suspense, using things like romance, like they actually do interact with your brain. When the virtual reality stuff came on, the part that convinced me was, there's a test, I think it's by either like Skip Rizzo or someone on the USC side, that was doing a test where you could put on a headset and you could see a photorealistic version of you in the room. So it's like you're wearing the headset and there's like a Joey in the room. Yeah, so Joey's on the table, and you're like, oh, that's me on the table. And then they would have someone with a knife come in the room and stab the Joey on the table. And your brain would go, oh my god, I got stabbed. That's horrible. And the brain would go, oh, like, that's bad. But then the guy with the knife would turn to your POV and come up to your POV and then stab like under you, because you're in VR now. And it breaks that fourth wall as we know it. And the brain goes fight or flight, total fear, because you, in your point of view, just got stabbed. And I was like, that is so powerful. Yeah, that also sounds like a prison, like sci-fi prison storyline or something from Star Trek. Like when we were going to these VR festivals and stuff, they're all horror things. And I was like, of course they're horror. Cause like it truly is triggering your fight or flight. It sounds like that video that was going around a few months ago or a year ago. Someone's hand was on the table and then there was like a dummy hand right next to the hand with a divider, and they knew it was a dummy hand, but then the experimenter was like hammering the dummy hand and they reacted, you know, as if it was their hand. Yep. Yeah. And like, that's the scary part of what happens when people use this technology. And so to see this education side, they were using that like, oh, how do we help people? So like, there's a lot of people who are doing studies in PTSD. How do we use virtual reality for exposure therapy? There's a lot of people who were doing, oh, someone can't move their fingers. Let's do that exact test you mentioned.
But they would do it where the fingers in the virtual reality game move based on their brain sending signals, and it helped them actually learn to move their fingers again. It's like it's so amazing, there's so many positive ways to use this, but there's definitely a lot of negative ways as well.

Given all this tangent, because with Apple Vision Pro and kind of this second, or whatever, emerging new phase of VR being back on the radar again, do you feel like anything that you learned and experimented with almost 10 years ago is now coming back into play? Or do you feel like now that the technology is better, like latency is lower, almost real time, that a lot of the motion stuff is sort of solved, depending on what the content is. But do you feel like there's learnings and stuff from back then that you can apply to like the new ways of building content for the Vision Pro or immersive reality?

Yeah, definitely. And that's such a good question because I think it proves how things are in a cycle, that we were doing things that were like, I don't know if you ever got into 360 video. Yeah, briefly. It was so dumb, but we were all really into it for some reason. And I think there was a cool potential to it. I don't know what it necessarily accomplished. I mean, I think something to it. It's interesting. Yeah, it always felt like, where do I look? Like, I mean, am I missing something? Am I looking in the right spot? And that's interesting, because with Apple Vision, they have the stereoscopic videos you can shoot yourself, but then they also have the Apple immersive video, which was like content they produced. And when I was looking at the specs of that, it was a 180-degree view. So it was limited, and it's like, oh, they seem to have figured out, like, we do need some focus of like where we should be looking and not completely surround us, because it's like, where do I look? Yeah, which, that I think was the most challenging thing that I had to learn then that I apply now, is you have to give up control as a filmmaker when you're using this technology because you can't curate everything. In film, we can make sure your eye looks here, what the frame is, how it's cut, when you leave it. All these things are based on control. When I've worked with either immersive artists in immersive theater or technology, it's about what is the audience gonna do? What are the options they can take? So you almost have to think like a video game designer more than a filmmaker. And you have to think like someone in theater more than someone in film. So it's just having that brain that you can't control the audience that, at least for me, it was very challenging because I've been raised that every aspect of the frame is what I control. Yeah. Yeah. And the frame is pretty much gone with this. It's useless. But that's cool. It's like you can curate people. It's like I worked with a video game designer, and he worked on the first Call of Duty game. And he said the most crazy thing to them was how to design arrows for signs, because he would have a sign with an arrow to tell the player to go down that hallway. And he was like, depending on the way that arrow looked, sometimes everyone would go the opposite way. They would just be like, I don't trust that arrow, and they would go the other way. And then he was like, sometimes you'd have to like write a little word of like battlefield and then an arrow. And then you'd get like 50% of the people would go that way and the rest would go the other way.
So he had to learn what is the way that you can move a player around a space, because when you tell them what to do, I mean, it's humans, we're probably gonna do the opposite. Yeah, and kind of guiding people to where you want them to go, what you want them to see, without having the control of just a narrow frame and the video is moving, which is so different. Whether they want it or not, yeah, different experience.

Okay, so going back to Molli and Max, do you want to give me kind of the overview description of kind of a fun sci-fi like When Harry Met Sally? But yeah, do you want to kind of describe the movie and then also just a lot of like the visuals and the scenes that we see and where it takes place?

Definitely. Yeah, Molli and Max is- so Michael Litwak came to us with this idea for a movie that was absolutely ridiculous. And he sent us the script and it was very funny, but absolutely ridiculous. And we said, I have no idea how someone's gonna pay for this, because this is the type of movie that I love. And you find it at a midnight screening at a film festival because it's so creative, but it like is so hard to convince someone to make it. And he said, I'm just gonna make it. So he just shot it all in his like miniatures in his living room, bootstrapped everything, like truly labor of love. Because the concept is it's a sci-fi, it's a rom-com that is baked as a sci-fi. So essentially you're taking When Harry Met Sally and throwing it in space. And it has cults, it has absurd creatures. It's such an over-the-top, fun movie, but elevating the rom-com genre into the sci-fi world. And to do that, it was about how do you make wacky environments? How do you do like this old-school style of rom-com and comedy, but throw it into this out-of-this-world, bizarre environment.

Yeah, there's a lot of different planets, virtual like tennis or pickleball or something, bars, different bars, inside a car or inside a space car or whatever, driving. Yeah, so a lot of environments. Let's just get a review of the prep process and then the production process, just like a timeline so we can kind of figure out. Also, like, the movie came out last year, so I'm assuming production was like 2022, 2021. Yep. We're talking in the beginning of 2024. Just because the technology is changing so rapidly, just to kind of peg where tech was when this was filmed. And we can compare where tech is now. But the prep process, and then what did the actual production process look like?

Yeah, I was just thinking, as you were saying that, I don't know if things would have been done the same if we had the technology now, which is true. Yeah, that's another question I got. So it started in 2020. We first started talking to Michael, who was working on this movie, and they pretty much did a year and a half of prep. And that year and a half was usually the period that filmmakers are just pitching their movie a million times, making a bunch of iterations of a deck, just going nonstop. He actually just started making it. So in that year and a half, Michael, the director, and Zach, the DP, both of them were doing visual effects. And so they were shooting miniatures and then comping them together in After Effects to make plates, to make backgrounds, to kind of take their storyboard of a movie and just build up all the backgrounds and all the pieces for it. I had never seen something done this way before. And it truly was so inspiring for me.
And to think that there's a version of filmmaking where, during the period of pitching, you can just start making the thing using free software. So by July 2022, they were ready to shoot, and we did one week on the LED screen, and then the rest of it they did on a rear projection stage and then a couple on a green screen. So it was like an 18-day total production. And essentially, five weeks later it was picture locked. So September 2022, and then it premiered at South by Southwest after that. I think he delivered the DCP like a couple days before it was due for South by Southwest.

Was that just from all the extra effects stuff after picture lock, to just, yeah, finalize everything?

Just finalizing stuff. Most of this LED screen work and the rear projection work was in-camera. That's like the beauty of being a movie that you can actually commit to in-camera, versus I don't believe any of these big productions that say they're doing true in-camera final pixel, just by nature of the pixel pushing of a lot of these ways of working. They did have to do some touch-up work on some set extension. So some of the shots, the stages we were in were limiting because it was a lower-budget production, so they had to kind of comp in some extra work for that. And then the green screen stuff, of course, was the bulk of that post work, which again is just like two dudes busting their ass, pulling off all these visual effects. So it's over 900 shots that two guys are pulling off. It's out of control. Yeah, when I was watching I was like, is there a shot in here that's not a VFX shot? Yeah. I mean, the LED screen stuff is like, this is the crazy line, that like they technically did the VFX ahead of time and then brought it to us to prep to then put on the screen. So even though they didn't have to do it in post, they kind of did that work ahead of time. Right.

And so how were the plates generated, and how did the models come into play with the backgrounds or the environments?

So most of it's miniatures, I would say the majority of it's miniatures, which was a mix between Michael's living room and an office space that they got, and they were capturing it on like an a7S or a7R, just grabbing different plates of miniatures and then comping them, and then a bunch of packs to just add on top of that. So just building out this look. This is like the cool thing to me, too: in that year and a half, it has such a distinct look. Whether you like it or not, it's impossible to ignore that this film has a world in the look. So it kind of helps that low-budget thing, because it just commits to the look. So even when stuff's campy or not perfect, you're just, that's the look. Like it's really hard to be like, this scene looked worse than the other, because it just has such a world to it. So they really found that look in that year and a half. So that once it was either on the LED screen or it was going to be something for green screen, it was all kind of together for the feel. There were only two scenes that are Unreal Engine with the parallax and with all that. And those were photogrammetry of the miniatures, to then throw in the scene just for consistency's sake. So her ship and the male lead's dad's house from their world, those two objects were 3D. And then those are in some scenes with a little bit of parallax in Unreal. OK, but you're saying they still- they built them as a miniature, and then they were scanned and brought into Unreal to turn into a 3D model so you can have some camera movement in parallax?
Yeah, exactly. So everything was rooted on physical miniatures. So pretty much every scene, every plate was built on a miniature. Pretty much. Yeah, it's pretty awesome. It's such a crazy way. It's even like all the houses, all the pieces are either 3D printed or bought miniatures that they like kit-bashed together. Even the ship, it's literally they like jammed together a car with this and that and then built out this weird miniature ship. But like it looks great because it has that look and feel of being handmade. And that's like the beauty of it: even when you're using that top-of-the-line technology, we're using stupid, expensive computers and LED screens and all that, the look and feel was based on miniatures. It was based on Michael and Zach's like commitment to having that look and feel to it.

And so most of the plates, and I think this was mentioned in the article from Blackmagic, were just 2D, like static images on the LED wall?

Yeah. Yeah, they did a lot of the 2.5D in After Effects, where it's like a ship and a foreground object and just kind of moving between those two. So that was a lot of those generated plates. So all the car scenes, even the, like, yeah, space stuff is 2.5D plates. So they built a couple, and during testing in that period they would come, we'd set up a camera, they would try it, see if it matches the look and feel, come back another day with some more plates. I think we did that for motion. We definitely did it for color, to make sure all the color science felt good. And then it was just doing it to say, hey, does it feel weird that there is or isn't enough motion for something?
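A quick note on the mechanics, since "2.5D" comes up throughout this episode: the trick is flat layers placed at different virtual depths, and a lateral camera move slides each layer inversely with its distance, which is what reads as parallax. Below is a minimal sketch of that pinhole relationship in Python; the lens, sensor, and depth numbers are made up for illustration and are not from the production.

```python
def layer_shift_px(camera_move_m, layer_depth_m, focal_mm,
                   sensor_width_mm, image_width_px):
    """Horizontal on-screen shift of a flat card for a lateral camera move.

    Pinhole relation: a point at depth Z shifts by focal * move / Z on the
    sensor, so near cards slide more than far ones -- that differential
    slide between layers is what reads as 2.5D parallax.
    """
    shift_on_sensor_mm = focal_mm * camera_move_m / layer_depth_m
    return shift_on_sensor_mm * image_width_px / sensor_width_mm

# Hypothetical numbers: a 0.5 m dolly with a 35 mm lens on a Super 35
# sensor (~24.9 mm wide), framing a 4096 px wide image.
for name, depth_m in [("foreground ship", 4), ("mid planet", 40), ("starfield", 400)]:
    print(f"{name:15s} slides {layer_shift_px(0.5, depth_m, 35, 24.9, 4096):6.1f} px")
```

At these assumed numbers the foreground card slides about 720 px while the starfield barely moves, which is why a single foreground object in frame can sell a camera move even when the background plate stays flat.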
OK. And yeah, can you tell me about the color? Can you tell me more about the process you used for the backgrounds?

Yeah. So the plates were all done as stills, then brought into After Effects to get us to motion. We did a couple versions where we would put the raw image on the screen and then try and apply a LUT to the video feed. But then you're using a raw video feed on a raw image on the LED screen. And that ended up being too complicated for us to make sure the colorist would be able to match it. So we ended up doing what you would do for a final pixel process, where we had a LUT that was set up on the footage. Essentially, we had the footage and we had a LUT on our server. And we could control the intensity of that LUT on the screen, and then through the camera had a similar LUT on the camera feed, and then could adjust the background LUT to kind of bring it up and down depending on how it would fit. So it's not what I would recommend for most movies that are doing a traditional color process. But because Zach was able to have the authority to look at the monitor and say, hey, this is good, it could kind of be locked and be that. And then of course, the colorist got everything to deal with later. But it was really honing in on making sure on the day, him and Michael were good with the look. So it's definitely a very weird process. And again, it's hard to say, cause if you were on a bigger production, you'd have a larger post team to do this with, but they were really aware of it's just the two of them having to do everything. So what's the easiest way to just get to a good end product? In-camera, that they could tweak in color later, but that's in a good spot for them to start with. Yeah, and the beauty was it looked great in camera.
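To make that adjustable-LUT idea concrete, here is a minimal sketch, not The Family's actual pipeline: it parses a .cube 3D LUT and blends the graded result with the raw plate at a dialable intensity, the way Steve describes bringing the background look up and down. The file name is hypothetical, and the nearest-neighbor lookup is a simplification; real tools interpolate between LUT entries.

```python
import numpy as np

def load_cube_lut(path):
    """Parse a .cube 3D LUT file into an (N, N, N, 3) float array."""
    size, rows = 0, []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts or parts[0].startswith("#"):
                continue
            if parts[0] == "LUT_3D_SIZE":
                size = int(parts[1])
            elif len(parts) == 3:
                try:
                    rows.append([float(v) for v in parts])
                except ValueError:
                    pass  # metadata line such as TITLE; skip it
    # .cube files list red as the fastest-changing axis, so the
    # reshaped array is indexed [blue][green][red].
    return np.asarray(rows).reshape(size, size, size, 3)

def apply_lut(image, lut, intensity=1.0):
    """Apply a 3D LUT to a float RGB image in [0, 1] at a given intensity.

    intensity=0 returns the raw plate, intensity=1 the fully graded one;
    values in between mimic dialing the background look up and down.
    """
    n = lut.shape[0]
    idx = np.clip(np.rint(image * (n - 1)).astype(int), 0, n - 1)
    graded = lut[idx[..., 2], idx[..., 1], idx[..., 0]]
    return (1.0 - intensity) * image + intensity * graded

# Hypothetical usage: grade a plate at 70% strength before it hits the wall.
# wall_frame = apply_lut(plate, load_cube_lut("molli_look.cube"), intensity=0.7)
```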
So even the editor was able to push out a cut really quick and get it to something like South by Southwest, though, like, the first 20 minutes of the movie, because they were on the LED screen, it's good. Like you can see it, you can feel it. It's not like the rough cut submission of a green screen with janky things. And so that was the- Text on it like, oh, all this thing will be here. This will be good later. Like, that's not a part of it anymore. So it was helpful for them, I know, to see it, have it there. I mean, the number one thing is always like, the actors can see what's going on and the crew can see what's going on. That will forever be cool about something like in-camera visual effects, whether that's- I mean, even like puppetry and all that, like it's not a tennis ball. Like these things help the workflow on set. But as far as delivering for post-production, this type of movie was allowed to really commit to, okay, on set through a monitor, do we feel good about this? Because that's like the look.

What was the rear projection used for, and what made the decision of, like, when to go with the LED wall and when to go with the rear projection?

It was mostly a budgetary thing. We were doing the LED screen side. There's another company in Brooklyn that was doing the rear projection side. We tried to do more on the LED screen, but we were about to take it down. So we actually, this was our last project on that standing stage. Now we do a lot of mobile setup. So, hey, the production wants to do it in a different place, we go there, pop up the stage and do it that way. But at the time, we had two years where it was a standing stage in Brooklyn, and Molli and Max was our last thing. So we literally were like, after this date, the stage is coming down. So I think it would have been more that way, but a lot of the scenes, like when they put on the helmets and go through their past, like those really beautiful sunsets, that was all in the rear projection. And it has a really soft, beautiful look to it. I think Zach is incredible as a cinematographer and for visual effects, for being able to blend the fact that it's really hard to tell what's LED, what's rear projection, and what's green screen, because they all kind of served a purpose. So all the scenes were obviously contained. It's not like halfway through a scene it jumps to a different technique. But each scene he was able to say, well, this one is silhouettes, so it needs this technology. This one has a bunch of reflections, so it needs this technology. Yeah, yeah, definitely each scene had like distinct looks with the different location or the planet or wherever it was taking place. So yeah, it had that continuity and funky aesthetic, but consistent looks in the scene. Yeah.

And yeah, what were some of the other workflow processes? I know you used an ATEM Mini, so where did that come into play? And I mean, you used a bunch of other Blackmagic hardware, so you can talk about all of the hardware and signal flow and stuff.

Yeah, I don't know. Have you worked on LED screen much? Do you know like the- No, me personally no. I mean, but you- Other than talking about- Give us the explainer, yeah. Cool. I mean, it's so many bits and pieces and stuff that when we first started looking at LED screens, we would go to some and they would be like, oh, it looks great at 30 frames per second, not really 24. And we're like, well, then this is useless to us. We'd go to other ones. They'd be like, oh, you have to do this thing this way. So it was quite a process to make it modular enough that different types of productions could be done on LED. I think the world of Mandalorian, with an unlimited amount of money, makes it super easy to see how anything is possible. But when you don't have that, you have to make some choices. And we didn't really want to do that, because we were like, it depends on the movie. You could be like, I want to come film my anamorphic Western on your stage. And someone else is like, I'm doing my sci-fi rom-com. I'm pretty sure you guys are going to have different needs for the look and feel of those movies. So when we built it, we had a lot of options for how do we make sure we can accommodate any type of production. For Molli and Max, they were pretty open to matching whatever we recommended. This has probably changed by now, but at the time you could only really use a global shutter camera on the LED walls, because a lot of the hardware was from touring, so it couldn't really match the cameras. I've talked to a couple camera manufacturers that are now aware of this, and they have the ability for you to go even lower in your shutter speed to not deal with this, but at the time there wasn't an option. So they used the KOMODO for their production so that we had no issues with that. That was a big part of it. There were a lot of moments that we were coming in and out of the pandemic, so we had to do a lot of live stream stuff. So we were able to feed that KOMODO, because it was on the standing stage, into the little ATEM Mini to then send to like a Zoom feed. So if Michael was in a wardrobe test, he could just pop into a test Zach is doing and make sure it feels good and looks good. Again, using that ability of having something like an LED screen to make collaboration a little more real time. There were things like that that we kept doing to say, hey, how can we make this as easy as possible for them to see it and get to that final look, which was just a lot of back and forth of working remote, having them come in, doing tests. How does Unreal look and feel? How do the miniatures look and feel? Yeah, so you're basically just using the ATEM as a webcam encoder to send the KOMODO feed over Zoom. Yeah, it was definitely a weird- This was before they started- they brought out their dedicated. Exactly. They used to have the weird Web Presenter that was such a weird device. I had one. Yeah. It was a very weird device. And then they finally released dedicated web encoders that would make a lot more sense. But this was- I think this was before they introduced that. So the only way you could easily encode was to get this mixer, where you're just literally using it to plug into your computer. Yeah, which was so funny, because it'd be like a cinema camera feeding into a media server, and then the media server doing some processing and then feeding out. And then that going into there to then stream to like a Zoom feed is just like such a crazy- like the amount of feeds coming in and out on an LED stage is so crazy, because sometimes if you're doing set extension, that's its own output. And then sometimes you want the UI to actually go through, because when you're mapping stuff sometimes on the floor or remote, you need to see things. There's just so many feeds coming in and out that it just becomes this very complicated thing. But at the end of the day, it looks so simple. It literally just looks like you're pointing a camera at a thing. You're like, oh, it's so easy.
What hardware is your stage running on? And I don't know if it's changed since Molli and Max to today. But yeah, what hardware is the stage, and your tracking, when you do track?

Everything's running through Disguise. That was a big thing for us, to commit to a scalable solution. We definitely tested with doing smaller machines and doing some broadcast systems. And it was always scary. There was just always the option that it could break. And so on a movie, it's just like, you can't really mess with that. So it's always been Disguise for hardware. And then our tracking solution changes per project. So using stYpe, which is an on-camera mount that looks up at tracking markers on the ceiling. And then we also have an Ncam that we were using for, if you're inside of something and you can't track to the ceiling, then the Ncam can pick up as a more localized tracking system. Like inside the car. Like inside the car, yeah. Or if you're like, say there's a little set and that has a ceiling, then we can't reach the tracking on the ceiling, then that would be like an Ncam situation. We honestly didn't mess with too much tracking, just because for this one- Yeah, because it was 2.5D plates, or yeah. Pretty much the setup was really only those two big scenes that had the full Unreal build that needed stYpe.

And yeah, would that be- kind of going on the route of just like tips for virtual production, or indie films or indie projects trying to do virtual production, is that sort of a good viable option, where it's like you don't always have to have the full tracking? Like figuring out your shots, and you could get away with doing something that involves less hardware, less tracking?

Definitely. I feel like if anybody tells you you need a certain technology to do your movie, you have to look them in the face and tell them they are completely wrong, because there is like no world where that's the truth. Even for like the rear projection stuff, we're using stYpe, we're using tracking markers. Zach built his own little tracking setup where he had like a camera mounted on top of the cinema camera, aimed just at the ceiling. So then in After Effects, he could track and then reverse it. Like doing the- okay. I've never heard of someone doing that, but it was because he saw the stYpe and was like, oh, that's how that works. Let me just build my own version. I think that- And do a little bit of post processing, but it was a little bit of the same position tracking off a marker on the ceiling. Yeah. It's incredible. Like, that is a DIY way of doing something really expensive, that is just being really creative about that.
However, now, I'm like, the amount of AI that you have access to, the amount of tools that are coming out. We're in that period where stuff is still free. Unreal is trying to leave that segment. So these things are slowly changing for some of it, but all these brand-new companies, it's all still free. So I think that's the time you have to jump on and use these tools. Because a movie like this, when we were doing it, I truly was like, 10 years ago, this would not be an option. It would be all green screen, and these two people would have burnt themselves out, and this movie never would have gotten released because of the amount of work it would have taken to do: just shooting everything green screen and then the two of them trying to figure out how to composite every single shot. Yeah, like a thousand shots would be just disgusting to get created. It's so- and like I just don't- it would have been inconsistent. It wouldn't feel like- I think them spending that time in pre-production to find a look and feel, so that it can be justifiably low budget, is the unique component that having free technology allows you to do, because a year and a half of work didn't cost a lot of money, versus every day of production costs a lot of money. So I think that's the big thing. And you're able to get that done ahead of time. It's the same thing as an earlier podcast I did with the filmmakers behind High, which we're still trying to get off the ground. But it's virtual production about tower climbers. And a thing they kept saying was, the nice thing about virtual production is we raise a little bit of money, and the money we're able to spend on Unreal design or a virtual art department to pitch the concept, that's money they're spending to get these resources, but they can keep them. And it builds on to when they do finally go into production, virtual environments and stuff that they already spent and created that they can use later on. And it doesn't have to be like, we need all the money at once to go into production. We could kind of get it step by step and still progress further in making the film a reality, versus, like you said, 10 years ago, you have to film everything on a green screen and then tear your hair out trying to get everything to look good. I really loved your episode on High because it matched how that technology supported that story, because that would be a really dangerous production to do physically out in the world. Right, another thing where- yeah, not sci-fi, but like a reason why virtual production would work well for something contemporary, but in a very tricky-to-film situation. Yep.

And that's where Molli and Max was really cool, because it was a rom-com. And he was like, I want to film a rom-com, but I want to trick people into thinking they're watching a sci-fi. And that's where I'm starting to see, oh, here's that maturity that's starting to come with this technology, that we're not just doing the cool Star Wars-influenced things. We're starting to say, oh, well, what about this rest of the world of movies? What about those dramas? What about those rom-coms? Can we use technology to elevate them? Because when I went to school, they were like, do not make a drama and do not make a comedy. You will never make money doing these. I think it's fair, because it's like, it's expensive to make them look good. If it's two people in a room as a drama or as a comedy, no one's going to buy it. And if my script has explosions and stuff, no one's going to pay for it. Now we're in an interesting world where, yeah, you can have people like up in the air or in space or doing things that are usually really expensive. Now they're not so expensive, so I feel like that opens up a lot of storytelling.

Yeah, yeah, for sure. Going back to what we were talking about before, of where technology is today: so if this was happening now, what would you do differently, or what tools are out now? Yeah, just anything that you would do differently or that would be easier, faster, cheaper to do.

It's so hard to say what we would do now differently.
It's just like, every week it feels like there's a new tool that would be helpful for making movies. I don't know if a lot of them are, or if it's just marketing. I think the AI side is super interesting, to be able to generate pieces that usually wouldn't be possible. I was wondering, like, something like Cuebric, where a lot of the demos are sci-fi worlds. Exactly. It's like you can build out your 2.5D scene with some parallax. But it would be interesting for this type of example, because, I don't know, like from a production side and from a stage side and from just executing the movie, something like Cuebric would be perfect because it would be really fast. It'd be 2.5D. We could include tracking. A lot of our scenes, we were like, we can't do tracking just because we don't have the time to make it all 3D and do all that. That would mean some of those scenes that we couldn't move the camera on before could now move the camera. And for even some of the scenes that were 2D, Zach was like, I'm okay moving the camera as long as there's a physical object in the foreground. So we get the parallax off of the real world, and the background just won't have parallax, and that's okay. I think that would have been helpful for him, to be like, well, actually now we can add parallax. But I don't know if it would match their look. So it's interesting to even think about: the Cuebric stuff does have a feel to it. I think everything AI-generated has a feel to it. Right now, yeah. Some people hate it, but it'll change and mature. So I don't know if it's there yet for someone like this, where the look is so defined. I'm not sure if the AI-generated look right now would match their style, but like, I could eat my words in a year, because there'll be 50 options for style type, and you could feed in miniatures, and then- Give us your mood board, and we will create the perfect environment based on your mood board, that you used AI to generate the mood board with anyways. Which, like, I know people who are building data sets for companies so that the generated imagery looks like the images for that company. Like based on their existing material in this company, like their own little models. Yeah, I mean, I think the smaller models and the more customized models, especially for people that already have a lot of IP, is probably something we'll see more of in the future. Yeah. I don't know, it's like, because, it's, miniatures are imperfect. And it's interesting to think they can do that. I mean, that's the interesting thing, hearing that- I never, yeah, didn't realize that they were all miniatures. And so that also just brings a little bit- it's just like an interesting blend, where we're talking about like the latest tech and stuff, but then also merging these very handcrafted, old-school techniques, which just brings this charm to the whole process that, yeah, obviously you're not getting with AI. But I'm thinking from more of a budget perspective, or also if someone is not a skilled miniature artist, like the filmmakers were, what their options are to bring their indie sci-fi film to reality. Yep, which, I know Michael doesn't have any background in 3D, so he was like, well, miniatures I can figure out. And that was his solution to do that world-building. I think, to your point, it's interesting for that next generation of filmmakers to think, what are either the software or the techniques or the place I'm most comfortable? If it's drawing, draw and then generate stuff off that.
If it's using your hands, like, do some clay and some stop motion. That's where NeRFs and photogrammetry and a lot of these abilities to take something from the physical world into the digital come in. That's where a company like us has been really lucky, because now, instead of someone coming to us with a deck and a script, they could come to us and say, hey, here's my background, here's my look, here's my feel, here's most of my movie, and then we can just help execute that vision. But it's really hard when you feel stuck, that you can't communicate what your vision is, and it's in like PDF form, or it's just in the script, because I can't see what your movie is supposed to look like. If you can find that way of starting to communicate to people, here's the look and feel, like, now we're really cooking. Yeah. I remember there was an interview with Gareth Edwards, I think on Noah Kadner's VP podcast. And he said something with AI where he's just like, yeah, if there's an easier way where I can just get this stuff out of my head and like get it on screen, or just get it to communicate with people, I am all for that. Yeah. And using AI to do that, like, awesome, great, because I can just get it out of my head and communicate it more easily.

You mentioned the photogrammetry and NeRFs. Have you been using that outside of Molli and Max? But like, other stuff that you're doing, have you been using that more often for like scenes and stuff, or for background generation or background capturing?

That was one of the things in the early days of virtual reality that we got really into, photogrammetry, because it was just the highest fidelity. So we could capture a place- like, we did an installation and then did a full LIDAR scan with some photogrammetry to then recreate it in VR, so people could walk around it, we could showcase it. And it was another one of those things. It was so cool, but we just didn't know how to either market it or get it to people, because it essentially meant like me on the subway with like a Vive and some trackers, going to a place and like popping it up and then showing them. So it wasn't necessarily sustainable, nor do I even know the root of why it was interesting, but there was something to that. And that has kind of evolved now into something like Molli and Max. The dream version of Michael's movie is everything is a miniature, and then using photogrammetry it's all captured, and then now every generated thing is a miniature. And so all those imperfections, all that handcrafted nature comes through. Rather than doing like Megascans, which is very realistic, we're leaning on the opposite world of that craft and that really unique charm of miniatures. I think there's something in photogrammetry, and NeRFs are closer, but definitely still digital, where we can start pulling environments and pulling objects from the physical world and bringing them into digital. And what are the Megascans? Or what does that mean? Unreal has a marketplace called Megascans, and it's a free library of scanned assets. So there's a team of people who go to Iceland and take a bunch of mountains and rocks and pieces, and then they put them out for free. So you can just pull them. So if you were like, I need to film tomorrow, you could use all those. Like the objects of the library of 3D objects, but these are actual scans of places or things in the real world. Trees, rocks, plants, it's things that you can use as a library to build in. And they're extremely high quality. They're like very, very, very high quality.
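For the photogrammetry side of that physical-to-digital pipeline, a common open-source route is COLMAP's end-to-end reconstructor. Here is a minimal sketch of driving it from Python; the paths are hypothetical, and in practice the capture itself (a few dozen overlapping stills orbiting the miniature under flat, even light) matters far more than the invocation.

```python
import subprocess
from pathlib import Path

def scan_miniature(photo_dir: str, workspace: str) -> None:
    """Run COLMAP's end-to-end pipeline over a folder of miniature stills.

    Writes the sparse/dense reconstruction into `workspace`, which can then
    be meshed, cleaned up, and imported into a tool like Unreal.
    """
    Path(workspace).mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["colmap", "automatic_reconstructor",
         "--workspace_path", workspace,
         "--image_path", photo_dir],
        check=True,
    )

# Hypothetical paths for one of the film's hero props:
# scan_miniature("stills/mollis_ship", "scans/mollis_ship")
```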
Okay. And it's interesting, it seems like you were ahead of all these trends 10 years ago, between virtual reality and photogrammetry, and now they've all re-emerged as trending things, between NeRFs and 3D Gaussian splats and everything around that. That's what we were talking about before. It's that constant repeat of things. And what have you been using project-wise? You were- because you said now your studio is mobile. So you're going around. So what kind of work and projects are you filming with like a mobile VP studio?

So it's kind of the same concept of having a small warehouse with a screen. We've kind of taken that, because we started that when the pandemic meant we couldn't travel. So I was like, oh, I don't have to go film in LA anymore. That sounds great. Let me just bring it here. So like we did a commercial for Jose Cuervo, where they wanted to film in California and we just filmed it in Brooklyn. So that was amazing. Now, people are traveling again. They actually want to go to places. So the limitations of a warehouse that's not a true soundstage became too much of a problem. People were like, I want my trailer here. I want this, I want that. So we said, OK, let's figure out how do we insert into other stages, which is what a lot of productions, from like American Horror Story to your Star Wars, do: they're gonna build out in different places, usually following tax incentives. So if the movie says, hey, we need to do it in New Jersey instead of New York, we say, okay, find us a soundstage with adequate power and AC, and then we'll pop up in that stage. So right now, the one I can talk about is like a sneaker show, and the idea is to interview people about sneakers and be able to pop up an LED screen depending on where talent is. So that's like, we do a New York shoot. That's one week, LED stage is up, get everyone in one week in New York, but then take that exact background, that exact setup, travel it to London or Portland or wherever else, pop up the LED screen in that stage, use the same backgrounds. So then when the final piece comes out, it looks like everyone filmed in the same place and it feels really consistent. Like popping up your green screen, like you used to, but now it's an LED screen. Exactly. And you could use green screen for a lot of this. Like, so much of it absolutely can be green screen. I think the big difference is like, if it's reflections and if talent needs to see it. That's kind of the line that the absurdity of cost in production equates to. So if it's something where I need talent to point and talk about the thing behind them, like as if we were doing it on location, now LED is very justifiable. But if it's just for the sake of having background, just to be background, it's really hard to be like, oh, it's worth the extreme cost of LED when green screen is substantially cheaper. Right, okay. Especially with AI, I'm like, you can generate a lot of the AI backgrounds now. So I feel like green screen is even better. I've seen demos of people where it's like, they take talent and then generate the background, and then they can relight talent to background. And I'm like, oh, god. Yeah, that was the hardest part about green screen. Yeah, it hasn't aired yet, but in some of the other interviews that we recorded, I've been talking about AI uses that are a bit more practical, like machine learning to relight people based on the environment. And I've seen those. There was a demo of driving plates that was going around LinkedIn the other day.
Yeah, it was taking the lighting from- They were shooting the car on a green screen, but taking the environment lighting and relighting the person. I mean, do you think that might, I don't know, replace some of the use cases of virtual production, or be an alternative to virtual production?

Yeah, I think it's all alternatives. I think this idea of virtual production is just using all of these tools to do production. Because when I first started in film, like, if it rained, our day was screwed. Now that really is not the case anymore. There's so many different ways you can do a production, and technology is just changing how we make movies. For an executive or for really expensive talent, their day on a blue or green screen all day is not fun. So LED for them is wonderful. They can see everything. They can do that. I think that you're paying a high premium for that. If you're just talking about quality, I think, to what you said, the tools coming out with AI will change how we do that. I wouldn't be surprised if we just start filming stuff in like black voids and then it automatically can have visuals added to that. That's not that far away, but it's just thinking about, will talent like that? Will your crew be able to see that? Like, for the single filmmaker or the individual who's just trying to accomplish that end product and not really care about workflow per se, these tools are gonna be perfect for them. It's just for the rest of the traditional way of filmmaking, where it does require a lot of people, those things I think are a different skill set and a different set of technology that come with that.

Yeah, and that black box comment reminds me, there's been a slew of new iPhone apps where you can- it does like rough comping. I mean, a lot of the selling point is for previs, but you load in your scene or your Unreal scene and use the phone as a tracker, and it kind of comps out people so you can see stuff. But I mean, yeah, if you were a high school kid, that would definitely be awesome. I would think that'd be awesome if I was back in high school doing this stuff.

Unimaginable, like, truly. We were working with Bill Warner and Conrad Curtis, who started Avid. And he has been doing a tracking-based iPhone solution called Lightcraft. Yeah. And that, I'm like, truly, that in high school, I do not know what my career would be, because I would have been so into that. I would have probably just made movies immediately. I don't think I would have gone to college, because I would have just been obsessed with being able to make movies that way. So I think it just changes that trajectory of what filmmaking is for people, because it does become like your thing. Like, I needed to lean on people for visual effects because I didn't know visual effects at the time. I needed people to help do production design, and I still do, because I'm not that confident in it. But I think if I had that ability in school to do all these pieces just on my phone, I probably would be like, oh, it's just like Robert Rodriguez run-and-gun. Like, let me just do my thing. Yeah, rebel with an iPhone. Yeah, I mean, like, yeah, it's like someone's got to do a whole new version of what that looks like, because it really is. I think I would probably see it. I mean, between that and, you know, Resolve is free. Like, there's a free version of Resolve. There's a million tutorials on YouTube.
YouTube alone would have been a huge resource, because I was tracking down books and things to try to learn this stuff in high school. And it was like nearly impossible, or like you had to pay for Lynda. Yeah. And Unreal is free, and there's a million Unreal tutorials. So yeah, there's just so many things you can do to figure it out yourself if you're a high school or college student. And match the caliber. The fact that the Blackmagic app is using that same workflow as someone on a larger production, like, that for someone in high school is- like, you're getting used to shutter angle. You're getting used to aperture. You're getting used to the same things you would use on a regular production. Oh, yeah. Like, my little DV camera had like auto for everything, so I didn't know what the hell anything was. Yeah, I think I had to throw NDs in front of it or something, or try to like trick it to expose lower, or to like- yeah. And you're just monitoring the screen. Like, you didn't know like false color, like those things that exist. None of that existed. You just look at the screen and see if it looks good or not. Yeah, but then sometimes you go in the computer and be like, oh, that doesn't look good. Why are all my highlights blown out? Why is the sky pure white? I just can't color it.

Yeah, anything else that's just been on your radar we didn't talk about?

No, I mean, this was awesome to just talk. I think you are seeing these things in real time and talking to people who are producing them. So I mean, I'm so curious to kind of see the people you talk to and where things go as you're looking into this world. For me, it's just always been reactive: someone comes with a story or comes with an idea, and we just say, how can we execute it? And I've just been fortunate to have people around me who are way smarter than me to say, oh, there's this thing called VR, oh, there's this thing called Stable Diffusion, or oh, there's this thing, and people just using that. So I hope there's more people who are using these tools for that storytelling side, because that's the thing that I feel like I've just always tried to preach: it doesn't matter what you're using. It's stressful to think about the amount of tools that are out there. Just, does it serve that story that you're trying to tell? What you need to achieve. Yeah.

Yeah, I appreciate it a lot, Steve. Any websites or socials people can find out more or follow?

Yeah, our website's thefamily.tv. Our Instagram's just @family, which is a story for another time. And if you follow Molli and Max on Instagram, you can see an absurd amount of behind the scenes. Michael has been the most transparent person, so you could literally see how he made every scene of this movie. And yeah, Molli and Max in the Future is on demand, I think, tomorrow. OK, sweet. Yeah. Yeah, so by the time this airs, it should be available somewhere. Yeah- Molli and Max will be on VOD. All right. Thanks a lot, Steve. Appreciate it. Thank you.

And that's it for this episode of VP Land. Links for everything we talked about are available in the show notes. Just head over to vp-land.com. If you found this episode useful or interesting, share your thoughts over in the comments on YouTube and/or leave a 5-star review in your podcast app of choice. And be sure to subscribe to the VP Land podcast wherever you listen to podcasts so you don't miss a future episode. Thanks for watching. I will catch you in the next episode.