VP Land

VIVE Mars & ARwall: Making Virtual Production Accessible

December 20, 2023 | Season 3, Episode 5
Joey Daoud

In this episode, we talk with Raymond Mosco (VIVE Mars) and Rene Amador (ARwall) about two tools we discovered that make virtual production intuitive, affordable, and accessible.

📧 Be sure to subscribe to the free VP Land newsletter to get the latest news and BTS insights 2-3x a week: https://ntm.link/vp_land


Connect with Raymond Mosco @ VIVE Mars
YouTube - https://www.youtube.com/@vivemarscamtrack1612
Instagram - https://www.instagram.com/vivemars
Facebook - https://www.facebook.com/VIVEMARSVP
X/Twitter - https://twitter.com/htcvive
LinkedIn (VIVE Mars) - https://www.linkedin.com/company/vivemars/
LinkedIn (Raymond Mosco) - https://www.linkedin.com/in/mosco/

Connect with Rene Amador @ ARwall
LinkedIn (ARwall) - https://www.linkedin.com/company/arwallco/
YouTube - https://www.youtube.com/@ARwall
Instagram - https://www.instagram.com/arwallco
Facebook - https://www.facebook.com/arwallco
X/Twitter - https://twitter.com/arwallco
LinkedIn (Rene Amador) - https://www.linkedin.com/in/rlamador


📝 SHOW NOTES

How VIVE Mars CamTrack is Making Virtual Production Setups Easier
https://newterritory.media/how-vive-mars-camtrack-is-making-virtual-reality-setups-easier/

Virtual Production Made Easy – ARwall’s New StudioBox
https://newterritory.media/get-studio-grade-professional-results-with-arwalls-arfx-studiobox-a-complete-virtual-production-studio-in-a-box/

VIVE Mars CamTrack / VIVE Mars FIZTrack
https://mars.vive.com

ARwall
https://arwall.co

ARFX StudioBox
https://arwall.co


#############

⏱ CHAPTERS

00:00 Intro
01:05 VIVE Mars CamTrack @ NAB 2023
02:20 How VIVE CamTrack works
03:15 Timecode and Genlock support
05:30 Virtual production expertise needed
08:05 ARwall @ NAB 2023
10:42 ARFX Pro Unreal plugin
12:50 ARFX Lens
16:10 Price breakdown

#############

TRANSCRIPT


Joey Daoud:

Welcome to another episode of VP Land, where we dive into virtual production, AI and filmmaking, and all the other latest cool tech and updates that are changing the way that we're making movies. I am Joey Daoud, your host. In this episode, we're taking a little break from our series of interviews that we've been having, and we're going to dive back into our NAB interviews that we did in April of 2023. We've got a two-parter in this one. The theme is focusing on virtual production companies that are making virtual production more accessible, more affordable. So we're gonna focus on two products, the VIVE Mars CamTrack and ARwall. And just a reminder, if you like content like this, if you like updates about the latest news in virtual production, behind-the-scenes insights, be sure to subscribe to the VP Land newsletter. It covers a whole lot more stuff than we cover in the podcast. You can sign up for that at vp-land.com or just Google VP Land. So first up, let's cover the VIVE Mars CamTrack. In this interview, I speak with Ray Mosco from VIVE about the VIVE Mars CamTrack. And I'll jump in every now and then to paint a picture of what the visuals are that we are talking about. But let's jump into the first one. And then after this, we'll talk about ARwall. Hey, I'm here with Ray from VIVE Mars CamTrack, and we're going to talk about virtual reality setups and virtual production. Hey, Ray.

Raymond Mosco:

Hey, how's it going?

Joey Daoud:

Good. So yeah, can you walk me through what we've got set up here?

Raymond Mosco:

Yeah, absolutely. About this time last year, we did a sneak peek on VIVE Mars, which is sort of our camera tracking offering from HTC VIVE. We're traditionally a virtual reality company, but we have this long legacy in Hollywood, or in production, of people using our tracking markers, our tracking blocks, our base stations to track people, to track props, to do a variety of different things. And so we took in that information and we built a professional-class camera tracking solution called Mars CamTrack that is a much higher quality tracking experience, as well as offering some of these more pro features that are required to be very successful either on a green screen or on an LED volume. So Mars CamTrack costs $5,000, and it is a few different pieces. We have our rover tracking unit. This consists of the actual Vive tracker that's been on the market for a long time. It connects to the rover tracking box, and this pushes all the data via Ethernet back to the actual Mars box. And then here's where you can see all the information associated with the camera tracking itself.

Joey Daoud:

All right. So just to jump in and describe what it is we are looking at, we've got the actual Vive tracker. This is the hockey-puck-sized tracker that's been on the market for a while for other AR uses, but it works great for tracking your camera placement. So we've got this on top of the camera. And then we have a new device, which is basically just a black box a little bit bigger than the puck, and it's got a variety of ports, including Ethernet, on the back. The Vive tracker is plugged into this black box, and they are both mounted on each other on top of the camera. Then running out of the black box is a long Ethernet cable, which is plugged into the VIVE Mars box, a thick, iPad-sized box with a big touchscreen display. This is where all of the trackers get plugged in. It does all the processing of the positional data, and it's plugged into the computer, sending that data out to it.

Raymond Mosco:

You can see, if you're using timecode or genlock, that information there. Right now, we can see that we're running genlock at 29.97. You know, we have some teams that use timecode instead. And then you'll see the status of the tracker units as well as the base stations. VIVE Mars supports up to four base stations. It ships with two initially, and we can support a tracking volume of about 10 meters by 10 meters. The Mars kit comes with three rovers, so you can track up to three devices. Traditionally, we find a lot of our partners are using one of our rover trackers on the camera. A lot of times, one will be used to set the origin point of the virtual environment that you're using. And then we've found teams that use the third one for a variety of different things: either to run two different frustums at the same time, or to use it for a lighting gag, if you want to tie a physical light to a virtual light and be able to wash it over someone and also have that reflect in the virtual environment itself. You know, we're finding that the sky is the limit as it relates to that. And then all that data passes through this box into either Unreal Engine via Live Link, or FreeD, which is a more legacy protocol but allows us to support applications like Aximmetry. All the positional data calculation occurs on the Mars box itself, so the only thing going into your environment is just the data stream of the positional data, rather than the actual compute itself. So you're not compromising any sort of rendering on the environment PC, because all the calculation happens in the Mars box itself.
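
To make that data flow concrete, here is a minimal sketch of the kind of listener a render PC might run: it receives a pose packet over UDP and applies it to a virtual camera transform. The packet layout below is invented purely for illustration; in practice the Mars box speaks Live Link or FreeD, and Unreal Engine or Aximmetry does this parsing for you.

```python
# Hypothetical illustration: receive camera pose packets over UDP and keep a
# virtual camera transform up to date. The real VIVE Mars stream uses Live Link
# or FreeD; this packet layout is invented to show that the render PC only
# receives a small pose data stream, not the raw tracking compute.
import socket
import struct

PACKET_FMT = "<6f"  # assumed layout: x, y, z (meters), pan, tilt, roll (degrees)
PACKET_SIZE = struct.calcsize(PACKET_FMT)

def listen(port: int = 5005) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        if len(data) < PACKET_SIZE:
            continue  # ignore malformed packets
        x, y, z, pan, tilt, roll = struct.unpack(PACKET_FMT, data[:PACKET_SIZE])
        # In a real integration this would drive the virtual camera in the
        # engine rather than printing to the console.
        print(f"cam pos=({x:.3f}, {y:.3f}, {z:.3f}) rot=({pan:.2f}, {tilt:.2f}, {roll:.2f})")

if __name__ == "__main__":
    listen()
```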

Joey Daoud:

And when you're going into wherever you want to send your signal, is the tracking data going through the box, and then your video feed going into whatever capture card you have?

Raymond Mosco:

That's correct. That's correct. So you'll see here, we have a capture card, and that data is being pushed directly into Aximmetry. And then the positional data is being pushed via Ethernet also into the PC. And then, in your software, either in Unreal Engine or in, say, Aximmetry, that's where you connect the two together.

Joey Daoud:

Okay, cool. Yeah, it's fascinating because, as you've probably heard, I tried to do the DIY route, buying the puck, buying the trackers, and then getting it to cooperate with the computer. Because before, you needed to have a headset or you had to kind of jailbreak it.

Raymond Mosco:

That's right. That's right.

Joey Daoud:

To work around that. So yeah, it seems to make it a lot easier. I guess, what sort of level of 3D or VR expertise do you need to have to use this?

Raymond Mosco:

We're finding that a lot of folks who have not actually explored VR at all are finding Mars CamTrack very easy to use, you know. The benefit of it, especially if you're familiar with Unreal, is that because you're just pushing a Live Link data stream, all you have to do in Unreal is find that data stream and attach it to your virtual camera. And then the physical camera and the virtual camera are connected. So it's very straightforward. One of the benefits of Mars is that it sets up really fast. You can set up this whole configuration in about 15 to 20 minutes. It's very portable. We have teams who are putting all this in a Pelican case or in a backpack, getting on an airplane, going somewhere, and being able to set this up very quickly and kind of go straight into production.

Joey Daoud:

And I believe it's not out yet, but you also have some lens trackers coming out.

Raymond Mosco:

Indeed. So the other kind of special sauce of NAB 2023, the missing piece of the equation that a lot of our partners have told us about, is being able to utilize lens encoders. Mars utilizes USB, and we found that the accessibility of USB lens encoders is very limited right now. So we decided it would be beneficial if we essentially released our own. And so here at NAB 2023, we're giving a sneak peek of FIZTrack, which is our lens encoder. It allows you to push FIZ, so focus, iris, and zoom, directly into the rover unit, and then that data goes directly back into your comp.
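
As a rough illustration of what lens encoder data gives you downstream, here is a small sketch that normalizes raw focus/iris/zoom encoder counts and interpolates a physical focus distance from a calibration table. The encoder range, table values, and helper names are all assumptions for the example; FIZTrack's actual calibration and data path live in the Mars hardware and software.

```python
# Hypothetical FIZ (focus, iris, zoom) example: normalize raw encoder counts
# and interpolate a focus distance from a calibration table. The encoder range
# and table values are invented; real FIZTrack calibration is handled by the
# Mars system.
from bisect import bisect_left

ENCODER_MAX = 4095  # assumed 12-bit encoder range

# Assumed calibration: normalized focus ring position -> focus distance (meters)
FOCUS_TABLE = [(0.0, 0.45), (0.25, 0.9), (0.5, 1.8), (0.75, 4.0), (1.0, 100.0)]

def normalize(raw: int) -> float:
    """Map a raw encoder count to the 0.0-1.0 range."""
    return min(max(raw, 0), ENCODER_MAX) / ENCODER_MAX

def focus_distance(norm: float) -> float:
    """Linearly interpolate a focus distance from the calibration table."""
    keys = [k for k, _ in FOCUS_TABLE]
    i = bisect_left(keys, norm)
    if i == 0:
        return FOCUS_TABLE[0][1]
    if i >= len(FOCUS_TABLE):
        return FOCUS_TABLE[-1][1]
    (k0, d0), (k1, d1) = FOCUS_TABLE[i - 1], FOCUS_TABLE[i]
    t = (norm - k0) / (k1 - k0)
    return d0 + t * (d1 - d0)

if __name__ == "__main__":
    raw_focus, raw_iris, raw_zoom = 2048, 1024, 3072
    print("focus distance (m):", round(focus_distance(normalize(raw_focus)), 2))
    print("iris (normalized):", round(normalize(raw_iris), 3))
    print("zoom (normalized):", round(normalize(raw_zoom), 3))
```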

Joey Daoud:

Cool. And then that stuff all integrates into that one central hub, the rover.

Raymond Mosco:

That's correct.

Joey Daoud:

So you can plug it on there. Cool. Where can people find out more information?

Raymond Mosco:

Yeah, absolutely. So go to mars.vive.com to learn everything about Mars and to order one today.

Joey Daoud:

And just an end note: the VIVE FIZTrack is actually available now, so you can go check out their website to get both camera tracking and lens tracking. All right. So now let's jump over to ARwall. ARwall is a company created by Rene Amador, and that's who I speak with. They're billing it as virtual production in a box. It is a tiny computer that runs Unreal and runs their software, which we'll talk about, and which makes Unreal a lot more accessible to filmmakers and people who are not Unreal experts. It can generate a 4K scene that you can load onto an LED wall, a rear projection, or a TV screen. It took me a second to understand that we're not talking about green screen, so you'll hear my confusion and then the clarification that this is about doing in-camera VFX for virtual production, not a green screen composite, which was the demo we had with the VIVE Mars tracker. And that was sort of where the confusion came into play. All right, so let's jump into the interview with Rene. Rene, let's talk about, we've got virtual production, some cool kits here. What do we got going on?

Rene Amador:

Yeah, so if you don't know ARwall, we're a virtual production provider, one of the top in the world. We've done over 100 deployments, all in-camera effects. So we're focused on in-camera effects and those types of real-time backdrops at our company. And we're here showing off two things at NAB. Basically, we're celebrating some of the work that we've done recently, including Muppets Haunted Mansion. Over 70% of the shots of Muppets Haunted Mansion utilized real-time backdrops and Unreal Engine, and we're really proud of that. The result of the work we did on that show was a professional plugin called the ARFX Pro plugin. We've been commercializing this for a little while, so we're showing that off. But really, the most interesting thing is, at the beginning of the show, we started taking pre-orders of ARFX StudioBox. So we're saying this is an entire studio in a box. This is it right here. It is a little guy. But if you can believe it, this little guy here is just as powerful as what we were deploying on professional sets just five years ago. That's the power of processing for you, right? So we're taking pre-orders on this now. Pre-orders are $4,199, or $380 a month with our financing partner. And this uses your smartphone as a tracker. At launch, we'll have iOS and HTC Vive support. Yes, that includes the Vive Mars. And then, soon after, Android and Antilatency support. We're also looking at lighting kits, maybe some robotic camera heads, that would be integrating with this as well. But the most important thing is that this comes with the ARFX app, which is a new standalone tool that we've built at ARwall that's specifically targeted at filmmakers who want a no-coding, no-scripting, and even no-3D-design-skill type of solution for virtual production. So what am I saying here? This box can do up to 4K resolution on a single stream for virtual production. You have your creative content already pregenerated for you with different lighting iterations, color. Everything that is in our professional toolset is actually ripped out of Unreal Editor and placed into the map itself. So all the virtual production tools that you need: color, everything like that. This box is capable of delivering studio-grade professional results. And you load it, and you're in Unreal Engine, and that's really how easy it can be.

Joey Daoud:

All right, cool. So walk me through how you get the box, what's the setup going to be like. So you've got your box, this is running Unreal.

Rene Amador:

Yes.

Joey Daoud:

Can you also walk me through, like, what's the ARFX plugin? What does it bring to Unreal?

Rene Amador:

Absolutely. So the ARFX Pro plugin, what this does is it takes the tools that are normally stuck in Unreal Editor, the actual development program, and it puts them in-engine, into video-game-style menus. So they would be very familiar to a filmmaker, because either you've played video games, or they're also similar to camera menus. Basically, you're selecting your color or your camera settings, very similar to that type of traditional menu system. The fact that it's in the engine means you don't need to learn any of the traditional development tools associated with virtual production.
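
The general pattern here, independent of how ARwall actually implements it, is to expose a small, curated set of scene parameters at runtime instead of the full editor. A toy sketch of that idea, with parameter names and presets invented for illustration:

```python
# Toy sketch of a "curated runtime menu": expose a handful of scene parameters
# the way a filmmaker would think about them, instead of the full editor
# toolset. Parameter names, ranges, and presets are invented; this is not
# ARwall's API.
from dataclasses import dataclass

@dataclass
class SceneSettings:
    time_of_day: float = 14.0   # hours, 0-24, drives the sun angle
    weather: str = "clear"      # "clear" | "overcast" | "rain"
    color_temp_k: int = 5600    # white balance of the key light
    fog_density: float = 0.02

    def apply(self) -> None:
        # In a real system this would push values into the running engine
        # (for example via console variables or a game-side API).
        sun_angle = (self.time_of_day / 24.0) * 360.0
        print(f"sun ~{sun_angle:.0f} deg, weather={self.weather}, "
              f"key light {self.color_temp_k}K, fog={self.fog_density}")

PRESETS = {
    "1": ("Golden hour", SceneSettings(time_of_day=18.5, color_temp_k=4200)),
    "2": ("Overcast noon", SceneSettings(time_of_day=12.0, weather="overcast")),
    "3": ("Night rain", SceneSettings(time_of_day=22.0, weather="rain", fog_density=0.1)),
}

if __name__ == "__main__":
    for key, (label, _) in PRESETS.items():
        print(f"[{key}] {label}")
    choice = input("Pick a look: ").strip()
    label, settings = PRESETS.get(choice, PRESETS["1"])
    print("Applying:", label)
    settings.apply()
```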

Joey Daoud:

Okay, yeah, so can you clarify? Because actually- because I've dabbled with Unreal.

Rene Amador:

Yeah, go for it.

Joey Daoud:

And I've like loaded Unreal and then I've like tried to take the courses but then realized like this is like, you know, an entire- to actually build things. So when you're saying running the Unreal engine, that's like you're running a video game, right?

Rene Amador:

It's like you're running a video game, you're playing a video game, but the game you're playing is making a movie. And so you select your weather, your sky, your lighting, you even have animation to use.

Joey Daoud:

Easier interface, not having to like dig into the million panels.

Rene Amador:

Exactly. You can use a keyboard and mouse, or you can use an Xbox controller. But the most interesting thing is your smartphone is the tracker. You don't need any additional hardware at all to do this. And your smartphone can also be the remote control for the system as well. So everything that you need, basically, you already own, except this box. This is the last piece.

Joey Daoud:

Cool, and so you would mount your smartphone to your camera to track it.

Rene Amador:

Exactly.

Joey Daoud:

So it's not like you're recording on your phone.

Rene Amador:

Exactly.

Joey Daoud:

You are using it as a separate tracker for your camera.

Rene Amador:

Now that is coming at a later point. We're going to unlock the ability to do tracking, recording, and AR overlays using your smartphone. So that'll actually allow you to not only use the screen as the composited CG backdrop, but also have an AR overlay that extends the edges of the screen in that virtual environment. So basically, as you pan off the edge of the screen, an AR layer would fill in the rest of the scene.
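
Conceptually, that set extension is a per-pixel decision: inside the region where the physical screen appears in frame you keep the live camera feed (which already contains the displayed backdrop), and outside it you show the CG render of the same virtual scene. Here is a rough sketch of that masking step; the pinhole projection, screen corners, and frame sizes are all made up for illustration.

```python
# Rough sketch of the AR set-extension idea: keep the live camera feed where
# the physical screen lands in frame, and fill everything outside that region
# with the CG render of the same virtual scene. The projection model, screen
# corners, and image sizes here are invented.
import numpy as np

def project(points_3d: np.ndarray, f: float, cx: float, cy: float) -> np.ndarray:
    """Project 3D camera-space points (meters) with a simple pinhole model."""
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    return np.stack([f * x / z + cx, f * y / z + cy], axis=1)

def quad_mask(h: int, w: int, quad_2d: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels inside a convex quad (corners given in order)."""
    yy, xx = np.mgrid[0:h, 0:w]
    inside = np.ones((h, w), dtype=bool)
    for i in range(4):
        x0, y0 = quad_2d[i]
        x1, y1 = quad_2d[(i + 1) % 4]
        # Half-plane test: with this corner ordering, interior pixels fall on
        # the non-negative side of every edge.
        inside &= (x1 - x0) * (yy - y0) - (y1 - y0) * (xx - x0) >= 0
    return inside

if __name__ == "__main__":
    h, w = 540, 960
    camera_feed = np.full((h, w, 3), 0.2)  # stand-in for the live video frame
    cg_render = np.full((h, w, 3), 0.8)    # stand-in for the CG frame

    # Invented screen corners in camera space, ordered around the quad.
    screen = np.array([[-1.0, -0.6, 3.0], [1.0, -0.6, 3.0],
                       [1.0, 0.6, 3.0], [-1.0, 0.6, 3.0]])
    corners_2d = project(screen, f=800.0, cx=w / 2, cy=h / 2)

    mask = quad_mask(h, w, corners_2d)[..., None]
    composite = np.where(mask, camera_feed, cg_render)
    print("screen pixels:", int(mask.sum()), "of", h * w)
```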

Joey Daoud:

Okay, so you kind of have like a set extension, you could have like a small little green screen, but then still be like in a bigger space.

Rene Amador:

Yes.

Joey Daoud:

Look like you're in a bigger space.

Rene Amador:

Yes. And this is something we've already done at the professional level. We're showing it off at the Canon booth. That's called ARFX Lens. It's a plug-and-play lens emulation solution. And we're going to get that solution into the smartphone. That's kind of the last step here, having everybody have access to these tools off of their existing hardware.

Joey Daoud:

Cool. As far as your camera feed, do you have to bring it back into the box to composite in real time, or-

Rene Amador:

Excellent question. So the way that we're conceiving of how to do virtual production, that's actually not necessary. We have the tracking data. And then on the smartphone, we would have all the optical data needed to complete that composite, and that's it. That's everything that you would need. So there's no feed that is required to come back into the virtual production box.

Joey Daoud:

So then you would just composite it when you're editing?

Rene Amador:

You're compositing literally in-camera. As you're recording, you're getting final pixel.

Joey Daoud:

Walk me through how that- let's break that down.

Rene Amador:

So there's going to be two phases. The first phase is in-camera effects. So you've got the tracker on top of the camera. You calculate the offset from the tracker down to the sensor or the nodal point of the camera. As you're shooting against your backdrop, everything will be captured in-camera, in-lens, just like The Mandalorian or the work that we've done for Muppets Haunted Mansion, everything like that.
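
In other words, the virtual camera should not sit where the tracker sits; you measure the rigid offset from the tracker puck down to the sensor (or nodal point) once, and then compose it with every tracked pose. A minimal sketch of that composition with 4x4 transforms, where the offset and pose values are made up:

```python
# Minimal sketch of the tracker-to-sensor offset: the tracked pose tells you
# where the puck is, so each frame you compose it with a fixed rigid offset to
# get the pose of the camera's sensor / nodal point. The numbers below are
# invented for illustration.
import numpy as np

def pose(yaw_deg: float, translation_xyz) -> np.ndarray:
    """Build a 4x4 rigid transform from a yaw angle and a translation."""
    theta = np.radians(yaw_deg)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = translation_xyz
    return T

# Pose of the tracker in stage space (this would come from the tracking system).
world_T_tracker = pose(30.0, [1.2, 0.4, 1.5])

# Fixed offset from the tracker down to the camera sensor, measured once.
tracker_T_sensor = pose(0.0, [0.0, -0.02, -0.12])  # e.g. 2 cm back, 12 cm down

# The virtual camera should follow the sensor, not the puck.
world_T_sensor = world_T_tracker @ tracker_T_sensor

np.set_printoptions(precision=3, suppress=True)
print(world_T_sensor)
```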

Joey Daoud:

Are you filming on a green screen or are you filming on a wall?

Rene Amador:

No, you're going to be using a 4K TV, a projector, or an LED video wall.

Joey Daoud:

OK, so that helps clarify. I was thinking green screen, but we're not talking about green screen, so we're talking about, what is it something like-

Rene Amador:

In-camera effects. Yeah. So this has been our message to independent filmmakers: look, green screen has existed for a while. And there are many people that have been able to work and make those results incredible. We all know that, right? At the independent level, at the professional level. But here's our message. With in-camera effects, the hurdle to get to professional results is a lot lower than green screen. There's also less processing power involved, because you don't have to composite and render on the device. It is just doing the rendering alone. The compositing is happening in-camera. So our argument is that this is actually a more accessible version of virtual production than green screen.

Joey Daoud:

OK, got it. Yeah, that makes a lot more sense. And because you're not- or it's not running the- I don't know how to differentiate. It's running the game version of Unreal.

Rene Amador:

Yeah, it's rendering the Unreal scene.

Joey Daoud:

What are the set options? Like, can you kind of go to Unreal's library and bring stuff in? Or is it sort of like a set of locations and looks and scenes that ARwall has?

Rene Amador:

Yeah, so when you go to the Unreal marketplace and you download those scenes, there's still a good amount of development and lighting that needs to be done for those scenes, and that's where people get hung up. When we sell to film schools, for example, one of the problems that happens is they've got these great LED screens. They've got the software, everything. But then there's only one or two filmmakers in a class of 20 that feel comfortable in Unreal Engine. So they end up only trickling out projects a little bit per semester using this technology. So this solution of using pre-lit, pre-configured, pre-polished scenes specifically for virtual production is meant to jump over that final hurdle, which is: you've got the tech in your hands, now how do you solve the creative content problem? And this is a solution for that. So we have our own marketplace of pre-compiled, pre-configured, pre-lit scenes.

Joey Daoud:

Okay. Cool. And once more, the pricing breakdown and the availability window?

Rene Amador:

Yeah, so this is available to pre-order right now, beginning at NAB. And it is $4,199 for the pre-order pass, or about $380 a month with our financing partner. If you pre-order, it will also include a one-year subscription to the Essential Scene Pack on the marketplace. That's going to get you started. Within one hour, you're going to be shooting with this thing.

Joey Daoud:

All right. And that is it for this quick episode of VP Land. If you want to actually see the interviews, these are pulled from videos, so we have all the videos of these over on our channel. You can go check that out. We've got all the links in the show notes, wherever you're listening to this in your podcast app. If you enjoyed this and you are not subscribed and you want more content like this, be sure to subscribe in your podcast app of choice, and then also head over to VP Land, the newsletter, to get our twice-weekly email in your inbox. Thanks again for listening, and I'll catch you in the next episode.