STEREOSCOPE
Welcome to the STEREOSCOPE Podcast, the place where we dive deep into everything immersive video. From VR180, 3D360, and spatial video to volumetric capture and photogrammetry, we cover it all. Our show is dedicated to the latest news, best practices, and workflows essential to the immersive video community. The VR industry has been a major force behind the rapid growth of this medium, and we are excited to showcase how it impacts immersive video. Every episode, we feature two videos created by our talented community members to inspire and showcase the amazing work being done in this space. Join us on the next phase of cinema as we gaze through the STEREOSCOPE.
Meta Approaches Hypernova! With James Cameron, Apple Immersive, and Blackmagic's $30K Camera Revolution
We explore the evolving landscape of immersive technology, from James Cameron's partnership with Meta to Blackmagic's groundbreaking new camera system. The tech world is shifting toward more integrated, powerful tools for creating and experiencing immersive content.
• James Cameron acknowledges VR headsets as the optimal medium for 3D movies, eliminating issues like crosstalk and brightness loss
• Meta's "Hypernova" smart glasses coming in late 2025 ($1,300-$1,400) with a monocular display in the lower right corner
• Apple releases Immersive Video Utility for importing and reviewing Vision Pro footage with multi-device playback capabilities
• Third-party developer iVRy enables eye tracking on PSVR2 for PC, allowing for dynamic foveated rendering
• Blackmagic unveils the $30,000 Ursa Immersive camera with dual 8K sensors, 90fps recording, and complete Apple Vision Pro integration
• The camera's metadata-driven pipeline preserves quality through the entire post-production process
Episode Introduction
Speaker 2Hi, welcome to the Stereoscope Podcast, number 13. Lucky number 13. Ooh, yikes. I know, right? So it's a little bit of a lighter news week. I think everyone is sort of too terrified about what's going on around us to really do anything.
Speaker 1I'm terrified beyond the capacity for rational thought. It's a healthy place to be.
Speaker 2Yeah, the slores, and all that being roasted.
Speaker 1No, he's actually. You don't know what we're talking about. You'll never know what we're talking about.
Speaker 2The Gen Xers probably do All right, so I'm Byron.
Speaker 3I'm Anthony.
James Cameron and Meta's 3D Video Future
Speaker 2I'm Sean, and, like I said, we have a little bit of a lighter episode, but we have some more in-depth topics this time. So we're going to start off with James Cameron and Boz, the CTO of Meta, talking about 3D video on Boz's podcast, Boz to the Future. A podcast talking about a podcast. Podcast-ception. How meta can you get? Indeed. So this one is interesting because, obviously, James Cameron is a luminary in the 3D field.
Speaker 1He's been around for four million years, yeah.
Speaker 2We talked about him a couple episodes ago, about how he got started with T2 3D back in the day and then obviously went on to Avatar and the never-ending Avatar sequels. This is relevant because late last year Meta announced a partnership between Meta and James Cameron's company, Lightstorm. And over time we've sort of figured out that Lightstorm is focused on 3D, so not immersive video, but what Apple calls spatial video, which is just side-by-side.
Speaker 1Once again, if we rewind the clock and look at the terminology evolution of virtual reality, it has a lot of names: virtual, mixed, spatial.
Speaker 2Yeah, and a lot of the names that people use are conflicting and competing names.
Speaker 1Which is why we are so here for you and so confused.
Speaker 2But we help you define reality effectively. So what is James Cameron?
Speaker 1So what Light Storm?
Speaker 2Entertainment is doing, and why this is relevant, is that they're trying to build some sort of pipeline, some sort of system, to make 3D video more accessible. Because what James Cameron realized, and what this podcast is largely about, is him coming to the realization that headsets are the end game for the types of content he creates. Compared with watching in a theater, he talks about the experience of authoring this content on a perfect screen. Perfect resolution, perfect fidelity.
Speaker 2Yeah, like a Dolby Vision screen, basically a laboratory with the best luminance values and the brightest screen you could possibly get for watching 3D movies. So they're in the lab watching these 3D movies in an absolutely optimal scenario, and then they go into the theater.
Speaker 1Then they walk down to the local Regal, get their popcorn, and go, "I can't see."
Speaker 2Yeah, and obviously that was a very frustrating experience for him. So when he first found that VR headsets were being used to watch 3D movies, there was an aha moment, and now he has a new target market for his 3D movies. And, you know, I've always been a fan of 3D movies.
Speaker 2The first 3D movie I ever saw was Captain EO 3D, back at Disneyland in the early 90s. Then Honey, I Shrunk the Kids 3D, which was also one of the first 3D movies I saw. And then, living in San Diego, going to the Reuben H. Fleet Space Center, they had 3D movies, and eventually I saw T2 3D. So I'm a big fan of 3D movies, but I've always felt the experience of watching one at a movie theater, except under very specific circumstances, which I'll get into, was incredibly disappointing. Especially given the type of 3D and the type of rotoscoping they were doing. For anyone who doesn't know, rotoscoping here is the process by which you create a 3D depth map from a shot, or fix depth maps for moments when the stereo isn't completely perfect. So a lot of 3D movies are fixed in post, but the worst ones are done completely in post. Historically, they're not shot in 3D at all.
Speaker 1Example? Good example.
Speaker 2Clash of the Titans. That is a famously bad 3D post-conversion.
Speaker 1Not the best adaptation of a classic work of Greek tragedy.
Speaker 3You can tell when you're watching a badly done movie. It just doesn't look right, and what you can tell is that there's not a sense of...
Speaker 2So in the worst 3D post-conversions, it feels like there are distinct layers, but there is no actual depth. It's like a paper cutout in front of a paper cutout, with no nuance to the depth, Clash of the Titans being the most widely maligned example of this. However, when you get true shot 3D, with a bespoke rig and separate cameras, usually that's pretty good, but sometimes it still needs to be cleaned up with rotoscoping. So what Cameron is talking about here is partially that that process of rotoscoping is incredibly time-consuming, and so Lightstorm Entertainment is working on this merging of post-production tools and shooting with bespoke 3D rigs, and he's talking about using AI to sort of merge the fields, yeah.
Speaker 2And there are some tools that already exist that we've tried out.
Speaker 2Yeah, you've used it more than I have. Owl3D is the only one I've actually used, which works really, really well until it doesn't, but it mostly works really well. I tested it recently on Revenge of the Sith, and it worked for certain shots, but there were some shots with smoke and that type of thing which it absolutely didn't know how to handle, and it created weird artifacts, like depth on things that shouldn't have depth, or the smoke had weird holes in it. It's very interesting.
Speaker 3My experience, more specifically, was doing actual scenes and things like that that I'd shot in 2D. Anything that was hanging or protruding off of the body would get thrown off, like if you were wearing glasses and you turned your head, or like a hair strand right here, depending on...
Speaker 3It just depends. It was always foreign objects, like a bracelet hanging off of somebody's wrist; it gets sent to the background. Same thing with part of the glasses, and so you have this break. But outside of that, if there's nothing hanging off the person, the depth is dramatically nice. It's very pleasing, actually.
Speaker 1So your lightsaber would be an issue? Yeah, probably.
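The artifacts described above fall naturally out of how 2D-to-3D conversion works. A minimal sketch of depth-image-based rendering (DIBR), the general idea tools like Owl3D build on (the specific numbers and function here are illustrative, not Owl3D's actual pipeline): shift each pixel horizontally by a disparity proportional to its estimated depth, and the pixels nothing maps to become exactly the "holes" mentioned around hair and smoke.

```python
# Toy depth-image-based rendering (DIBR), pure Python. Given one image and
# an estimated depth map, shift each pixel horizontally by a disparity
# proportional to its depth to synthesize a left/right stereo pair.
# Real converters additionally inpaint the holes this leaves behind.
def synthesize_stereo_pair(image, depth, max_disparity=2):
    """image, depth: 2D lists of equal size; depth in [0, 1], 1 = near.
    Returns (left, right, holes), where holes marks left-view pixels
    that nothing mapped to (disocclusions)."""
    h, w = len(image), len(image[0])
    left = [[None] * w for _ in range(h)]
    right = [[None] * w for _ in range(h)]
    for y in range(h):
        # Paint far-to-near so nearer pixels win where they overlap.
        for x in sorted(range(w), key=lambda x: depth[y][x]):
            d = round(depth[y][x] * max_disparity)
            if 0 <= x + d < w:
                left[y][x + d] = image[y][x]
            if 0 <= x - d < w:
                right[y][x - d] = image[y][x]
    # Unfilled pixels are disocclusion "holes": the artifacts that show
    # up around hair strands, smoke, and objects hanging off the body.
    holes = [[v is None for v in row] for row in left]
    return left, right, holes
```

With a flat depth map there are no holes; put a sharp foreground/background step in the depth map and holes open up right at the edge, which is why dangling bracelets and hair strands break the conversion.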
Speaker 2Oh, and I guess I got off track there for a second. What Cameron was saying was that the 3D in theaters isn't great. That's why they need rotoscoping, and why they're working toward getting a distinct feed per eye. That's the big advantage. One image per eye, exactly.
Speaker 3The big advantage is that you don't have crosstalk. In theaters, if you don't know, watching 3D movies, or even at home, you either have these polarized glasses or shutter glasses, which is the best version of it. Either way, you're trying to force your brain into thinking it's 3D by basically doing this very quickly, yeah, and it always creates eye strain.
Speaker 2It never really works that well, and every other method also dramatically decreases the brightness. That's the other big problem. But with a VR headset you get the full image, you get the full brightness.
Speaker 3It's an individual feed per eye, so you just don't have the eye strain.
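The brightness-loss point above can be made with some back-of-the-envelope arithmetic. All the efficiency numbers below are ballpark assumptions for illustration, not measured specs of any real system, but they show why glasses-based 3D looks so dim compared with a per-eye display.

```python
# Rough, illustrative estimate of why glasses-based 3D looks dim.
# Every efficiency figure here is an assumed ballpark, not a spec.
def relative_eye_brightness(time_share, optics_transmission, glasses_transmission):
    """Fraction of the projector's 2D brightness reaching one eye."""
    return time_share * optics_transmission * glasses_transmission

# Single-projector polarized 3D: each eye's image is on screen roughly
# half the time, and both the polarizing filter stack and the glasses
# absorb a chunk of the remaining light.
polarized = relative_eye_brightness(0.5, 0.7, 0.7)   # ~25% of 2D brightness

# Active shutter glasses: each eye is blocked half the time, and the
# LC shutters transmit well under half the light even when "open".
shutter = relative_eye_brightness(0.5, 1.0, 0.35)    # ~18%

# VR headset: a dedicated display per eye, no time sharing, no glasses.
headset = relative_eye_brightness(1.0, 1.0, 1.0)     # 100%
```

Under these assumptions each eye in a theater gets on the order of a fifth to a quarter of the 2D brightness, which matches the "walk into the Regal and go, I can't see" experience.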
Speaker 1So wait, wait, wait, wait, wait. Back to the T800. James Cameron just got on Boz and talked about his new workflow, his new vibe, full picture, both eyes.
Speaker 2Well, that's why he's seen VR as the future is because it's the optimal way to watch it.
Speaker 1In a theater?
Speaker 2No, not in a theater.
Speaker 1It's not optimal In James Cameron's vision.
Speaker 2The best 3D I ever saw was at a Dolby Cinema theater when Star Trek Beyond came out in 2016. It was the first Dolby Vision 3D I'd ever seen, and it was incredible, because Dolby uses laser projection with two distinct projections, one per eye. So it has the maximum brightness you can get, and it was truly immersive. However, it was still using polarized glasses, so there was a little bit of crosstalk. Eye strain, yeah. It looked amazing, but then I saw VR versions of 3D movies and I was like, oh wow.
Speaker 3This is so much more. Similar thing, but with Gravity. Gravity was the one where I was like, wow, this is great, this is well done in 3D. But still, you kind of have a little bit of a headache at the end.
Speaker 2I mean, it is interesting to hear how much talk about AI was in this. It makes sense, because James Cameron has invested in some AI companies, and obviously Meta has their own AI implementation, which they, as everyone is, are leaning very hard into. And there is some talk in this video about uses of AI for creative purposes. I'll be very honest: I do not agree with where they land on it.
Speaker 3With generative.
Speaker 2With generative. Yeah, I think there was a little bit more nuance in there than I expected them to go with, but I still vehemently disagree with the way that they sound like they're okay with using it.
Speaker 1And you can vehemently disagree, as well, with the link that we may or may not provide in the comments.
Speaker 2Yeah, I mean, everyone has a lot of different takes on it. I mean being creatives. I think we have very specific takes on it, Certainly, I don't know if you want to get into it here, but I think I mean we've made our feelings known over the past several episodes.
Speaker 3People can figure it out.
Speaker 2I just generally don't think that we should be using AI to create art unless it is in an assistant mode.
Speaker 3Yeah, it's the way that it works right now, too. Especially because it's not really AI, it's more like intelligent copying, really.
Speaker 2And that's one of my biggest issues with what they get into: they talk about the inputs rather than the outputs, and they don't have very nuanced ideas about the people they're hurting by stealing the input. Right, yeah.
Speaker 3But really, what we keep calling AI is like brute-force creativity, and so you have to, you know, boil the ocean. I mean, what was it Sam Altman recently said? It costs tens of millions of dollars because people say please and thank you.
Speaker 2So maybe the model is not so good if it's that inefficient. Yeah, and, mind you, I'm guessing that's because hundreds of millions of people are asking it.
Speaker 3I mean, I don't think they actually have that many people using it. I think it's spread literally throughout the entire world. They do have hundreds of millions of GPUs all burning. Yeah, right. Okay, maybe tens of millions. Because I don't think their user base... Certainly the politeness is from around the whole world, maybe not in the United States.
Speaker 3The thing is, there's a lot of other AIs out there, though, right? That's true, yeah, and it seems like their people are a lot more polite. Sure, but it's just a good example of how inefficient the whole thing is. Getting into DeepSeek, they proved that you could do almost the exact same thing with a tenth of the resources. Yeah, but with a lot less of the training wheels and safety guards. How so?
Speaker 2Oh, there was a lot of evidence that DeepSeek doesn't have the safety guards that the other models do. It could be propaganda, but who knows? I mean, that's the thing.
Speaker 3They need to besmirch it because they're like wait a minute.
Speaker 1Yeah, it speaks to the point that deregulated, unregulated AI, in an industry that's struggling, seems like it could Skynet us back to the... Well, it's going to Skynet the industry, for sure.
Speaker 2I mean, we're already seeing that.
Speaker 1Yeah, yeah, we're already inside the matrix.
Speaker 2Yeah. So it was just interesting to see this. I think there's a lot of good in what they're talking about in this video, and some stuff where I'm just like, come on, guys, James Cameron should know better than this. In some ways. Boz, I'm not surprised at all. But especially when they're talking about the filmmaking specifically, I think it's very interesting to see how much they're relying on VR headsets to sell 3D video, and I think we realize that these headsets are going to be used more and more over time for video watching.
Speaker 3I mean, that's why this podcast exists. Yeah, it's good to see James Cameron has finally reached the conclusion that we did four years ago.
Meta's Smart Glasses with Display
Speaker 2Yeah, exactly. Welcome, James, we helped. Anyway, I think we can probably move on. We're going from Meta news to more Meta news.
Speaker 2So this is sort of not surprising. It's a little bit like going back to our XR glasses video from a couple months ago: Meta's new smart glasses with a display. The code name is Hypernova. Sorry, I'm fighting a cold, so my head is a little swimmy.
Speaker 1And soon.
Speaker 2I will have that cold. Yeah, we all will. It's our cold. The cold. So they're expecting it to launch in late 2025, between $1,300 and $1,400. That's a lot of money. Yeah, that's like as much as a smartphone.
Speaker 3Pretty much. Yes, not just pretty much, that is up there with a flagship. Our Flag Means Death. But supposedly they're heavily integrated with AI, and they're also going to have cameras.
Speaker 2They're also going to have gesture-based controls, but they'll also have sort of a swipey UI, sort of like the Mac dock, or like if you've ever used the Oculus interface. I doubt that it'll have hand tracking, but you'll, you know, swipe around on your temples or something like that. But the reason why this is interesting is that there's going to be a monocular display in the lower right-hand corner. I was sort of wondering when Meta would ship a display-glasses form factor, because all the other companies seem to be having very quick success with these glasses form factors, specifically XReal, and obviously the Meta Ray-Bans have been a gigantic hit for them. I didn't think it would be a monocular display.
Speaker 1Why not? Why not monocular, which is the opposite of binocular?
Speaker 2Well, because why do you want monocular when you can do binocular? It's like they're trying...
Speaker 3They're like, just don't call it Google Glass, please.
Speaker 2Yeah, well, and that's the thing: when you think about it, the closest thing to it is Google Glass, which famously had a small display up in the upper right-hand corner, and which launched... what was that, 2011?
Speaker 12013? Yeah, somewhere around there.
Speaker 2But then there was the glasshole thing, which I don't think is really going to be as much of an issue these days.
Speaker 3I mean, it helps that they package them up nicely, right? These don't look like Google Glass did.
Speaker 2I mean it helps that they package them up nicely right Well, and also we live in a world now where everyone has their phone camera out almost at all times. If you see a Gen Z that's out in the world, there is a very good chance that they have their camera out.
Speaker 1So a monocular display on the right side. Is it configurable? Can you put it on the left if you're like?
Speaker 2Oh gosh, no. It's going to be baked into the hardware.
Speaker 1Oh no, that's going to definitely take out a certain percentage of.
Speaker 2But I do think they're trying to figure out what the potential market for these display glasses is. How is it going to work? Really dipping their toe in before they go full binocular, because once they get into binocular, there's going to be an expectation there. And also, these aren't like video glasses, right? These are smart glasses that have a display.
Speaker 3Yeah, it's designed around input, not output. Yes, exactly.
Speaker 2So this is like integrating social features into your life, having a small heads-up display for turn-by-turn directions, stuff that we talked about on the other episode.
Speaker 1But this is not like the XReals, where you're watching movies on them. This is for driving, working, being out in the world. I imagine it being used for DoorDashing, not even joking, because there are hundreds of times where I've seen a DoorDash delivery guy, girl, gal, robot just not be able to find the door because it's dark, or it's a complex, or you live in a nice development and it's really difficult to find. If it could just mean less wandering in the parking lot and more hot food, I think that's what they're trying to do. I mean, that's the market as I see it, among others.
Speaker 3I mean, I saw one video the other day of a school bus driver who was constantly dealing with cars passing. You know, they had the little stop sign out and people were still blowing by, and they're like, oh, record it with my Meta glasses. It was really interesting, because it's really more like a dashcam for your face.
Speaker 2Yes, yeah. I've been seeing the implementation of the Meta Ray-Ban glasses with a lot of people at conferences or at events where they're not allowed to bring in a camera; everybody still brings their smartphone. So, like, the Nintendo Switch 2 launch happened in New York City a couple weeks ago, and Wolf Den, one of my favorite video game YouTubers, wasn't allowed to bring in cameras, but he brought his Ray-Bans and just recorded his interaction with the console. So he recorded himself playing it, and the footage looked great. It was a very good implementation of that tech for this specific use case. It makes me sort of wonder, could we get stereo? Are there any stereo versions of these?
Speaker 3I mean, it's a little bit more challenging, because by the nature of this type of design, your IPD will always be off.
Speaker 2It'll be a little bit too wide, yeah. So unless you put the cameras here or here, which would create a hypostereo effect.
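The "IPD will always be off" problem has a simple toy model. Perceived scale in stereo roughly tracks the ratio of your eye separation to the camera baseline, so temple-mounted cameras, which sit wider apart than your eyes, make the world read as shrunken. All numbers here are illustrative assumptions, not specs of any real glasses.

```python
# Toy model of stereo camera baseline vs. viewer IPD (numbers illustrative).
# Disparity scales with the capture baseline, so footage shot with a
# wider-than-IPD baseline reads as miniaturized ("hyperstereo"), while a
# narrower baseline flattens and enlarges the scene ("hypostereo").
def apparent_scale(ipd_mm, baseline_mm):
    """Rough scale factor at which the world appears when footage shot
    with baseline_mm camera separation is viewed with ipd_mm eyes."""
    return ipd_mm / baseline_mm

AVG_IPD = 63.0                                   # average adult IPD, mm
glasses_cams = apparent_scale(AVG_IPD, 140.0)    # temple-mounted cameras (assumed width)
matched_rig = apparent_scale(AVG_IPD, 63.0)      # baseline matched to IPD

# glasses_cams comes out at 0.45: the world reads at roughly half scale.
# matched_rig comes out at 1.0: depth reads naturally.
```

This is why a proper stereo capture rig tries to match the human IPD, and why cameras at the temples, or pushed inward toward the nose, both distort the sense of scale.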
Speaker 1That's where their roadmap, which was leaked earlier this year, kind of reveals that: all that fine-tuning, all that data they need to figure out. What is the solution to the fact that we all have different eyes, faces, and brains? If the entry barrier is, "I don't want to put those on, because I hear people throw up all the time," and you can say, "No, no, this is different," then okay, maybe I'll give it a shot. Why? Because I hate looking at my maps, or I want to go into Comic-Con but I don't want to take off these prescription glasses.
Speaker 2It's funny that you mention Comic-Con, because when I was at the Star Trek convention last year, the first night we were there, I saw a guy wearing Meta Ray-Ban glasses, and he was like, "I'm recording right now, just FYI." I was like, oh whoa.
Speaker 1Okay, that's cool. Did he go to see the trailer for Predator?
Speaker 2Well, this was at the Star Trek convention, not Comic-Con. Okay, so this was Vegas, the real one. I mean, Comic-Con is real. They both exist in the same universe. Anyway, moving on. So supposedly Meta is already working on a successor to the Hypernova which will have a binocular display, so they are shooting for it long term, but apparently they just haven't cracked that code yet, which, I don't know, they're a $100 billion company. I feel like they could do it. They're also working on some Oakleys, right?
Speaker 3Yeah, well, spending all that money on AI instead? Yeah, all right.
Apple Immersive Video Utility
Speaker 2Yeah, I guess it's probably time to move on. Okay, so this is interesting: Apple Immersive Video Utility. We covered something sort of similar to this a couple months ago, but that was external, right? A third party.
Speaker 1Yeah, that was a third party.
Speaker 2It was an Apple Vision Pro app. Is it the spatial one, or is it a different thing? No, it was like CapCut for Vision Pro, yeah.
Speaker 1Which episode was that?
Speaker 3I don't remember, darn it. I think it was 11. Yeah.
Speaker 1Okay, algorithm. Please link this version to the 11th stereoscope.
Speaker 3That's me. I'm going to be editing this.
Speaker 2Yeah, you are the algorithm. Anthony is our homegrown algorithm.
Speaker 1He's our basilisk, he's an organic algorithm, if you will.
Speaker 2Anyway, so Apple has their own new app that they released, Apple Immersive Video Utility, and this is sort of like iMovie, but for Apple Vision Pro. Very similar functionality: importing, organizing, and reviewing the footage that you take with your Apple Vision Pro or, I wonder, other video?
Speaker 3So does it only edit spatial video and not immersive?
Speaker 2No, it says it also does up to 8K 180, so it looks like it's a general VR180 and spatial video tool. It can also connect directly to the Vision Pro to stream immersive videos. That was the major thing. Interesting, yeah, oh wow. With options for synchronized playback and multi-device viewing sessions. That's fascinating. So is that like co-watching? Exactly.
Speaker 1it's like. It's like the movie theater, but you're not even in the same room. Necessarily. It's synced. It's so different than what we're used to. Have you seen the last episode of So-and-So? Yeah, but I didn't see it the same time you did, whereas flashback 10, 15 years ago we would have to go to the bar to see the finale.
PSVR2 Eye Tracking on PC
Speaker 2Supposedly this is literally just for Apple Immersive Video, and Spatial is included. Interesting. Next up: iVRy has enabled eye tracking on PSVR2 for PC. So in the last episode we talked about the price cut for the PSVR2, which has sort of rebooted the PSVR2's existence, and this solidifies it. One of the things we talked about last time was that the PSVR2 was lacking features on PC. It was still a good buy, because there is very little else on the market that compares at the price point, but I was maligning the fact that it didn't have eye tracking, it didn't have HDR, it didn't have the haptic support. Now they've enabled both eye tracking and limited HDR support, though that one comes with a major caveat.
Speaker 1But eye tracking does seem like... that's a step onto the playing field, being like, hey, I learned how to lay up, is that okay?
Speaker 2That is a huge feature to have at this price point. So iVRy is a third-party dev who got this working through their own custom firmware, which they sell, with driver support through Steam. So you can buy this, the custom firmware.
Speaker 1So it's $399. $399, cut by $150. Okay, yeah, multiplied by the Greek value of the tariff, divided by the exqualification of the bloviated goblendering. It's probably going to be...
Speaker 2So technical Sean, yes.
Speaker 1Shit. Pause the tariffs for 90 days, yeah.
Speaker 2So this implementation is custom firmware, so it's not going to work out of the box unless you get some extra juice in there, and it's not going to be supported by a lot of things, because it hasn't existed thus far. But what this will enable is things like eye-tracked foveated rendering, which we have talked about previously. Yes, my favorite words. That's dynamic foveated rendering. This could also allow for eye-tracking control of your interface, though there currently isn't any implementation of that in any of the various interfaces on PC. I'm guessing somebody will most likely code that into SteamVR, because pretty much everything gets integrated into SteamVR.
Speaker 1And while they're at it, they'll try to work with the luminance levels and try to get HDR to fix with 10-bit, yeah, so that is another thing that is going to have to be added.
Speaker 2that is not currently supported is that they got the 10 bit color working for HDR, but currently SteamVR does not support custom luminance levels, so half of the implementation of the HDR is broken.
Speaker 1It's up to you to fix it.
Speaker 2Yeah, so literally somebody in the community is going to have to figure out how to implement or get Valve which is actually the more likely scenario to get Valve to support dynamic luminance levels in SteamVR.
Speaker 3They'll probably do it.
Speaker 2They most likely will, why not?
Speaker 3Just more eyes on their work.
Speaker 2Well, and here's the other thing, it's almost certainly going to be implemented into the Deckard in some capacity.
Speaker 1And it's all cross-play, it's Steam.
Speaker 3That's the whole point. They went either way. That's the beauty.
Speaker 2Yeah, and you know the Steam Deck has an OLED panel, so there is... I don't know if it supports HDR, though. Interesting. So Valve is going to have to work this stuff out eventually, most likely for the Deckard. This is all speculation, so they'll probably end up putting it into SteamVR at some point, but we don't know. I hope Sony stays in their lane with this. So the only thing that's lacking thus far is the haptic support for the headset. I'm not sure if they've got it working on the controllers, actually.
Speaker 1But does it work on the headset? No, it does not work on the headset. And I did not know until recently that it even had haptics.
Speaker 2That the headset had haptics, yeah. Well, some people have said, and I haven't played Call of the Mountain on PSVR2, but in that game it shakes your head when you get hit. It sort of sounds gimmicky, yeah, but some people said that it heavily increased immersion.
Speaker 3Yeah, to be fair, everything sounds gimmicky. A lot of the stuff that we're talking about sounds gimmicky until you experience it.
Speaker 1I like hashtag gimmicky.
Speaker 3Gimmicky. It goes along with bloviated. We got a lot of mush mouths today. Bloviated hashtag, gimmicky.
Speaker 2Myself included, yeah. So it's interesting that this has been enabled, because it very much solidifies the reboot and resurgence of the PSVR2. We're going to see a lot of people purchasing the headset for PC, and I think I finally will too, because now that they've got eye-tracked foveated rendering, that's going to help with performance on PC for certain games. We actually didn't include this, but it was announced that Microsoft Flight Simulator is permanently adding directly integrated dynamic foveated rendering into the game. So now, at a low level, it will support dynamic foveated rendering, which is great, because Microsoft Flight Simulator is ridiculously demanding.
Speaker 2It requires a lot of resources to play that game correctly, including a pilot's license, and so you really need to eke out every frame on that game, and adding dynamic foveated rendering will help with that quite a bit.
Speaker 3The question is, will you be able to tolerate going back in time and dealing with Fresnel lenses? We always forget about that big caveat.
Speaker 2I will say, I booted up my PSVR1 a couple months ago and used it for the first time in a really long time, and I was surprised at how good it still looked. Mostly because when I play on PC, I've gotten so used to streaming and the streaming artifacts, the built-in compression. It just has that look to it, right.
Speaker 2So you're used to flying blindfolded. I'm used to flying with a little veneer of blockiness over everything, at least when I'm playing on PC. When I'm just playing Quest 3 stuff, it looks great. But you also have a pilot's license. No, oh, it expired. It expired like 15 years ago. Fine, all right. Moving on: Blackmagic finally fully revealed the Blackmagic Ursa Immersive at the NAB show, which is the National Association of Broadcasters show. It's a large broadcasting convention.
Blackmagic Ursa Immersive Camera Revealed
Speaker 3It is.
Speaker 2It's a trade show.
Speaker 2Yeah, it's just another place camera companies announce cool stuff, and it's usually the big one of the year for cameras. And so we've gotten peeks and glimmers of information from the various influencers over time, and still, to this day, that's the only place we've seen video of the device. I was actually looking for some in-house videos from Blackmagic of the camera, and they haven't posted any yet. There are promotional stills, which you're seeing behind you, but no official videos, which is frustrating, because I wanted to include some, and I try not to include other influencers' third-party content in our videos, because that would destroy the algorithm.
Speaker 1I don't know.
Speaker 2But we do have some high-quality stills of the camera behind us.
Speaker 1And it looks really cute, if I must say. It's a chunker.
Speaker 2I can tell you that.
Speaker 1How much is it?
Speaker 3$30,000.
Speaker 2$29,995.
Speaker 3That's like three tariff anyway.
Speaker 1So we don't know exactly. Yeah, how much is a camera that James Cameron would want to use in the year 2025?
Speaker 2Well, and we were talking about this earlier: a professional cinematic camera is $100,000.
Speaker 3Yeah, I mean, there's a reason you use the term "camera package," because there are always these components. What's impressive about this is that you basically have a ready-to-shoot situation out of the gate for $30K, whereas with something like a RED Raptor, at $25K or somewhere in that range, you've then got to add batteries and lenses and adapters and wireless, media cards, all that stuff. This thing's got eight terabytes of storage built in. It's got 10G networking built in.
Speaker 2It's got the lenses built in. And the eight terabytes, that's hot-swappable, right? Actually, I don't know.
Speaker 3I mean, you're not going to be swapping it while recording, but it's hot-swappable in that you don't have to shut the camera down to swap it out. It's basically an array of M.2 drives, and they have a 16-terabyte option available.
Speaker 2Yeah, and we learned a couple months ago that you can work directly off of the drives.
Speaker 3The 10G networking that's in it means you plug it in, it's on your network. You can start pulling footage directly off of it. That's very impressive.
Speaker 1That's a big deal.
Speaker 3In real time. That's a big, big, big deal. We're using 10G, but I don't have SSD storage, so even when you hit 10G...
Speaker 2The bottleneck, unless you have SSDs, is the drives, which is kind of crazy, as we've learned. So you could theoretically plug this into, like, a RAID server.
Speaker 1You could plug it in and plug and play.
Speaker 3You don't need a RAID server.
Speaker 2It is its own RAID server.
Speaker 3You plug it into your switch, you bring it up on your computer, and you're editing, yeah.
Speaker 2So we'll just go through a little bit of the specs, because it's pretty impressive. It's got dual 8K sensors, 8K stereoscopic recording at 90 frames per second, which is the highest I've ever seen, and 16 stops of dynamic range, which is incredible. And this is a big one, and there's some spiciness here: fixed-focus, pre-calibrated lenses. Some people are not a fan of this, but working in VR video, this is almost required.
Speaker 3Yeah, I mean, the lens we have here, the 5.2mm on the R5C, has variable focus, and it's more of a pain than it's worth. And you're constantly on it.
Speaker 2Yeah, because it will drift away. For every single shoot, you have to calibrate.
Speaker 3Well, not every shoot, but it's kind of delicate. There are some prisms in there, and they'll get out of whack, and every now and then you've got to tweak the lenses, the interfocal. It's not great. So this locked-in design... you know, my biggest fear, or whatever, is that we don't really know what aperture they picked.
Speaker 2It's a fixed aperture as well, yeah, and they haven't said. They won't say, apparently, literally.
Speaker 3I've seen a couple of videos, I think Hugh's was one of them, where they ask over and over, and then there's a jump cut and they're talking about something else.
Speaker 1Hey, don't ask about that. What do you think a good aperture would be? I mean, if I was to shoot in the dark.
Speaker 3Shot in the dark: 5.6. That would be my guess, somewhere between 4 and 5.6, where it's sharp enough, because you need enough depth of field to hold focus from, they're saying, one meter to infinity.
Speaker 2But they even said that you could get even closer.
Speaker 3Right. They're just claiming one meter, but by observation you can get closer than that, which again would lead me to believe there's a pretty stopped-down aperture. Which makes my biggest concern, or curiosity anyway, about this: how's it going to do in low light? Because you're already...
Speaker 3running at 90 frames per second, which is cutting way down on the amount of light you can take in. Then you have a fixed aperture. So really, the only things you can play with to get a brighter image are shutter angle and ISO.
Speaker 1Hey, necessity is the mother of creativity.
Speaker 3Yeah, but because I work with B-Raw a lot, and they have 16 stops of dynamic range and stuff like that, I think it's going to be fine. Clearly they wouldn't put out something that's unusable in anything other than the brightest light.
Speaker 1Yeah, Blackmagic knows what it's doing. I don't think they're going to go, "Oh shit, pause the tariffs for 90 days." Yeah.
Speaker 2So it also has built-in ND filters: clear, two-stop, four-stop and eight-stop. A fold-out five-inch HDR touchscreen monitor. That's nice.
Speaker 3That's in addition to the one on the assistant side, on the other side of the camera.
Speaker 2Oh yeah, it looks like a little robot. It's got stereo microphones and two XLR inputs, 12G SDI, Ethernet and USB-C ports, high-speed Wi-Fi antennas, and the eight terabytes of hot-swappable M.2 SSDs you were talking about, recording to B-Raw. And this part is almost certainly why the camera exists: it has Apple Vision Pro integration. The camera system was designed with this in mind from the ground up, designed to maximize Apple Immersive Video. They say the Ursa Cine Immersive offers lifelike 180-degree stereoscopic imaging. So what this means is that the camera was designed to work in tandem with Apple Vision Pro at a system level.
Speaker 3Yeah, at a... I'm trying to think of the best way to put it. It's hard to describe, because there's nothing really like it, and there's a pipeline. DaVinci Resolve is part of this pipeline. But it is fascinating, because the whole thing is metadata-driven.
Speaker 2So yeah, you were telling me about this a couple days ago: rather than having everything flattened into one image, a lot of the effects, and specifically the unwrapping, stay as metadata. The unwrapping is one of the big things.
Speaker 3Okay, yeah, that's what I was saying. So the footage going into the AVP via this pipeline is the dual fisheye; it's not equirectangular. Everything, including the unwrapping data, comes from the calibration that was done at the factory, which is baked into the metadata of the B-Raw file. So the instructions for how to correctly unwrap the footage, per the particular set of lenses in that particular camera, are all metadata.
Speaker 3It's fed via metadata all the way to the AVP, which then unwraps it.
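Editor's note: the "unwrap instructions as metadata" idea can be sketched in code. This is an illustrative toy, not Blackmagic's actual calibration format: the field names are made up, and the projection is the textbook equidistant (f-theta) fisheye model with no per-lens distortion terms. The point is that the player maps each viewing direction straight into the fisheye image using shipped calibration numbers, so no intermediate equirectangular render ever touches the pixels.

```python
import math

# Hypothetical per-unit calibration of the kind a camera could embed in
# clip metadata instead of baking an unwrap into the pixels themselves.
calibration = {
    "fov_deg": 180.0,       # lens field of view
    "cx": 0.5, "cy": 0.5,   # optical center, normalized image coordinates
}

def direction_to_fisheye_uv(yaw, pitch, cal):
    """Map a viewing direction (radians) to normalized coordinates in an
    ideal equidistant fisheye image. A real lens profile would add
    distortion coefficients on top of this f-theta model."""
    # direction vector, +z is the optical axis (straight ahead)
    x = math.sin(yaw) * math.cos(pitch)
    y = math.sin(pitch)
    z = math.cos(yaw) * math.cos(pitch)
    theta = math.acos(max(-1.0, min(1.0, z)))          # angle off the axis
    # equidistant projection: radius grows linearly with theta;
    # the FOV edge lands at r = 0.5, i.e. the edge of the image circle
    r = theta / math.radians(cal["fov_deg"] / 2) / 2
    phi = math.atan2(y, x)
    return (cal["cx"] + r * math.cos(phi), cal["cy"] + r * math.sin(phi))
```

Looking straight ahead lands on the optical center, and a direction 90 degrees off-axis lands on the edge of a 180-degree image circle, exactly the kind of per-lens mapping a headset can evaluate live per pixel.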
Speaker 3Does that save you time? It saves a lot of time, and it saves a lot of quality. And I was also including things like: you can have an editing timeline with transitions, and it just feeds both videos, plus the instructions for the point and type of each transition, to the headset, and the headset does the work. Also, this camera is capable of live streaming, so some of this stuff you can do in real time, so you could add effects.
Speaker 2I don't know about that. I watched some commentary on this which said there are going to be some effects you'll be able to enable in real time, that happen in DaVinci, which will feed through from the camera to be live streamed.
Speaker 3Yeah, I mean, that could be, because you have the 10G Ethernet feeds. You can get a live feed of the camera via 10G, so I could definitely see that. DaVinci Resolve would become like a switching hub, basically.
Speaker 2Yeah, and it makes sense, because that's why, at NAB, in the new DaVinci Resolve 20 beta, they implemented a bunch of new live-switching features.
Speaker 1So more like party balloons coming at you, more like fun cat face emojis.
Speaker 3Yeah.
Speaker 2Well, and you were telling me the other day that that's sort of how Apple's camera stack works, and this was an interesting nuance that I think is really important.
Speaker 3Yeah, it's something I only recently learned about. When you make an edit on your iPhone, whether it's to a photo or a video, it is also just metadata, until you share it with somebody who's not on Apple.
Speaker 2So until you flatten that image Right.
Speaker 3So that's why you can go back to a photo that you edited two years ago and just hit undo and revert back to the way it was, because none of the changes are destructive.
Speaker 2So it's until you export that video and send it to me, an Android user.
Speaker 3Then it's got to do a little bit of processing before it can flatten the video or the photo and send it out.
Speaker 1Yeah, which does also save time.
Speaker 3Yeah. The beauty is it's non-destructive, so you save time, but also, every time you do a conversion you're losing some quality. So by leaving it pixel-perfect all the way until the end user, and the end user is the one unwrapping it, that's pretty mind-blowing.
Speaker 1This is like a chain of custody of edits.
Speaker 3Yeah, basically you're just sending instructions, which again is what the iPhone does with these things. It's just a set of instructions, not actually transforming the media, which really goes to show how powerful the AVP itself is, that it can do all this in a headset in real time.
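Editor's note: the non-destructive, instruction-based editing described here can be shown with a tiny toy. Everything below is illustrative (the operation names and values are made up): the source samples are never modified, edits live in a separate instruction list, "flattening" only happens when you export, and "undo" is just removing an instruction.

```python
# Toy version of metadata-driven, non-destructive editing.
source = [0.2, 0.4, 0.6]           # stand-in for original pixel values
edits = []                          # the "metadata": instructions, not pixels

def add_edit(op, amount):
    edits.append({"op": op, "amount": amount})

def render(samples, instructions):
    """Apply the instruction list at view/export time; the source is
    copied, never overwritten, so every edit stays reversible."""
    out = list(samples)
    for e in instructions:
        if e["op"] == "exposure":
            out = [min(1.0, v * e["amount"]) for v in out]
    return out

add_edit("exposure", 1.5)
flattened = render(source, edits)   # the "share outside the ecosystem" step
edits.pop()                         # undo is trivial: drop the instruction
reverted = render(source, edits)    # identical to the untouched source
```

Because only `render` ever produces baked pixels, quality loss happens at most once, at the final export, which is exactly the argument being made about the camera-to-headset pipeline.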
Speaker 2It's better than Bitcoin. So it seems like a lot of this implementation is also DaVinci itself, which is interesting, because up until a couple years ago there was no baked-in VR support in DaVinci. They didn't even have an awareness of VR180; we had to just treat it like flat footage.
Speaker 3Yeah, I mean, right now we're using tools that somebody else made. Andrew Hazleton, shout out.
Speaker 2Yeah, the Cardiverse stuff. We're using a lot of the tools he's developed.
Speaker 3Yeah, plugins that you download and install, and they work, but it's not nearly as efficient as if it had been built in. Like when I first made some 3D titles in Fusion that flew at you, I literally had to create two cameras, sync them, and then align them.
Speaker 2But it was so hacky and if I had messed up one of those nodes the entire thing would have broken.
Speaker 3You were not pleased with that. I wouldn't call it hacky, though.
Speaker 2I mean, it just made it so much harder to iterate, because I'm not super familiar with Fusion. I was going to say, because it's not hacky.
Speaker 3It's actually, ironically, the best way of doing it, now that I've done it. Sure, yeah.
Speaker 2It felt hacky at the time.
Speaker 3Because it truly feels weird, but it works better. You're getting truer 3D by doing that, because you're literally setting the IPD and everything of those two camera lenses in your virtual environment, so you can match the camera you're shooting with.
Speaker 2But now it seems DaVinci saw an opening in the market and was like, let's just be first. I mean, I don't know that.
Speaker 3I think there's an Apple partnership at work here. Do you think they were approached by Apple? It's possible. I'm not entirely sure, because Apple has their own camera tech that they bought, right? But they don't have the same kind of brand loyalty and trust around the product in VR that Blackmagic has.
Speaker 1Blackmagic is a post-production playground. And Apple? I don't know, I think that's just for enthusiast kids and their little siblings.
Speaker 3Well, it's an interesting move for Apple, because traditionally Apple is all about that vertical integration.
Speaker 2Well, and they have their own video editing software. They have Final Cut.
Speaker 3Right, but it's not as professional a tool anymore. It used to be, but when they redid Final Cut, like ten years ago or whatever it was, into Final Cut Pro X, everybody was like, what is this, iMovie? A lot of folks I knew... they really created—
Speaker 1Adobe users out of them. Yeah, in comparison to someone working in Adobe in the same editing suite, it literally made you look like, is this wide-rule?
Speaker 2FCP, yeah. Wide-rule is a great way to put it, actually.
Speaker 1I'm not saying that if you use wide-rule it's worse. That's the point: wide-rule is just as good, but it looks like Apple went too simple for folks who think you need something incredibly complex to do something of quality. I don't know what I'm doing, but this does that. It bridges the gap.
Speaker 3The big issue, to be clear on the FCP thing, was that they removed all the tools professionals were using and dumbed it down. So it's not so much that they just repackaged it; they removed things pros were using.
Speaker 2It destroys your workflow, right? You can't be a professional editor in that suite anymore. I was working on a project where we were using Final Cut Pro when X was released, and we had a big discussion about how we weren't going to move to the next version of the software, because it destroyed all the tools we were currently using.
Speaker 1You can't go back. Everyone has to move up, and not everyone wants the license.
Speaker 3Well, I mean, you can hang on to the old one for a while, but eventually they'll just deprecate it, and certainly now you're not going to have M-chip support on the old version. It's like the Rapture.
Speaker 2You're in or?
Speaker 1you're out. But this is different. It's Blackmagic out.
Speaker 3But this is different. It's Blackmagic; it's open source. So it is interesting that Apple looked at another company with better expertise in both cameras and editing software and said, y'all do it. They probably realized their internal teams couldn't do it. I mean, it's smart, right?
Speaker 3Instead of reinventing the wheel, just get somebody that has the sensor tech. And basically, we say they're 8K per eye, which is what they're generating, but they're the new 12K sensors. It's not all on one sensor; it's two sensors, genlocked together, two 12K sensors producing two 8K images.
Speaker 2I didn't realize it was.
Speaker 3They're just saying 8K because when you're looking at the fisheye version, you're throwing away a lot of the Ks.
Speaker 1Basically, yeah, what?
Speaker 3$30,000, that's what. So it's two full-frame 12K sensors behind it.
Speaker 2It's like a cool motorcycle.
Speaker 3So they're genlocked, and I'm using "genlock" in more of a generic sense.
Speaker 2I bet they're like tied together on the circuit level.
Speaker 3Yeah, because there had been some issues before. The Z CAM.
Speaker 2The Z CAM was a nightmare. And also, back in the day, I'd heard Hugh talk about how even with cameras that were genlocked, there was frame mismatch.
Speaker 3Yeah, you still end up with a frame off. We've seen several of the newer, sort of pocketable cameras that have two cameras in them drift one from the other on occasion. But again, I think Blackmagic has probably handled that here. The K1 Pro was basically two Z CAMs in the same body, genlocked, I think with literal genlock technology, so they would drift often.
Speaker 2And I noticed one time, I think it was a bug in the software, they had two different color temperatures.
Speaker 3Completely different color temperatures. This is what I was about to say: I've been in there. We shot an entire project using the app, went in and set the color temperature and ISO, and it only affected one of the lenses. So we get the footage back and we're like, well, one of them's over here, one of them's over there.
Speaker 2Yeah, so we had to go in and match the footage.
Speaker 3No, we couldn't use it. We just couldn't use it.
Speaker 2Yeah, that's right, we had to throw that out. One was completely underexposed.
Speaker 3One was completely overexposed.
Speaker 1Do you think this means if we in the future, if our supporters support us enough, can get one of these and livestream one of these shows?
Speaker 3Yeah, I mean, bandwidth. Huge bandwidth, absurd bandwidth. It has to be compressed; this pipeline won't work as-is live. It has to be encoded down into something like AV1. Granted, with DaVinci and the Apple pipeline you use MV-HEVC, which is even more efficient than AV1.
Speaker 1But someday when James Cameron.
Speaker 2I mean, live streaming with these cameras would require significant infrastructure. You would have to make sure you're on a 10-gig network.
Speaker 3Yeah.
Speaker 2Minimum.
Speaker 3Yeah, exactly. Or if you're trying to live stream to the internet, that's just not going to happen in this pipeline. Could you encode it down and then keep the metadata pipeline? But then you're only serving to AVPs. I'm not going to say it couldn't happen.
Speaker 2I think it could happen, but it would require a lot of tech that you would need to develop, or you'd need to find the people, because there are people who do these types of live streams. We're just not there yet.
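Editor's note: a back-of-envelope calculation shows why the 10G network is the floor and internet streaming is out of reach in this pipeline. The per-eye resolution and raw bit depth below are assumptions for illustration; only "roughly 8K per eye, 90 fps, raw compression around 12:1" comes from the conversation.

```python
# Rough bandwidth math for a dual-eye 8K, 90 fps raw camera feed.
width, height, eyes, fps = 8192, 7200, 2, 90  # assumed per-eye dimensions
bits_per_sample = 12                           # assumed raw bit depth
raw_compression = 12                           # ~12:1, as discussed above

raw_gbps = width * height * eyes * fps * bits_per_sample / 1e9
camera_feed_gbps = raw_gbps / raw_compression  # what the 10G link must carry

# further squeeze needed to fit a (hypothetical) 100 Mbps internet stream,
# i.e. the job a delivery codec like MV-HEVC would have to do
extra_compression = camera_feed_gbps * 1000 / 100
```

With these assumed numbers, the uncompressed signal is over 100 Gbps, the 12:1 camera feed lands right around the capacity of a 10G link, and an internet stream would still need roughly another two orders of magnitude of compression, which is why a delivery encode is unavoidable.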
Speaker 1We're just not South Korea yet.
Speaker 2They do exist. You just need to pull them together into one place, and currently the industry doesn't have a lot of infrastructure for that. But there are people who are trying to make it happen, and there are companies that are trying to make it happen, and none of them have really come to market yet. People are claiming that they will.
Speaker 3Yeah, they keep saying, oh, we finally cracked it. Join the wait list, though. It's finally here, but you're going to have to wait.
Speaker 2The truth of the matter is that this camera is most likely going to be the industry standard, probably for the next 10 years, I'm guessing.
Speaker 3I would guess so, because it's going to take a while for adoption to happen, since it is so expensive. But it's not crazy expensive for the caliber of camera it is. A lot of people are complaining about how expensive it is, and I'm like, that's what it takes to get truly good footage.
Speaker 2Right. And this isn't even a prosumer camera; this is a professional camera, made for professional tools. Here's the thing, though: to even work off this camera in post-production, you're going to need an extreme storage solution.
Speaker 3Yeah, unless you just work directly off the modules that are in it. To that end, I forgot, we didn't mention this: at 8K per eye, 90 frames per second, using B-Raw's 12:1 compression ratio...
Speaker 2Oh God, what is it?
Speaker 3It's an hour and, like, 23 minutes that you can record on eight terabytes. Oh my God, I mean.
Speaker 1That's insane. That's a lot.
Speaker 3It is sorry.
Speaker 2It's like one terabyte an hour, right? Ours is just under.
Speaker 3Yeah, it's like 55 minutes, a terabyte basically.
Speaker 2And we're at like a half hour to 40 minutes. So that's about one-eighth of the storage time.
Speaker 3It's because we're doing 8K total. So if you double the resolution and then add 50% more frames per second... good Lord.
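Editor's note: the "hour and change on 8 TB" figure can be sanity-checked. The per-eye resolution and raw bit depth below are assumptions; only 90 fps, 12:1 compression, and the 8 TB capacity come from the conversation, so treat the result as ballpark only.

```python
# Rough record-time math for the figures quoted above.
width, height, eyes, fps = 8192, 7200, 2, 90  # assumed per-eye dimensions
bits_per_sample, compression = 12, 12          # assumed bit depth, ~12:1 ratio
storage_bytes = 8e12                           # 8 TB of internal storage

# compressed write rate in bytes per second
bytes_per_second = width * height * eyes * fps * bits_per_sample / 8 / compression
record_minutes = storage_bytes / bytes_per_second / 60
```

With these assumed numbers the write rate comes out around 1.3 GB/s and the record time lands around 100 minutes, the same ballpark as the hour-and-23 quoted on the show; the exact figure depends on the true sensor dimensions and raw bit depth.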
Speaker 2Yeah. So that's it, we need 30 grand. Well, we need a lot more than 30 grand, I can tell you that. We'll start small. Our goal is to get a fleet of these at some point.
Speaker 3Yeah, I've got some ideas that would require at least three of them. More on that later. But it's an impressive tool that we're really stoked to play with. The features aren't even in DaVinci yet, though; they're still in beta.
Speaker 2Well, I mean, even DaVinci 20 itself is still in beta.
Speaker 3This is like pre-beta.
Speaker 2They're probably handing you an install that they've been working on, that's included for the people who get the camera.
Speaker 3Yeah, exactly, you probably get a download.
Speaker 2And they're probably like hey, you're our bug testers now.
Speaker 3Because they're still saying that they're working on even more features.
Speaker 2Well, essentially how this rollout goes is, anybody who bought this camera and is getting units now had to tell them exactly which projects they were going to be shooting next, because Blackmagic wants to control what that footage looks like, since it's going to effectively be their main advertising.
Speaker 3Exactly.
Speaker 2The first impressions of the camera for the people that are using it right out of the gate.
Speaker 1Blackmagic has always kind of been a first mover in this realm. Bug testing, I think, is a way to put it. But when you pay $30,000 for a product, don't you expect customer service to really deliver?
Speaker 3Oh sure, yeah. And it's not even officially out yet, right? It's still just on pre-sale.
Episode Closing
Speaker 2Yeah, the first cameras are literally starting to get into people's hands right now, so there probably will be quite a bit of hand-holding as these get out into the wild. I like the way we do it, yeah. So okay, that's it, we've got to run. This is a short one today, but I think the Ursa Immersive stuff is huge, and the Ivory stuff I'm pretty excited about; I think I will pick up a PSVR2. And that's it for the Stereoscope Podcast, number 13.