
Denoised
When it comes to AI and the film industry, noise is everywhere. We cut through it.
Denoised is your twice-weekly deep dive into the most interesting and relevant topics in media, entertainment, and creative technology.
Hosted by Addy Ghani (Media Industry Analyst) and Joey Daoud (media producer and founder of VP Land), this podcast unpacks the latest trends shaping the industry: from Generative AI and Virtual Production to hardware and software innovations, cloud workflows, filmmaking, TV, and Hollywood industry news.
Each episode delivers a fast-paced, no-BS breakdown of the biggest developments, featuring insightful analysis, under-the-radar stories, and practical takeaways for filmmakers, content creators, and M&E professionals. Whether you’re pushing pixels in post, managing a production pipeline, or just trying to keep up with the future of storytelling, Denoised keeps you ahead of the curve.
New episodes every Tuesday and Friday.
Listen in, stay informed, and cut through the noise.
Produced by VP Land. Get the free VP Land newsletter in your inbox to stay on top of the latest news and tools in creative technology: https://ntm.link/l45xWQ
Denoised
Adobe's New Camera App, Jurassic World Shoots on Film, and AWE
Adobe launches Indigo, a new computational photography app, Jurassic World shoots on Kodak film, and Joey shares his hands-on experience with Snapchat Spectacles and more at AWE.
#############
The views and opinions expressed in this podcast are the personal views of the hosts and do not necessarily reflect the views or positions of their respective employers or organizations. This show is independently produced by VP Land without the use of any outside company resources, confidential information, or affiliations.
In this episode of Denoised: Adobe's new computational photography camera app, Indigo. Film is back: Jurassic World is shooting on Kodak. And what I saw at AWE. Let's get into it. All right. What's up, Addy? Hey, welcome back. How was your weekend? Yeah, pretty good. Pretty chill. Getting warmer here in Southern California, but I don't think it was as bad as a lot of other spots in the rest of the country. We're in a bubble. We have no idea. And even between us, we're in two different bubbles: the ocean-side Westside and the Valley. Yeah. Oh yeah. All right. So yeah, first up, an interesting update from Adobe. They released a new iPhone app called Project Indigo, and it's what they're calling a computational photography camera app. So what does that mean? It's basically taking a bunch of photos when you hit the trigger button, then combining all those photos, using computational processing and some AI behind the scenes, to create a better image and also give you more options. So a better image, by their account, with less of that smartphone camera app look, and then also a lot more options. It'll save as a DNG file; you can either edit it on your phone or bring it into Lightroom. And then it has a couple of AI features built in under the hood, like AI denoising. So if you're shooting in low light, better low-light photography. Also, if you want to zoom in beyond what your camera lens supports, it can enhance the quality there. And then they also have a built-in feature from Lightroom, using AI, called Remove Reflections. It's designed to remove reflections from windows, but it's built into the app, so you can take the pictures and have the reflections removed if you have some in whatever you shot. Oh, that's very cool. Yeah. We generally associate Adobe nowadays with Firefly. Mm-hmm. And they've been very busy on the generative AI front. Of course, this solution's using AI as well, but this is a hard pivot into where Google and Apple are really doing a lot of work, which is taking, you know, a smartphone sensor, which is so tiny, right? Mm-hmm. It has its limitations because it's so small: color performance is limited, signal-to-noise ratio is limited, dynamic range is limited. And not to mention that just the size of it actually gives you a significant disadvantage versus a larger sensor when it comes to depth of field and things like that. So in order to mitigate that, you throw a lot of clever software at it. And now, in addition to the clever software, you're adding AI to it to get a far superior result. Yeah, and it also makes me wonder where the trend of image processing is going, and some stuff we've seen before, especially with shooting video on an iPhone, where you're sort of separating the process of image acquisition from the processing, and there's more processing happening under the hood. Even if you look at Blackmagic's workflow, where you have the camera, Blackmagic RAW, and the sensor, and then a lot of other tools built into Resolve to process that image differently, whether it's noise reduction, stabilization, or handling the color. It seems to continue that trend of splitting up, or finding different ways to merge or reprocess, the image, so it isn't just "what you capture is what you get." Totally. Yeah. And this takes me back to just a few years ago. I don't know if you remember Lytro? Mm.
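A quick aside on the burst-merging idea described above. This is a minimal sketch of the general principle, not Adobe's actual Indigo pipeline (which is proprietary and layers AI denoising and super-resolution on top); it only illustrates why merging many frames from a tiny sensor beats a single exposure. The function and test data are illustrative.

```python
# Minimal sketch of the core idea behind burst/computational photography:
# average many aligned noisy frames so random sensor noise cancels out.
# Real pipelines also align frames for handshake/motion first; that step
# is skipped here for brevity.
import numpy as np

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of aligned frames (H x W x 3, floats in [0, 1]).

    Averaging N frames reduces zero-mean noise by roughly sqrt(N),
    which is why a 10-frame burst from a tiny phone sensor can look
    cleaner than any single frame it contains.
    """
    stack = np.stack(frames, axis=0).astype(np.float64)
    return stack.mean(axis=0)

# Simulate a burst: one clean image plus independent per-frame noise.
rng = np.random.default_rng(0)
clean = rng.random((4, 4, 3))                      # stand-in "scene"
burst = [np.clip(clean + rng.normal(0, 0.1, clean.shape), 0, 1)
         for _ in range(10)]

merged = merge_burst(burst)
print("single-frame error:", np.abs(burst[0] - clean).mean())
print("merged-burst error:", np.abs(merged - clean).mean())
```

Running this shows the merged result sitting several times closer to the clean image than any single frame, which is the entire bet computational photography apps are making.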
Was that the device that had, like, 20 different little lenses on the back of the camera? Exactly, yeah. So the goal was to take an image from a small sensor, like an iPhone's, but a bunch of exposures at the same time, and then compute a stereoscopic 3D depth map, and on top of that multiple exposures, and on top of that multiple color profiles, and then combine that into something you generally wouldn't get from a sensor. I remember, wasn't Lytro's big thing that you could refocus, change what the point of focus was? Yes. That's because they stored the depth map and had inherent 3D information, so they could go back and take stuff out of focus or into focus. Did you ever use one? I had a friend who bought one, and we had a lot of fun with it. It was short-lived, because they were a little too expensive for the sensor at the time, and I believe support and everything else was super limited as well. How long did it take to capture an image? Was it like a regular camera, you just hit the button? Or did you have to kind of hold it and let it do something? No, it was just like a regular camera. I think it was doing something like 20 different exposures at the same time. There's a lot going on in that camera. Okay. What do you think it would take for just an iPhone (an iPhone Pro has three lenses and lidar) to do something similar to what that was doing? Yeah. So Apple is already doing a lot of this stuff under the hood for your native photo app, and every year, you know, you hear Marques Brownlee and a lot of the tech YouTubers talk about it: do we even need a Leica or a Canon 5D or that professional gear when the iPhone is catching up so rapidly just using software algorithms? So Adobe's in that same lane now, where they believe that with enough processing you can actually mimic, and perhaps even surpass, what the raw iPhone sensor can deliver. Yeah, and it's interesting that there are a lot of paired controls in this, figuring your workflow's gonna be: acquire the image with the Indigo app, then do more processing in Lightroom on a computer with more processing power, and kind of giving you the options there. And AI denoisers are super interesting here. We talked about AI denoisers before when it came to rendering, right? Render passes are inherently noisy 'cause you don't have enough samples, and so denoisers can take out a lot of that grain, a lot of that noise, and leave you with a flawless result. iPhone sensors and small mobile phone sensors have the same issue: because they're so small, they're just really grainy and noisy, especially when you zoom in, and this Adobe Indigo project seems able to resolve that, judging by the image examples they show. Yeah, I was looking at how it was defining the natural look, 'cause its other big thing is that it uses AI to produce a natural, SLR-like look for your photos, including special but gentle treatment of subjects and skies. Yeah. Well, do you know about the moon thing with Samsung phones? The moon thing? No, what's that? Yeah. So, you know, Samsung phones have a moon mode. Okay.
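Before the moon-mode story continues, a brief aside on the Lytro refocusing discussed above. This is a toy sketch of depth-based refocus, not Lytro's actual light-field math: with a stored depth map, "refocusing" after the fact is just picking a new focal depth and blurring each pixel in proportion to its distance from that plane. The function name and parameters here are hypothetical.

```python
# Toy illustration of after-the-fact refocus using a stored depth map,
# in the spirit of what Lytro enabled (not its actual algorithm).
import numpy as np
from scipy.ndimage import gaussian_filter

def refocus(image: np.ndarray, depth: np.ndarray, focal_depth: float,
            max_sigma: float = 4.0, levels: int = 8) -> np.ndarray:
    """image: H x W x 3 floats; depth: H x W, normalized to [0, 1]."""
    out = np.zeros_like(image)
    defocus = np.abs(depth - focal_depth)            # 0 = in focus
    for i in range(levels):
        # Blur the whole image at one of several blur strengths...
        sigma = max_sigma * i / (levels - 1)
        blurred = gaussian_filter(image, sigma=(sigma, sigma, 0))
        # ...then keep it only where the defocus amount matches.
        band = (defocus * (levels - 1)).round() == i
        out[band] = blurred[band]
    return out
```

Change `focal_depth` and the same capture "refocuses" on a different subject, which is why storing depth alongside color made Lytro's trick possible.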
So you can put it into moon mode, point the camera at the moon, and it'll give you a really nice image of the moon, because it just knows what the moon's like; it's trained on the moon. Okay. Yeah, well, not necessarily trained, even. Maybe just the signal processing and the exposure levels, because you're shooting at night and it knows roughly what the moon's dynamic range is, whatever. Turns out that Samsung was actually pasting a picture of the moon on top of the moon. So it was totally faking the result. You didn't know about this? Uh, this sounds vaguely familiar, but no, I did not remember this. Yeah. I mean, when it comes to photography, you still have to process what's coming in. You can't just throw stuff on top of it and completely make it something else, right? Apparently not. Unless you're Samsung. I was thinking before, like, wow, that's kind of clever, because the moon is one of those things where you see a really beautiful moonrise and you're like, man, I wanna capture that. And then you take the picture, and no matter how you take the iPhone picture, it always looks like some weird blob in the sky or some bright dot. You could never capture it. It never captures the scale, it never captures the detail. I was like, oh, that's clever to make a moon mode. But, um, yeah, not blending it with an existing image of the moon. So that's what I'm saying: we have to be careful with processing, because you can totally overprocess into fiction. Yeah, I was thinking of overprocessing more as getting an uncanny valley, where it's not making stuff up, but it's just too processed and looks weird. But that's just straight-up fake. So, there you go. The other thing it said, too, was that it would include technology previews. And I'm wondering if they're gonna test out their content authenticity program, which was the partnership and program they talked about a while ago to embed some sort of watermark in images to verify that these images were genuinely taken, that they're real and not AI-generated. So I'm curious if that's where they're gonna beta test or roll that out. This also, from a business standpoint, feels like Adobe is seeing a big shift to Lightroom users doing more stuff with mobile phones rather than professional cameras. So in order for them to capture that market and tie it back into all of the Creative Suite stuff that they have (Lightroom, Photoshop, you name it), this is a clever play that extends the ecosystem into iOS and Android, in order to pull that data back into it. Mm-hmm. Yeah, it's a good ecosystem. I mean, they're already doing pretty well, but I remember they kind of filled that gap, too, especially when Apple killed Aperture, their photo editing and organizing tool that a lot of people liked. It was in their pro bucket and got the axe around, I think, the same time Final Cut shifted to X. I see. Yeah. This was a while ago, but yeah, Lightroom's kind of been the main game in town. It's also interesting that they still keep around Lightroom Classic, which was their original version. So there's two versions of Lightroom.
The original one, and then the newer one that's more cloud-integrated with Creative Cloud. Yeah. I'm guessing all these features are only for the current version, not Classic. I think they kept Classic around for people that didn't wanna change. Yeah, they want something local. Yeah. I mean, the newer one still works with local stuff. Yeah. I don't know. I use Lightroom all the time, and I have a couple of professional photographer friends who rely on it to make their living. Oh, yeah. It's synonymous with photos. If you're doing photos day in, day out, it's the best one out there. Yeah, for sure. Because you can also do so much in there, you don't need to go to Photoshop. You can just do it in Lightroom. All right. And then, on the opposite end of AI photo-taking, we've got another film story. Yeah. It's funny that we cover both AI and film, 'cause those are the two complete ends of the media production spectrum. Yeah. So I ran across, and now I'm a subscriber of, Kodak on YouTube. Kodak's having a YouTube moment after the Sinners video? For sure. Yeah. And they just released a wonderful video, which we'll link to here, about Jurassic World being shot on Kodak film. Mm-hmm. And they have Gareth Edwards, the director, on it, the cinematographer on it, and they all talk about the benefits of film, and they talk to us like we have no idea that film existed. Like, did you know: more saturated greens. Did you know: you can film something, and when you play the dailies back, it looks good. Yeah, if you do it right. So that's exactly what they said in the video: when you're playing the dailies, you realize how beautiful the imagery you shot is, as opposed to digital cinema cameras, which have to be graded, you know, in dailies before a director or a DP sees it. And that generally probably doesn't happen. Yeah, I mean, my dailies experience with stuff shot on film is really kind of limited to film school, so they never looked that great. And I'm wondering, it's not like color grading is a new thing just for digital; it existed for film as well. So I'm also wondering how accurate that is, that film dailies look so great out of the box compared to digital. I don't know. It's almost like they got paid to say this stuff. The VFX supervisor for Jurassic World was also in the video, and he talked about how careful and considerate they have to be putting digital elements on top of the film plates, because, mm-hmm, film plates are gonna have natural chromatic aberration, heavy vignetting around the edges, anamorphic aspect ratios and stuff like that, and, the most obvious one, film grain. Film texture, yeah. And the VFX has to land on top of that and blend right in, which is quite the challenge. Yeah. Obviously Jurassic World's a very VFX-heavy film, so how much more complicated does shooting on film make compositing, versus shooting on digital? Yeah. I think the one thing they're not really talking about in any of these Kodak videos is getting the film scanned into an existing VFX pipeline, right? Like, mm-hmm.
You still have to go frame by frame and scan all that stuff, get it into something like EXR format, and then that goes into your VFX pipeline. That takes a lot of time, that takes a lot of money. And they're not really talking about the added cost of all of this, the film lab, the film scans, on top of what is already an expensive VFX process. Yeah, I'm curious, too, if they're gonna do any film prints going back out, like Sinners. That was a process too, 'cause they shot on 70 millimeter, but they were also gonna project on 70 millimeter. So they still had to shoot, scan and acquire everything digitally, edit, do effects, and then print it back out to film for the limited IMAX runs, plus the other variety of 70 millimeter prints they did. So I'm curious if they're doing anything similar with Jurassic World. Yeah, and if you just look at the two movies, Sinners being a far simpler movie on the VFX side. I mean, it's still very VFX-heavy. Michael Ralla was the VFX supe, and you had two of the main characters being twins, so there's a lot of VFX associated with just putting two of the same character on screen, obviously the vampires and everything. But then you look at Jurassic World, and almost every shot needs VFX, because you have dinosaurs, full CG characters. Yeah. Right. So I think it's a much heavier lift, and honestly, to me, it didn't make sense to shoot a movie like Jurassic World on film. But here they are, they did it. Mm-hmm. So let's see what the results are. Also, it is really funny and ironic that this is Gareth Edwards, who shot The Creator, which everyone loves to talk about because it was shot on the Sony FX3, a prosumer camera. And now they've gone the complete opposite: not only shot on film, but also shot on Panavision cameras, which were featured in the behind-the-scenes video. Yeah. I mean, Jurassic World is a tentpole franchise for NBCUniversal, so they're gonna spend everything they have on this movie, and this is obviously gonna be a massive hit regardless of how it plays out. People are still gonna flock to the theaters because of the, mm-hmm, IP and the weight of the IP. Yeah. I'm guessing also that the film kind of tries to throw back and capture the nostalgia of the first movie, which was obviously shot on film back in the nineties. Yeah, in the nineties. 97, I think. 97, yeah. 96, 97, that sounds about right. No, no, I'm sorry, it's before that. 94, I believe. Oh, okay. Yeah, we'll go with it. But I'll tell you what, Joey, shooting on film is not gonna save you from a bad movie, right? No, of course not. That's the part I think we need to remember out of all this. I mean, Jurassic Park III was probably shot on film. Not a great Jurassic Park. Yeah. But it's encouraging to see a shift to film, I think. We talked about VistaVision and IMAX, and now Kodak. Mm-hmm. I think you mentioned that Leica is also introducing a new film stock? Yeah, I did see that. Leica, the camera manufacturer, is introducing their own film stock targeted at still photography, a 35 millimeter film stock. I believe it's an 800T film stock. And I cannot remember offhand, but I remember seeing a couple of other companies, maybe not getting into the film game, but introducing new film stocks as well.
It wasn't Kodak; maybe it was Fuji. I don't think it was Fuji. It wasn't one of the main companies; it was another company, mm-hmm, throwing their hat into the 35 millimeter film game. Yeah, I think that probably ties more into consumer interest and Gen Z interest. Everything is cyclical, so it's looping back around: oh hey, you can take a photo and you don't have to see it right away, and you can have this magic of waiting to see what it looks like, and it already looks funky and cool. Yeah, that comes full circle. Yeah. You know what else peaked along with film technology? Space rockets. Rocket technology and film both peaked in the sixties and seventies, right? Mm-hmm. I don't think, from a pure chemistry standpoint, film technology is any better today than it was, you know, 50 years ago. Yeah. Is that just because we stopped innovating on it, or because it is what it is? I think it's sort of just peaked. So I'd be surprised to learn if there is any film stock out there that has even higher dynamic range or higher color performance than the stocks from back then; I dunno the exact dates. I mean, we did get faster film stocks compared to the sixties. We got 500T, and we got stocks that can shoot pretty well in different lighting environments, and, right, I wanna say those were developed in the eighties. I am blanking on his name, but he was on the old version of the podcast; he's a professor at RIT and was also involved in Synapse Studio, or, no, he's part of the RIT virtual production program. Oh, is it David Long? Yeah, I think so. Yes. He used to work at Kodak development and was on the team that did the 500T, and the color science behind that. Awesome. So I'll say that's the one innovation that stands out between now and the sixties. Sure. Yeah. And film innovation, you know, if it helps bring the moviegoer back into the theater because of a film-to-film projection and just a natural, intrinsic boost in quality in the cinematic moviegoing experience, then I'm all for it. Yeah. I mean, I think the tiny portion of the audience that will care, and will get back in the theater because it was shot on film, is very minuscule. But if it looks cool, yeah. Also if it looks more vibrant and colorful, 'cause I feel like that is something that keeps popping up on Twitter and in online discourse: why do films look flatter and less saturated today than they did 20, 30, 40 years ago? Yeah. I think we brought this up before, right? Your point was it was VFX. Yeah. You have to light conservatively flat, and then typically the relighting is supposed to happen in VFX; you're supposed to add all the contours and the contrast back in, and a lot of times, because of iterations and deadlines, it doesn't happen. Hmm. So you just kind of ship what you have. It's unfortunate, because even though the acquisition technology is far superior (we can get more dynamic range out of our cameras, far superior resolution), the actual artistic quality has suffered a little bit because of our heavy reliance on doing everything in post. Yeah.
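An aside on the scan-to-EXR step mentioned earlier in the film discussion: film scans typically arrive as 10-bit Cineon/DPX log code values, which get linearized before being stored as scene-linear EXRs for compositing. Below is a minimal sketch using the conventional Cineon parameters (white point 685, black point 95, 0.002 density per code value, 0.6 gamma); real pipelines tune these per show, and the actual EXR write would go through a library such as OpenEXR or OpenImageIO.

```python
# Sketch of the linearization step when bringing film scans into a VFX
# pipeline: 10-bit Cineon/DPX log code values -> scene-linear floats,
# which is what you'd store in EXR. Constants are the conventional
# Kodak Cineon parameters; production pipelines adjust them per show.
import numpy as np

WHITE, BLACK, DENSITY, GAMMA = 685.0, 95.0, 0.002, 0.6

def cineon_to_linear(code: np.ndarray) -> np.ndarray:
    """Convert 10-bit Cineon log code values (0-1023) to linear light."""
    gain = 10.0 ** ((code - WHITE) * DENSITY / GAMMA)
    black = 10.0 ** ((BLACK - WHITE) * DENSITY / GAMMA)
    return (gain - black) / (1.0 - black)   # 0.0 at black, 1.0 at white

codes = np.array([95, 445, 685, 1023], dtype=np.float64)
print(cineon_to_linear(codes))  # ~[0.0, ..., 1.0, ~13.5 for highlights]
```

Note how code values above the white point map well past 1.0: that preserved highlight headroom is a big part of why film negatives remain attractive sources for VFX plates, and why the scan/linearize step adds real time and cost.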
There is a value in being able to just see what you get as you're filming it. Absolutely. All right. And then the last story. The other week I went down to Long Beach and checked out AWE, the Augmented World Expo, which is billed as the largest augmented and virtual reality conference. A bunch of companies and panels and talks around what's happening with the future of AR and VR. This year, the big company that was everywhere was Snap, Snapchat. Nice. With their Snapchat glasses, which I think have an actual official name that I'm blanking on. Yeah, the Spectacles. So I did get to try 'em on, and this was the first time I tried them. The current version, and you might have seen them, looks kind of funky, like the old-school thick 3D glasses you might have gotten at a, yeah, they're chunky, 3D movie. Yeah. And they're saying the current version that's out was targeted at developers, for developers to build and figure out and develop apps for it, so that next year, which they did announce, when they officially start rolling out the new version targeted at consumers, it'll be a little lighter and more streamlined. The version I got to play around with felt very smooth, kind of like a lo-fi version of the Apple Vision Pro, but obviously way more comfortable, and augmented reality, so you can see the real world outside of it, and then it has a display inside that overlays. It can map out surfaces and stuff, so it can project onto the real world, which is what augmented reality should be doing, right? And it can also track your hands, so you can pinch and point, and flip up your palm to pull up menus with your palm. And the nice thing is it was pretty comfortable to wear. You did have to measure your face with an app, and then they would give you specs, oh, the interpupillary distance, I think, yeah, that were sort of fit to your face. Yep. But no cables; the battery, everything was self-contained. I dunno what the battery life is, but it's just glasses you put on your face. So, sort of the ideal of what AR needs to be to work well: glasses that fit on your face. The projection screen, I will say, was a bit more of a narrow field of view. So whatever it was projecting was maybe in my center field of view, like a big TV screen, but if the video game or whatever was happening started moving out towards my peripheral vision, I'd have to move my head to keep seeing what was happening. It was a narrower field of view than if you're used to an Apple Vision Pro or something, an immersive headset where you can see a full 180. Right. But yeah, a lot of the stuff they had us test was games and things, and the tracking works really well. Stuff is kind of planted in the physical world, and as you move and walk around, it stays planted, and you can see it, or walk around it. The hand tracking worked really well, and it was a pretty smooth and impressive experience for an early AR headset. Did they showcase any killer apps, or was it just mostly games? It was mostly games, yeah.
I wouldn't say there was any killer app on display. How about Snapchat, the app platform? Was that integrated into the glasses in any way? Oh, the actual app? Mm-hmm. Not that they demoed, not that they really talked about. A lot of the demo was games; they give you the headset and have a guy walk you through what you should pull up, and it was pretty much games. Yeah. I think there are some cameras integrated where you can, similar to the Meta Ray-Ban glasses, take photos or videos, but that wasn't demoed, probably 'cause you'd have to link it to your account or something and it would be too complicated for a test run. But yeah, I definitely feel like that also ties into Snap in general, where it's a bit more fun and games, and not what Meta's been trying to do, or Apple, of being a work device. Sure. Yeah. But I mean, the potential's there. Yeah. Every major tech company has sort of one tentacle in the wearables, augmented wearables game, right? You obviously have AVP, you have Meta with their AR glasses, and you have Snap, which has been working on this for a long time, I think 10-plus years now. Mm-hmm. And it always feels like it's still just a couple of years away, every year. You know, it all comes down to miniaturization. And the fact is, I'll say, perhaps all these glasses are powered by the Qualcomm Snapdragon chip. Mm-hmm. That's a system on chip: basically GPU, CPU, memory, everything in one single chip. Okay. And every year that Qualcomm introduces a new Snapdragon, I think we're at Snapdragon 7 or something like that, you'll see a significant level-up in all of the wearables. Uh-huh. So there's a dependency on what Qualcomm can put onto silicon, and that dictates what these big tech companies can do with their devices. Yeah, and how much power they take, or if they're more power-efficient, so they can run more on a smaller battery built into the headset. Exactly. Yeah. It's a really difficult problem to solve, because you're weight-constrained, you're battery-constrained, you're performance-constrained, and yet as consumers we expect stuff to be as good as, you know, PlayStation 5 graphics or something like that. Right. Yeah. It's like, yeah, gimme the Apple Vision Pro experience, but with a pair of glasses. Yeah, exactly. It's a hard problem to solve, for sure. But it's interesting to see how it's getting there, and I feel like Snap, you know, at least for me, 'cause I don't use the actual app, like, I forget that they were one of the original AR filters, really good at tracking faces, really good at changing how people look. Sort of the original deepfake-ish thing, in a way, of real-time tracking a face and changing it to something else. Yeah. A good friend of mine was on the Snap team for a long time, and they had computer vision scientists and researchers, and even AI researchers, doing stuff long before this stuff was cool. So they have, mm-hmm, continued to invest in future-forward technology. So I think a lot of the stuff that Snap is working on now, we won't see for a couple more years. Yeah.
But yeah, it'd be interesting to see what happens next year, when these glasses roll out targeted at consumers: what adoption's like, whether there are any killer apps or killer features, or whether it's sort of in the same early-adopter bucket as the Meta Ray-Bans, you know, where just true early adopters get it, and maybe it hovers there for another few years. Yeah. Speaking of Meta, I think you mentioned that they now have a browser for spatial computing? No, it's not Meta. It's a metaverse browser. Okay. Yes. Explain that. See, look, Meta changed the word, and now you're just thinking everything metaverse is Meta. That was the intent. And they succeeded. Yeah. So there was another interesting company I saw. They're called RP1, and they built what they're calling the first metaverse browser for the spatial internet. They have a lot of ideas going on here, but the main one is something we've talked about before with Roblox, Fortnite, and these ideas of the metaverse: they're all these self-contained metaverses, which is sort of not really the metaverse, because the whole point of the metaverse is to be a complete, open virtual universe that everything can kind of talk and connect to. We talked about this. Yeah. We're still far away from that. So RP1 is trying to be the first step into that. They built a metaverse browser, and there are sort of two levels to how they're interpreting it. One is mapping Earth, the real world, one-to-one to this virtual metaverse world. So replicating whatever is on planet Earth to a metaverse equivalent. The idea is to tie it into AR technology. So if there was a Costco or a Walmart in the real world, it's tied one-to-one to the metaverse version on their platform, and the company could, you know, claim the location and then add inventory or add deals or something. So if you had your AR glasses and you're synced up with their spatial metaverse, you go into the real world, and it can overlay, you know, hey, where are the cleaning supplies? Or whatever, because it has this one-to-one understanding of where you are in the real world and in this metaverse world. Then the other version is you go off-world, or off-Earth, and that's the digital metaverse, sort of like the Oasis from Ready Player One, where you can just build whatever you want, claim whatever you want. Infinite space, infinite size. That's the digital metaverse. That's sort of what I remember from talking to them, in a nutshell. But it's all browser-based, so it's something you can access on any device right now; it's not really device-dependent. That's the idea behind it. So it's a more open platform. You also don't need a headset; you could access it on your phone or a web browser, and, you know, you just have to navigate with a traditional keyboard. So yeah, it's cool to see them trying to be the first to just build an open platform and tackle one of these problems and ideas we'd talked about with the metaverse: being a device- and platform-agnostic, literal metaverse. So yeah, cool to see them building that, and what they're launching.
Curious to see. Yeah. It feels like every year there's some incremental step getting us closer and closer to the metaverse. And, you know, not to say that people are not on the metaverse already. If you're on Fortnite, if you're on Roblox, mm-hmm, you're certainly on the metaverse already. We're still waiting for the general public, the rest of us, to get on for a specific need and a specific reason. Yeah. There needs to be a reason to do that. And I think, also, going back to what we were just talking about with Snap and the Spectacles, there's a hardware limitation too. That idea I just described of going into a Walmart and your glasses telling you where the cleaning supplies or the bedding or the food is, and painting arrows on the floor: until the hardware is more adopted and easier to use, that's a hard sell for the benefit, 'cause the hardware's not there yet to actually support doing that. For sure. This reminds me, even at the convention, I forgot the company name, but they were demoing there, and around the entire convention floor they had these QR code stickers on the floor. It was like: navigate to wherever, a convention map, or navigate to where you want to go. So you could scan it with your phone, but then an app had to load up, and you had to install the app, and then eventually, once it worked, it would paint out arrows. You'd be like, I wanna go to this booth, and based on that QR code it knew where you were, and then it would draw out arrows throughout the convention floor and navigate you, with your iPhone camera, to the booth you're going to. So that's a rudimentary version; it took a lot of steps, and I probably could have figured it out faster just looking at a map. But if it was in my glasses, on my face, and it just figures out where I'm at, and I'm just like, yeah, I need to go to this booth, and it starts drawing arrows: that would be the useful, beneficial future that I think more people would want. Yeah, it goes back to finding utility, and how you get entertainment and value out of it. If you combine that with Pokémon Go or something like that, then perhaps there's a way to get a ton of people to use stuff like this. I think Pokémon Go is where society peaked. We all came together to look for Pokémon, and we were having fun. That was a good moment. That was a good highlight, yeah. And then, yeah, those were the two standouts. I am bummed, 'cause last year, when I went to AWE for the first time, they had a history-of-AR/VR gallery, which had sort of every headset ever that they had collected, a walkthrough you could check out and explore. And it was fun to see that. They did not have that this year, so I was kind of bummed. Oh, they didn't do that again? Because some of those were, um, ginormous and clunky. Yeah, like the original, that super clunky one from the sixties. They had the original, that Nintendo red headset from, like, Super Nintendo that integrated with it.
They had a giant one from DisneyQuest, which was Disney's early attempt at an interactive arcade thing. It was at Downtown Disney at Disney World in, like, the nineties. I'm curious if anyone else remembers that. It was cool-ish, but they had an Aladdin magic carpet ride that was an early version of virtual reality, and the headset for that was, like, this big. Yeah, you said it had a tether on it. I think they suspended it from the ceiling to take the weight off your head, and they would just pull it down and put it on you. Oh, that sounds like a motion sickness nightmare. Yeah, probably. I get really awful motion sickness in these things. Oh yeah. I did see one demo, I posted a short video of it, where it was a bungee jump simulator, and you see the person on a platform with the headset on, and then it just flips 'em all the way forward, and I'm like, oh dear, no. Not for me. I tried a driving simulator on the Meta when I first got it, and I felt so sick for like an hour after that. Yeah, I can't do motion in the simulator without me actually moving. I can't do it. The only thing that sort of remedies that is something like the Positron chair. Do you know what that is? Is that like a motion simulator chair? Yeah, something like that. It's shaped like an egg. It's really fancy, and they're a company here in LA that makes it, so if your VR game or your VR cinema has a motion track, it can integrate with that motion track. That's cool. Yeah, that makes sense. Like, I'm usually fine on the simulator rides, like, you know, Star Tours or those kinds of rides, 'cause the thing moves and your body matches the movement. We really can't go past what our physical limitations are as far as motion sickness and some of these things go, and I think a big reason why this stuff still hasn't caught on is that, for the majority of us, it makes us a little bit queasy. It's an issue. Yeah. Even moving around some of these games, they have the motion sickness mode, where it sort of fades the screen out, jumps you forward, and then fades it back in. Oh yeah. So you're not actually moving forward. Yeah. So they're trying to come up with some solutions, but yeah, that does not help a lot of people. I still can't believe the, uh, drone pilots that put the headset on and just fly at, like, oh my God, 50 miles per hour. I don't know how they do it. Yeah, I dunno how they do that either. I see those clips, and they're just sitting there with things flying around, like, whoa, I'm sick just looking at the video of them flying. I dunno how they do it, but I'm glad some people exist who can do that and not feel like they wanna puke. Send them to space. They're astronauts. Yeah. Fighter pilots in the making. All right. Cool. I think that's a good place to wrap it up. Yeah. Links for everything we talked about, especially the video of Jurassic World and the Kodak video, are gonna be at denopodcast.com. Yeah, we got a lot of love from our last YouTube video, some really nice comments, so thank you. And we'd love to hear from you. All right, everyone, we'll see you in the next episode.