Denoised
When it comes to AI and the film industry, noise is everywhere. We cut through it.
Denoised is your twice-weekly deep dive into the most interesting and relevant topics in media, entertainment, and creative technology.
Hosted by Addy Ghani (Media Industry Analyst) and Joey Daoud (media producer and founder of VP Land), this podcast unpacks the latest trends shaping the industry: from generative AI and virtual production to hardware and software innovations, cloud workflows, filmmaking, TV, and Hollywood industry news.
Each episode delivers a fast-paced, no-BS breakdown of the biggest developments, featuring sharp analysis, under-the-radar insights, and practical takeaways for filmmakers, content creators, and M&E professionals. Whether you’re pushing pixels in post, managing a production pipeline, or just trying to keep up with the future of storytelling, Denoised keeps you ahead of the curve.
New episodes every Tuesday and Friday.
Listen in, stay informed, and cut through the noise.
Produced by VP Land. Get the free VP Land newsletter in your inbox to stay on top of the latest news and tools in creative technology: https://ntm.link/l45xWQ
Netflix's VOID Model Removes Objects and Fixes Physics
Netflix just open-sourced an AI model that removes objects from video and corrects the physics — and that's only one of several AI updates worth tracking this week. Addy and Joey break down Netflix's VOID model, early leaks of what looks like GPT-Image-2, the underwhelming public rollout of Seedance 2.0 in the US, Google's Gemma 4 open-source local model, and the honest, unfiltered experience of running AI agents for real-world scheduling and production workflows. Plus: Artemis moon mission camera nerding.
--
The views and opinions expressed in this podcast are the personal views of the hosts and do not necessarily reflect the views or positions of their respective employers or organizations. This show is independently produced by VP Land without the use of any outside company resources, confidential information, or affiliations.
Boomers are really cooked now. It's so hard to tell what's real and fake. And if you think you can, I think you are overrating your abilities.

Alright, welcome back to Denoised. Addy, how you doing? I'm doing good. How are you, Joey? I'm good, man. Did you watch any of the, uh, Artemis going around the moon on Monday? I'm a nerd, so yeah, I've been following all that stuff. I had it on all day. Okay, you do? Alright. And they kept talking about how they trained to photograph the moon, and they did simulations. And they kept talking about GoPro, and I kept thinking, this is the best advertising GoPro has had in a while. And Nikon too. The mission commander was like, I only use the Nikon Z9. I was like, that's very specific. Yeah. The main camera was the D5, which at first, when they kept saying the D5, I was like, are they mixing up the 5D? Is it Canon? And then I was like, no, the D5 is their old-school, ten-year-old camera. Yeah. The Zs are the mirrorless ones. Yes. And they did bring a Z, and apparently, well, I'm thinking because it takes forever for stuff to get space-certified, the D5 was probably already certified and really good in low light. That's right. But they said they had to convince them to bring the newer cameras as well. The whole point of a spacecraft is to make the space inside habitable. Can't you just bring anything? Am I too simple-minded to understand this? There was some story, I dunno if it was from the original NASA days, and I also dunno how true the story was, but it was kind of a joke: NASA spent $50,000 to develop a pressurized pen so they could write upside down and in space. Yeah, and underwater. And the joke was, the Russians just used a pencil. Yeah. And that was a joke.
But then there was a clarification: you can't actually use a pencil in space because it leaks graphite, and the graphite particles will float around and get into stuff. It'll short-circuit the switchboards. Yeah, it was a major thing. So that kind of story is why stuff has to be vetted before it goes into space, is my guess. Yeah. I just think they overthink it. Just take a GoPro. Why don't they take a 360? Take an iPhone. They did have iPhones too. Yeah, you're right. Another good iPhone commercial. But I thought it was funny, 'cause the thing they kept dealing with on the radio was glare in the window. They were like, we need to put shirts on the window. I was thinking of those little hood things for 15 bucks that suction-cup to the window and you put the lens in. Oh, a polarizer filter. Hello, hides the glare. I mean, I'm sure these things already have polarizers on them, but I was like, oh, glare, glare's the biggest issue out of all the stuff they thought about, and they couldn't get the little hood thing. But the shots came back and they look amazing, so they figured it out. They look amazing. Yeah. And I was watching a video on YouTube where they were comparing the Blue Marble from 1972 with today's Blue Marble. Yeah. And the internet's going crazy 'cause today's Blue Marble looks more polluted and yellow and gunky. I saw one that was more commentary comparing that to how the film frames, and how everything is just less saturated and flatter today. It's like the Netflix look. Yeah, on Artemis. Yes. Lemme try to find this image. Yeah. I mean, my thought was, the image we were looking at from Apollo was shot on film, brought back to Earth, processed, right? The one we're looking at instantly is the one people were sharing, which was an iPhone photo transmitted over their
wireless signal, so I'm guessing compressed. Sure, sure. I think the dynamic range would hurt there for sure. Um, but the Nikon images that came back, shout-out to Nikon, those are crisp. So the reason the Blue Marble looked gunky and yellow is that it was a nighttime exposure. Yeah. What you're looking at there, so the one on the right I think is the new one, and the left is, yes, 1972. Yes. So the reason the one looks a little flat and gray is that it's a nighttime exposure. That's the Earth at night. So you're saying they're in the shadow of the, oh, because I can see the sun peeking out on the other side. Right, exactly. So yeah, that's how good our cameras have gotten, that we can go to like 57,000 ISO and still get this amazing image. Wait, that's crazy. How is it so bright if this was in the shadow? That's how good those Nikon Z9s are. How is it so blue? Oh, so this was a Nikon. I thought this was one of the iPhone 15 shots. This is a quarter-second exposure, and yeah, I'm such a nerd for knowing this: a quarter-second exposure at 57,000 ISO at f/4. Okay. All right, cool. Thanks for explaining that. That helps a lot. But yeah, this was the meme going around, also comparing how flat the Devil Wears Prada and Harry Potter reboots look compared to the originals. Yeah, that's funny. That's pretty good. The internet always wins with the jokes. Like, bring back more DERs to Hollywood. Yeah, should have fired them all. Let's talk about actual AI stuff. Enough space nerding out. Okay. So Netflix came out with their first open-source AI model, called VOID: Video Object and Interaction Deletion. Basically, if you wanted to remove an object from a video, that's been doable before. But what VOID does is it not only removes it, it corrects the physics for that object being gone from the video. Wow.
So in this case, let's see, what do we got here? They're spinning some tops, and some hands are moving the tops. They remove the hands, and the tops stay the same. This one, they have a weight on a pillow, so the pillow's sinking. They remove the weight, but then the physics of the pillow sinking is corrected, so the pillow doesn't sink anymore. Whoa, that's crazy. The domino one is the most obvious one. This one, uh, I can't rewind this video, but a Roomba was knocking over a couple of dominoes, and they removed like three out of five dominoes. Then the physics were corrected so the domino at the end doesn't fall, whereas in real life it did fall because of the domino effect. That's crazy. This is pretty wild, because it's something that would be common in film: I need to remove this, but I also need to fix all of the physics so it makes sense. And this one sounds crazy: this girl is blending something up in a blender. It removes the girl, but then it also keeps the blender off. Yeah, it's doing a lot more than inpainting. There's definitely some reasoning going on where it's figuring out, for example, what the blender is actually doing, then what it would do without the person there, and then generating that and placing it where the old blender was. Yeah. Let's see, it's doing two passes. VOID's first pass generates a physically plausible counterfactual video with the object and its interactions removed. If the model detects object morphing, then an optional second pass reruns inference using flow-warped noise and stabilizes and cleans it up. So it goes through twice. Yeah. The warped-noise thing is also their own white paper. Remember, we covered that flow one. Yeah. Go with the flow. Yeah. Yes. That's cool, man. Shout-out to Netflix for dropping an open-source model. Yeah, this is an actual, super practical one.
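The two-pass flow described here can be sketched roughly in Python. This is a minimal illustration of the control flow only; every function name, the morphing score, and the threshold are invented stand-ins, not the actual VOID API.

```python
# Sketch of VOID's two-pass inference as described on the show:
# pass 1 generates a physically plausible counterfactual with the
# object and its interactions removed; if object "morphing" is
# detected, an optional pass 2 reruns inference with flow-warped
# noise to stabilize the result. All names here are hypothetical.

def remove_object(video, mask, noise=None):
    """Stand-in for the diffusion model's counterfactual generation."""
    return {"frames": video["frames"], "object_removed": True,
            "morphing_score": video.get("morphing_score", 0.0),
            "noise": noise}

def flow_warped_noise(video):
    """Stand-in for flow-warped noise initialization (the earlier
    warped-noise paper the hosts mention)."""
    return "flow_warped"

def void_inference(video, mask, morphing_threshold=0.5):
    result = remove_object(video, mask)                # first pass
    if result["morphing_score"] > morphing_threshold:  # optional second pass
        result = remove_object(video, mask, noise=flow_warped_noise(video))
        result["stabilized"] = True
    return result
```

Under these assumptions, a clip flagged for morphing gets the extra flow-warped pass, while a clean removal returns after pass one.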
This goes in the bucket of the Corridor Crew green-screen model, yes, the one that cleans up the spill: very practical, real-world things you need to clean up in a video that would cause a lot of headaches to correct, all the physics and everything. And now it can just do it in one pass, and you're working with real footage and material. Very VFX-friendly. It's something we can all understand and try to solve. And historically all these things are hard, right? Like, chroma keying well is hard. And in this case, not just removing an object but reinserting a similar object is also hard. I'm looking at how it compares against other models. Interesting. Yeah, so here's a comparison, like Runway: you say remove the T-Rex, and the T-Rex still falls over. I'm trying to find out what the output potentials are, like what you can give it and what you can get out. I don't really see that. I'm hearing something really interesting on that. What a lot of AI filmmakers are doing now is they have an entirely separate pipeline they're calling a finishing pipeline. Okay. Where they're figuring out what the upscale resolution needs to be, what the color space needs to be, and if there's a dynamic range up-conversion, and all that. And that's an entirely separate thing from generating the frame. We tend to think of those as the same, 'cause we come from physical production, right? The camera sensor resolution is your resolution; everything is just a down-sample of that. Are you saying they're thinking of processing their footage, getting the 1080 output back, and then some pipeline to up-res again? Yeah. It's like a combination of Topaz and Magnific and a bunch of other things to get it where it needs to go. But that's a separate portion of your pipeline.
Are these people, I mean, yeah, that is a big deal, but are they dealing with hybrid footage or real-life footage, or is this more for a fully generative pipeline? I'm guessing it's more for fully generative, but I don't see why it wouldn't apply to hybrid. Because when you put camera footage through this VOID video model, you're getting something with way fewer pixels than what you're putting in. So you're gonna have to upscale again. Yeah. But the thing with the upscale models, the more diffusion-based models like Topaz Starlight, is they can add a little weirdness if you give them real footage. They're really good when you have AI output footage and you're working in a full AI pipeline and you need to go up. But they can still add weirdness if you run them on real footage. They've gotten better. I mean, I'm still thinking, though, and I know this is more of our production world: if you shot with, like, an ARRI or a Blackmagic and you have your original footage in BRAW and you give it to this model, you still wanna stay in that space and have that latitude. Because when you fix your VFX shot, you still might have to do other elements, might have to do some more compositing, have to send it to color. You still wanna have that color range and latitude to grade it with all of the other shots around it. So it's still important. The Corridor model will give you, what, 16-bit EXR outputs and stuff, so you can still stay in that high-end space. I'm curious what the VOID model outputs. I'm trying to find an answer to that. Yeah, I would guess.
I mean, obviously Netflix is looking to use this, or is already using this, in real productions. So I am sure there's a way to put this in a pipeline and not degrade or destroy your source footage and be locked into a shot that has the physics fixed but is now an 8-bit HD video. Let me ask you this: why do you think they released this as an open-source model? I was wondering that. What would they get out of it? The only angle I can think of is attracting talent to come work at one of their labs. My guess is this is just one tiny aspect of an AI-native production. They're putting it out there to see how other people cobble this together or modify it and turn it into something completely different from what they're thinking. So they're getting developer time and production testing time for free if they put it out there and just kind of watch it evolve over time. Mm-hmm. Okay, so they've got this kernel of a thing and wanna see what other people do with it, so they can adapt it to their own stuff or to other internal things they're working on. Yeah. I mean, that's how Linux became what it is today: when Linus, you know, open-sourced all of that, other developers were working on it on their own time, and they would add, hey, Linux doesn't have a clock, let me add a clock feature. And now it's open source with a clock, and so on. And after decades of evolution, Linux became this really enterprise-grade, production-ready thing, all because of individual micro-contributions from developers working on something open source. So in that same sense, you put this thing out there, and the people who are actually gonna use it are gonna fix and modify the things it needs. And it's just gonna happen organically. I can't find an actual number for resolution.
I did some quick searching in the actual paper, and in their conclusion and limitations, under future work: future work could obtain better training datasets beyond rendering engines, the generated video lengths are still in the range of a few seconds, and resolutions could be further improved. So it doesn't say what the output resolution is, but I'm guessing it's either 720 or 1080. If it's nothing to brag about, they're not gonna say it. I mean, yeah, the resolution could be better. Yeah. And it's also probably not the concern for them. They're trying to solve the actual problem of physics-based replacement and inpainting, right? That's the core of the problem they're trying to solve. So, yeah, resolution, that's another research paper, another thing to work on, and then we'll tie those two things together. Mm-hmm. The use case could also be: if you're in post and you're still doing rough cuts or assembly cuts, or even just later cuts, and hey, we wanna see if this shot works without this thing, or this physics thing happening, or this person removed. Do a quick run, drop it in your rough cut, and then just make the determination: okay, does the shot work in the story, or whatever needs we have, before we commit and send it to VFX. You're basically filtering and making sure, yes, we definitely want the shot, we know it works, before sending stuff to VFX that gets worked on and then, yes, a hundred percent ends up getting cut 'cause you don't need it. Oh, I love that. Yeah, that would be really useful. That could be another use, where it's like, yeah, we're not gonna ship it with this, but we just wanna make sure we're using VFX time to the best of its capacity. Sure. Okay, cool. Yeah, I like that. Ship it out into the world and see what happens.
I mean, the same thing happened with the Corridor green-screen removal, where people took it and, within a few days, figured out how to make it run more efficiently and did all sorts of other things with it. Yeah. And a quick shout-out to open-source models in general. I think the Netflix VOID model was built on CogVideoX, if I'm not mistaken. That's an open-source framework you can use. And there's another one called VACE, which Alibaba makes. If it wasn't for these open-source foundations for them to build models on, this stuff would not be open source in the first place. So again, you see where I'm going with this: Netflix built this thing on top of an open-source model; somebody will build something else that's open source on top of this model. Yeah, and it just gets better and better and more specific and more niche. Yeah. Okay, next story. Mm-hmm. Ready? Yeah. GPT-Image-2, potentially. There's some new stuff showing up on LMArena: a new model that was doing some crazy stuff with graphics and text rendering. People suspect it's most likely the next version, GPT-Image-2. I think it's been pulled down since then, but most likely something is coming. I thought it would maybe be this week, but maybe it's next week. I'm already hearing rumors of this being a Nano Banana killer and all that stuff. I mean, where are we at, rumor two here? Possibly better than Nano Banana Pro. Yeah, they're killing the banana. It looks like it does really well at this one-shot YouTube interface stuff, a lot of really complicated screenshot-type layouts and text that looks like real stuff. That's crazy. That's generated, yeah, this whole thing. Wow. This whole screenshot that looks like a YouTube screenshot was generated. And this shot that looks like an iPhone photo.
Like the boring iPhone mall photo of Bath & Body Works. Yeah, this was generated. This map. I don't know why we need this generated, but that's insane. This human anatomy body thing. Wow. Yeah, lemme try to find something that actually looks real. There were some better ones that I saw, like someone made a fake thumbnail inside a YouTube screenshot UI, and the whole thing was generated, but it looked like they took a screenshot of a YouTube video. That's impressive, if these are true. Yeah. I guess there are three models; it had three code names: masking tape, gaffer tape, and packing tape. The one that's most impressive to me is the Bath & Body Works image. I mean, that is reality to me. Yeah, this looks like an iPhone pic of a closed store from somebody who's just bored at the mall. I mean, I can't wait to run my 1970s New York City street test on this. So let's go. Let's go with ChatGPT. This one looks like a shot of handwritten notes on a notepad. Oh, people are doing some comparisons with other models here. GPT-Image-1.5 looks like an AI image. Wow. Wow, look at the lighting on her and the hard shadows. Even the shadow from the photographer on her. Look at that. Oh yeah. That's crazy, dude. Yeah. And the weapon details, details in the background. Yeah, I know. People look not warped. Yeah, I mean, the fence is a little unrealistic, but I'm nitpicking. That is so much closer to reality than the other one. "Flower lop," is that intentional? I think that's the prompt from the person. Okay. Yeah, that's their handle. Lemme see the dirty pantry thing. That's not really this model, it's just somebody's phone. It's from 2023. Okay. This one also claims it was GPT-Image-2.
And of course, these could all be fake. I mean, the Obama-Trump selfie, if you go back, the skin detail there is spectacular. And the Oval Office rendition in the back. I mean, that is the wallpaper for the Oval Office. And also, you know, we could do Nano Banana comparisons, but A, if this is true, and B, just GPT-Image-1.5 versus 2: 1.5 still had the waxy AI skin. Yes. This has fixed a lot of that AI look. So yeah, maybe it is a banana killer, man. Maybe Nano Banana's days are numbered. I don't know, it looks pretty good. People are like, here comes Nano Banana Pro 2. Right, right, right. I mean, at this point we're just taking it as a given that text is gonna come out flawless, which is crazy. Just a year or two ago, text was so impossible. I know, and now we're like, oh, cool, the big text is sharp, but I need every bit of text all the way in the back to be super sharp too. Yeah, like on the bottles. Like, no dude, that's out of focus. The other commentary about a lot of this stuff was just, boomers. Boomers are really cooked now on Facebook. Boomers were cooked with 1.5, with the plastic look. I'd be like, boomers, man. I think that's not fair. I think everyone's cooked. It's so hard to tell what's real and fake, and if you think you can, I think you are overrating your abilities. Yes. Yeah. You're no better than the rest of us now, I think. I'm just suspicious of everything now, suspicious of every image. We're so cooked as a society. I mean, some of the war stuff that's AI-generated, you and I talk about that. It's crazy. Oh, they've got the Lego things coming out of there. There's the fake Netanyahu stuff with the six fingers and the pockets. And the Lego stuff, it's funny, it's entertaining, and it's stylized.
But it's also, it's the craziest thing: it's literally two countries at war doing diss tracks about each other with Legos. Yeah. And they're trying to win the viral meme moment of the day, which is crazy. They're competing not just at a war level, but at a hearts-and-minds level on social media. Yeah, a weird timeline. What a world we live in. All right, I can't find the one I'm thinking of. Oh wait, literally right here. Oh, this is one of ours. That's you? Yeah, VP Land posted this. Well, it kind of got cut off, but this whole UI screenshot thing. You generated them? No, not me, but this was the image I was trying to find. Someone made this, allegedly with GPT-Image-2 on LMArena, and it was showing how in one shot it replicated the entire YouTube interface, plus this image of them as if they're vlogging from the Middle Ages. Yeah, that was the one I was trying to find. I mean, my simpleton brain is thinking this is just a template, a YouTube template. Anytime your prompt has "make a YouTube" whatever, it just goes into the template. It's not even using AI, it's just inserting text into a template. Oh, like it already has the YouTube UI, I know. Yeah, it's more web design than it is generation, you know? Yeah. That part of my brain is just like, okay, cool, it can replicate the entire UI, but you could also literally screenshot it, cut out the image, change the text, and put your image in there. Yeah. What would impress me is if you're like, design me the UI for a new app that I'm trying to build. Exactly, exactly.
And how well it does that, that would be impressive. And my guess is that's where Figma and, is the competitor Canva? and some of their AI tools are gonna be. Yeah. Speaking of the deepfake and brain-fry stuff, this is gonna be a boon for those millionaire-influencer people who show their PayPal or their Stripe account, like, look how much I make in a day, and they show their Stripe dashboard. There's gonna be an instant tool for that: give me a Stripe dashboard where I'm making a million a day. Exactly. Yeah. I mean, I can't believe I'm saying this on the podcast, but I have faked some school notes for my kids' schools before. I think it's okay if you're doing it and not them. Yeah. It's like, they forgot a doctor's note for that day, let me just generate it with Nano Banana. And I do it, and they've accepted it. It's fine. Is that bad? Yeah. No, I think schools have gotten more ridiculous. I remember when I was super young, we went on trips and we would just get my schoolwork ahead of time, and the schools were cool with it. Like, here's the work he's gonna miss, you know, have fun. And now they're so picky about attendance, and it's like, what? Yeah, I think the conspiracy theory is that schools get paid on a per-student, per-day basis. Attendance counts. Oh, really? Yeah. So they literally lose money if somebody doesn't go to school. They're like, no, no, no remote work for you. It's return to office for you, kids. Yes, exactly. Five days a week. We're paying for this beautiful office space. It's about the community here and the camaraderie. We need your children for the money. We need to harvest them. Okay.
Next update: Seedance 2.0 has been rolling out, and every time I would see a company be like, we got Seedance 2.0, in the fine print it'd say, except for the US. Yeah, why the reverse rollout? I think because they had such a splashy launch with Brad Pitt and Tom Cruise and then got into the legal stuff. Everyone threatened to sue; all the studios started to prepare their lawsuits. Yeah, like, hey man. And then they were like, oh, we'll roll it back. In the meantime, they were holding off on rolling it out in the US and started rolling it out in other territories. But they have technically rolled it out in the US now. I messed around with it on Fal. I don't know if other tools have slightly more open access, but it's highly restrictive. It's redacted: if you have anything in your inputs with a human face, it will reject it. So honestly, it's pretty useless right now. It also caps you at 720p output, so it's kind of useless. And the whole "it can generate multi-minute coherent short films" thing? I'm sure that's possible, but not in the version they shipped. It's a 15-second cap. There's no multi-shot prompt understanding, like how Kling has a structure where you can prompt for multiple shots in a single generation. It doesn't do that. A lot of the stuff from the initial announcement, I'm sure it can technically do; they just don't let you do it right now. Well, that's just a big can of disappointments, man. Yeah, and I messed with it a bit, and I contacted Fal, and they said that about the face thing. 'Cause also, when they initially rolled it out, you had to verify your business and promise some things. Okay.
And Fal was like, there should be fewer restrictions next week, we're working on it. Okay, okay. 'Cause they know this is gonna die if they keep this up. I mean, preventing faces, period, is pretty extreme, right? Just any face, it's just, nope. Even with the Sora mobile app, they allowed faces, right? They prevented you if it was an IP face, but here it's just no. I was testing it. What about a synthetic person, a synthetic face? No, any face. That's what I was testing: an image of a fictional human character as an input, and it was like, no. If it has a face, no. Just a no. Yeah. All right, we'll give it more time to sort out the kinks, I guess. Give it more time. Yeah, a pretty extreme rollback, but okay. Obviously, we know what it's capable of; they just haven't put that power into the hands of everyone yet. Yeah. I mean, do you remember when the Tom Cruise-Brad Pitt video came out? Around that time, a lot of the bigger AI YouTubers had early access to Seedance 2. Yeah, that's possible. And some of the stuff they generated was pretty solid. I have not gone down the rabbit hole of, like, lemme turn on a VPN and go to some Chinese third-party tool that has access to ByteDance to get a less filtered version of the model. I'm sure there are ways to dig into that. Yeah. I've only been interested in above-the-board, cleared, legal ways. Right. Also 'cause of, you know, clients and stuff. Yeah, yeah. I mean, you're doing it for commercial reasons; you can't risk it. Yeah. It's something where it's like, yes, we can legally access this through Fal.
We're not skirting anything. But Joey the weekend guy, who's just having a beer and he's on his phone, can that Joey generate stuff through a VPN? That's what I'm saying. I mean, I'm sure Joey the weekend guy, if he had more time, could probably figure that out. I'm sure it's possible, I just have not gone down that rabbit hole. Okay. All right. Yeah, Joey the weekend guy can enjoy his bourbon. Uh, I think once Seedance 2 is out for good, with all the bells and whistles, you and I should do a showdown, like a quick 30-second thing. Yeah. Because right now, with what's publicly available, Kling is still, in my opinion, the best for sure, with the most control and the most options. Seedance isn't at Kling 3's level. Yeah, Kling 3 is very good. And then maybe once Seedance rolls out and you can do more stuff, the next day Veo 4 is gonna come out. Google's just waiting, and they'll drop it. So, speaking of Google, last update: last week they released their biggest open-source local model, Gemma 4. So, "built from the same world-class research as Gemini 3, Gemma 4 brings breakthrough intelligence directly to your own hardware, for advanced reasoning and agentic workflows." Amazing. So it's an open-source, distilled version of Gemini 3? I think so. It comes in a couple of flavors depending on your memory, runs locally, and they have a small version designed to run on smartphones locally. Wow, really? Yeah. Is this what Apple's gonna license for Siri? I thought they were licensing Gemini or something. Right. Yeah, that's pretty cool that they did that. Good point. Yeah, possibly. That makes sense.
I guess for smaller tasks. Like, I don't think the local phone model can do that much, but it would probably be good for just on-the-phone stuff. Like, yeah, create a calendar event or send a text message. Text message, yeah. Very basic stuff. You want something fast, or maybe you don't have connectivity. Or if it has access across all your apps, you can have it do things in those apps, right? Like it'll click for you and swipe for you. And I hope Apple's thinking along those lines. I think that would also be a matter of the apps building something, or Apple developing their own version of, like, an MCP for apps. Yeah. Because for an AI model to load up a website or an app and then look at the interface and then click and then do the thing like a human would, it's very slow and very resource intensive. Whereas if it had sort of like an API, but with MCP, the access points are already defined so that it can access and do things, so it's way, way faster. Right, right. Now that we're talking about that, I would highly expect something like that to happen on the app developer side. For them to be like, hey, app developers, we have like a... I'm sure they're not gonna use MCP, I'm sure they'll make up their own thing, but it'll be something like that for the AI chatbots to access and do things. Right. I think, don't they sort of have that with Siri? Like a way for Siri to access things in your app, if you wanted to define it? I don't use Siri at all, like, so I turn it off. What's the weather? What time is it? Yeah, no. Set my alarm, Siri. Set a timer. Yeah, super limited. So they announced... it's not announced, but the media has identified the successor to Tim Cook. I forget who that is.
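The distinction Addy is drawing, an agent clicking through a UI versus an app exposing predefined access points, can be sketched as a tiny tool registry. This is a rough illustration of the idea, not the actual MCP specification; every name here is made up for the example.

```python
# Sketch of the "predefined access points" idea behind MCP-style tool use:
# instead of an agent visually navigating an app's interface, the app
# registers a small set of named actions the model can call directly.
# All names and shapes here are illustrative, not the real MCP API.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Tool:
    name: str          # the action identifier the agent calls
    description: str   # what the model reads to decide when to use it
    handler: Callable[..., str]


class ToolRegistry:
    def __init__(self) -> None:
        self._tools: dict[str, Tool] = {}

    def register(self, tool: Tool) -> None:
        self._tools[tool.name] = tool

    def call(self, name: str, **kwargs) -> str:
        # The agent invokes a named action; no screen-scraping or clicking.
        return self._tools[name].handler(**kwargs)


def create_calendar_event(title: str, when: str) -> str:
    # Stand-in for a real calendar integration.
    return f"created '{title}' at {when}"


registry = ToolRegistry()
registry.register(Tool("create_event", "Add a calendar event", create_calendar_event))
print(registry.call("create_event", title="NAB meeting", when="Tue 10:00"))
```

The point of the structure is the speed argument from the conversation: a direct call to a declared handler replaces an entire look-at-screen, find-button, click loop.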
Uh, it's like a senior executive at Apple. Yeah. I hope when that person comes online, these are the things that they're thinking about. 'Cause iOS can go from, like, an operating system to, like, a vibe coding platform where you are literally building apps within your phone, all locally, to do the very specific thing that you're trying to do. Yeah. Or even just an easier way to... 'cause they have the Shortcuts tool, which I know is kind of powerful, but it's a little clunky to use. It's clunky. So if it was just like, when I tap this thing, you do this thing, and it's just like, okay, done, here it is, it's built. Don't get Addy and Joey started on Apple talk. I think we're gonna go off on it. We're too big of Apple fanboys. But yeah, no, what you're saying... Look, my OpenClaw thing is sometimes useful. Sometimes I wanna, like, slam my head into a wall. But one of the nice things is when I have it loaded up on Telegram and I just hit the mic button and I'm just like, blah, blah, blah, blah, blah, and then it gets it and it does the thing. Right. That's powerful. It's one of those tastes of, like, this is how the thing should be. This is how Siri should be. Exactly. I shouldn't have to keep doing this through Telegram. Yeah. So, yeah, that's what we would hope they are going towards. And yeah, you also probably have, I don't know, a hundred apps on your phone, right? Like, you don't wanna be going through each one of those and modifying things. Who has time for that? Just talk to it and tell it what to do. Yeah. I will say, update time. This is gonna be a slight rant. Slight. I kind of get it. But in the OpenClaw world, the biggest upset this week, or maybe it was last week, has been that Claude has revoked OAuth use on OpenClaw.
So basically, I have Claude Max, and I was able to sign into Claude Max and use my Claude Max plan with OpenClaw, which solved the cost issue, because then I was just using my monthly plan limits, um, which is great 'cause I didn't have to worry about the API bill jacking up every time I used it. Then they sent out an email being like, we're gonna revoke OAuth access to OpenClaw. You're gonna have to pay per usage on OpenClaw. We'll give you a complimentary $200 in credits to use for it, but once you're out of those, you're gonna have to pay to use it. Pay. Yeah. And I'm also convinced that the free credits they gave are from the dumbest version of Claude I've ever used, because OpenClaw has also gotten so much worse since they switched this, and I don't know why. And my theory, it's crazy: they are giving the crappy version of Claude to OpenClaw. Yeah. Just so you're frustrated and you pay for it. You pay for the nice stuff just so you're frustrated and you pay for it, or you give up and you go use their OpenClaw competitor tools that they've been building. That's right. Is it just called Computer or Machine or something? Well, that's the thing. There are like three different verticals with Claude, and they all do slightly different stuff. 'Cause there's the app. There's Claude Cowork, which can run on your computer. There's Claude Dispatch, which can control Claude on your computer from your phone. There's Claude Code in your terminal, but now you can also connect the terminal to your Telegram and run it there. And then there's remote control, which can control the terminal from the Claude app on your phone. So, as I am explaining this, you're probably like, that's a lot of things. There are a lot of things, and there's no one central, unified... right, like OpenClaw.
Hey, it's just a chat thing I can talk to and it can control my computer, and it's centralized. Claude's stuff is very fragmented. Yeah. Right now they're going after different markets with each different product, but they haven't thought of somebody who would come to the walled garden in the first place. The initial evolution was that they had Claude Code, which is a little bit more techie, and then they had Claude Cowork, which is sort of like a watered-down version of Claude Code for non-techie people, with similar capabilities but in a cleaner... Yeah. It still uses local files and, like, it can access your... Exactly, exactly. You can work with your stuff on your computer, but you don't have to code. It's not a terminal. And the app. Yeah. And then I think, you know, OpenClaw took off, and then they've been wedging features from OpenClaw into the app that were not initially thought of. Yeah. Or thought out. That's my theory. Okay. That these extra things have been added on based on how people responded to OpenClaw, but they weren't part of whatever longer-term vision they had of what the Claude universe would be. What do you think is the near-term future of OpenClaw and sort of at-home agentic capability? When it works, it's good. And it's still the best thing I've seen out there of just like, hey, I wanna do this thing, can you find it? And then it adds a feature or finds an ability or adds onto itself, and can sort of, literally like claws, kind of connect and do things. Um, it just can forget things and go off the rails and not remember what it was supposed to do. Super unpredictable. I've been trying to schedule NAB meetings with it. Oh, and yeah, how's that going? Um, I've had to send a couple apology emails. Oh, shoot. It hasn't done anything, like, super bad. That was my agent, not me, sorry. I know, but it's like, we don't wanna waste time with your booth chatter. Oh my God. No.
The worst it's done is it's jumped the gun and sent a time that I didn't approve, like, it didn't run it by me first. So it has sent a time, and they'd be like, yeah, that's great. And I'm like, no, you have a set of rules. You have to check the schedule, and then check the rules, and then confirm with me, and then you email them. And it'll just email them first, and then I'll be like, well, you didn't check with me. And it's like, you're absolutely right, I forgot to check with you. Sorry, it won't happen again. It's like training an entry-level admin who just joined the workforce and knows nothing about scheduling meetings. They're like, no, no, no, you have to check the calendar. Like, yes, yes, Joey. Yes, next time. Yes, absolutely. Would you like some... Yeah. Or like in Parks and Rec, when Aubrey Plaza was Ron Swanson's assistant, right? Who didn't really care about her job, and she's like, so I would just schedule everyone for March 31st, 'cause I didn't think it existed. And then, but today is March 31st. It just feels like that, yes. The equivalent of, oh yeah, I forgot about that. Like a temp, yeah, who's too smart to do this. But when it works, and I'm like, hey, where's this company's booth? And what slot, you know, do I have an opening? And can we email them? When it works and does that, it's cool. I'm like, ah, okay, great, it's working. So when it works, it's great, but it needs handholding. Mm. I mean, here's a hypothetical question. They have a service, I mean, they've had it for a while, where you can hire a remote admin for yourself for, like, a minimal cost. I mean, they're probably just people overseas. Is it cheaper to do that than to just figure out Claude Code and OpenClaw and all the credits and the tokens?
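The rule chain Joey describes for his scheduling agent, check the schedule, check the rules, confirm with him, and only then send the email, is essentially an approval gate. Here is a minimal sketch of that gating logic; the function names and data shapes are hypothetical, not anything from OpenClaw itself.

```python
# Sketch of an approval-gated scheduling step: every gate must pass
# before an email is produced. Names are hypothetical, for illustration.

def propose_meeting(requested_slot, calendar, rules, confirm):
    """Return the confirmation email text, or None if blocked at any gate."""
    if requested_slot in calendar:                        # gate 1: slot taken
        return None
    if not all(rule(requested_slot) for rule in rules):   # gate 2: house rules
        return None
    if not confirm(requested_slot):                       # gate 3: human sign-off
        return None
    return f"Confirming our meeting at {requested_slot}."


calendar = {"Tue 10:00"}                                  # slots already booked
rules = [lambda slot: not slot.endswith("07:00")]         # e.g. no 7am meetings
approved = propose_meeting("Wed 11:00", calendar, rules, confirm=lambda s: True)
print(approved)  # prints the email text only because every gate passed
```

The failure mode from the episode, the agent "emailing them first," corresponds to skipping straight past gate 3; making the human confirmation a hard precondition, rather than an instruction in a prompt, is what actually prevents it.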
You just hire somebody overseas and... I think it depends on your level of, like, patience. I went into this knowing it was going to be faulty and that I would still, as an experiment, wanna spend the time to try to tweak it and get it to work. I went into it not expecting it to work a hundred percent. Okay. Okay. And I think it comes down to, from your user experience end, what do you want out of this? Yeah. Probably if you're doing this day in, day out and just didn't want to deal with tech stuff, getting a virtual assistant is, you know, probably still easier. I still think that would probably be more expensive. That's a few thousand a month. Oh yeah. Um, yeah, I don't know. Yeah. I mean, just kind of thinking about it long term, like, where are we gonna go with this? Hopefully all the kinks are worked out and we all have our super agents at home that are basically multitasking on our behalf. I mean, when it knows the stuff, it's good. Like, I've gotten all of the videos transcribed, and so if I'm like, find the video clips where we talk about X company, it can do that, 'cause it's already kind of connected to stuff. So when it works in those cases, it's good, and you can kind of see where it's going. The problem I'm having now is, like, OpenClaw knows some stuff, Claude knows some stuff, ClickUp knows some stuff, I use some other note-taker apps. So the problem I have is just trying to unify everything so that, no matter what tool I use, it has the same knowledge and context that all the other tools have. Right. And that's also been a roadblock, just having a central source of knowledge. It's the Wild West. I'm living vicariously through you, so thank you for the update. I'll suffer through this and try to report back.
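The "find the clips where we talk about X company" task Joey mentions reduces, once everything is transcribed, to a plain text search over timestamped segments. A toy sketch, with made-up data shapes, just to show why transcription makes this tractable for an agent:

```python
# Tiny sketch of transcript lookup: once every video is transcribed,
# "find where we talk about X" is a case-insensitive text search over
# timestamped segments. The data layout here is illustrative only.

def find_mentions(transcripts, query):
    """Return (video, timestamp) pairs whose segment text mentions the query."""
    hits = []
    for video, segments in transcripts.items():
        for timestamp, text in segments:
            if query.lower() in text.lower():
                hits.append((video, timestamp))
    return hits


transcripts = {
    "ep101.mp4": [("00:12", "Netflix open-sourced the VOID model"),
                  ("14:30", "Kling is still the best for control")],
    "ep102.mp4": [("03:05", "Gemma 4 runs locally on phones")],
}
print(find_mentions(transcripts, "kling"))  # [('ep101.mp4', '14:30')]
```

Real setups would use embeddings or full-text indexing rather than substring matching, but the shape of the problem, text in, (file, timestamp) out, is the same.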
And hopefully some of our viewers are also living vicariously through you. Yeah. Or if you're doing this yourself too and you've found anything that works, let me know. Yeah, in the comments. Hit us in the comments. Joey will try the wild experiment on your behalf. Uh, I didn't save it here, but Pika launched something that was, um, interactive avatars. Uh-huh. So it was like a live... you can create an avatar and then it can connect to a live feed, and so the avatar can basically act on, like, OpenClaw's behalf. So you could have a Zoom meeting with your AI avatar. Oh, that's creepy. In real time. That's my next thing on my list. My hesitation is, like, the uncanny valley. Ugh. Like, brain and face from the uncanny valley. Yeah, yeah. All right. Good place to wrap it up. Yep. All the links for what we talked about at denoisedpodcast.com. Thank you for a wonderful comment on Spotify. We appreciate you. And if you are on Apple Podcasts, we haven't gotten a five-star review in a while. Hint, hint, drop us one please. Thank you. Thanks everyone. Catch you in the next episode.