Denoised

What Ben Affleck's AI Company InterPositive Actually Does

VP Land Season 5 Episode 9


Netflix acquires Ben Affleck's AI company InterPositive, which builds custom models for film productions. We also cover Corridor Crew's open-source chroma key model that's making VFX keying significantly easier, and LTX's new desktop editor that integrates AI generation directly into the timeline.
--

The views and opinions expressed in this podcast are the personal views of the hosts and do not necessarily reflect the views or positions of their respective employers or organizations. This show is independently produced by VP Land without the use of any outside company resources, confidential information, or affiliations.

Corridor Crew, uh, built their own custom chroma key model

who you interviewed, I think just last week. We all love them. You know, I watch all their YouTube videos.

A few weeks ago, Disney came out with a video that was just like, oh, hey, we actually still have this prism. Which, uh, the Corridor Crew thought was lost to time.

All right. Welcome back to Denoised. Addy, good to see you.

Good to see you too. 

Busy week. 

You too. 

Yeah. All right. The news kind of picked up literally this morning, uh, with a big announcement from Netflix. Netflix has acquired Ben Affleck's AI company that I don't think anyone knew about, called InterPositive.

I was just telling you, I was at Netflix for lunch yesterday. Yeah. And I was sitting with the guys that, that would absolutely know about this. Didn't say a damn thing. 

Secret. Secret. 

Such professionals. They are, 

Yes. Alright, so this is cool. This is interesting. I think we've all kind of seen the clips of Ben Affleck talking about tech and AI, and people were, I don't know, maybe surprised, like, oh wow, he's really knowledgeable about this.

Turns out more than knowledgeable. He's had this company stealthily, uh, running I think since 2022. So, uh, I was trying to dig into what exactly it does, but the gist of it was he was building his own AI company that builds models specific to each film production, creating tools to help filmmakers do additional things with footage that they already shot.

Like just color, wire erasing, uh, changing camera angles, stuff like that.

It seems like a very film-language-focused, uh, model, from what I understand from this press statement. And just like you, I had no idea; this came out of nowhere. It seems like their bread and butter is an actual model, so I'm seeing a lot of similarities between the language of the press announcements and what we covered in the Bryn interview, and how Asteria and Moonvalley are positioned both as a production company and as a technology company.

Yeah, reading this stuff really reminded me a lot of Moonvalley and sort of what Bryn was talking about when we interviewed him, his vision that every film would have its own custom model, and what if you could change angles, and what if you could do this? They definitely were training and building a larger model and kind of trying to make a public product.

Mm-hmm. This thing is definitely bespoke to individual productions, but what was interesting was, uh, reading the Netflix press release and Ben Affleck's story behind the creation of the company. He said: together with a small team of engineers, researchers, and creatives, I began filming a proprietary data set on a controlled sound stage with all the familiarities of a full production.

I wanted to build a workflow that captures what happens on set, with vocabulary that matched the language cinematographers and directors already spoke, and included the kind of consistency and controls they would expect.

I could be, I mean, I'm not an AI, well, I guess I kind of am an AI expert. I could be wrong here.

You're pretty experty.

But even if Ben Affleck or his team shoots content specifically to train a model on cinematic language and all of that stuff, we would need as much data as all of Ben Affleck's lifetime, plus more. Like, we would need millions of hours of footage in order for, uh, a video model to be robust enough to be ready for any kind of production.

So I'm guessing this is not that, and what...

No, yeah, it's not that. And I dunno if you saw the video, there was also kind of an announcement video with him and Bela Bajaria and, um,

Yeah, with Elizabeth Stone, the CTO of Netflix.

Yes. Yeah. And so they got into a bit more detail, but yeah, he's like, I don't want a text prompt model. He was like, I wanted a model that kind of attaches to a film. So it's like, you have to film stuff, then those dailies get fed into the model, and then you have a model specific for that film, based on what you shot, to do additional things, like wire erasing, different angles, relighting people, but based on,

I'm assuming, kind of based on the established look and baseline that you already got from actual production. So it was more like, okay, you film stuff, work with real actors, make stuff, and then the more you shoot, the more you can train a custom model for that film using their product, and then you can do additional things to get more outputs specific to the film. That was my impression. Yeah.

Uh, and we talked about that specific category. I think that's what you and I call a VFX model or a video-to-video model. Yeah. Where you're not doing novel generations, but you're heavily modifying what is coming in as an input.

Yeah.

And I mean, kind of reading what it does, what came to mind was like a combo of a little bit of Moonvalley, a lot of Beeble, which is,

yes,

lighting as their bread and butter, and then maybe some other stuff that you could kind of do with, like, Runway Aleph or Kling.

I'm gonna add one more to that.

I think, uh, the bit that everybody in film is trying to solve is the, um, the gap in dynamic range quality and resolution quality. So,

yeah, 

a little bit of Luma Ray3 in there.

Yeah, 

I'd love to know what the outputs are of these things. Like, are they getting 4K 16-bit outputs that equate to the same footage they might have shot on ARRI or RED or whatever they shot their source stuff with?

Exactly. So if Ben and the team are shooting actual footage, they're probably capturing it in raw formats. And that is really important, 'cause that's how you get the high dynamic range input into the training. And then hopefully, um, they are making changes to the output so that it's also high dynamic range coming out.

Yeah. I'm curious. I would assume they would have to get a high-quality output for this to work in a professional pipeline.

Wanna know something funny about InterPositive?

Yeah. 

So they have a LinkedIn page, and, uh, I just went in there, I was like, huh, I wonder, have they been posting at all? So it says it's between 11 to 50 employees, which is bigger than I thought. Uh-huh. And, uh, they just started posting stuff two hours ago.

Okay. I was gonna say, is that the right one? 'Cause I tried to find a website, and there's another company called Inter Positive, but it was, like, Hungarian or something. I was like, oh, that's not it.

Yeah, yeah. This one is tagged in Netflix's, uh, press release and stuff.

Okay. So it's the real one.

So it's as much of a mystery to the rest of the world as it is to us. 

Yeah. In a statement, he said he had been building it in stealth, and then, surprise, Netflix acquired it.

Yeah. I mean, I knew about Artist Equity, which is his production company. Yeah. I just had no idea he was actually building tech, which is crazy. 

I wonder if it was used, or maybe tested, on anything they did with some of the films, like the recent stuff, like The Rip. I'm gonna guess maybe they tested it, not in the final product.

Just knowing how careful they are about AI usage, if there was any AI usage in The Rip, it would've had like a little disclaimer or an asterisk on it.

Mm-hmm. 

You know, so I don't think The Rip had anything to do with it. And they started shooting that back, uh, about a year ago. I remember some of the virtual production stuff was happening already by, like, June or so of 2025.

Yeah. The other thing I wanted to note on, 'cause when I was mentioning what products it feels like a combo of, you know, I mentioned Kling, I mentioned Runway. But I could see the appeal of this: with Kling and Runway, obviously the datasets are still questionable, depending on where that's gonna fall legally. Whereas this is a clean dataset trained on the stuff you're filming.

I mean, I'm curious, 'cause this doesn't sound like a lot of material, so, like, what are they

exactly

getting? How are they using the very minimal material to, like, build models? Unless it's also using something Nano Banana-type-ish, where we had our kind of showdown, and it's like, oh, you only need a couple reference images to get something good, to make additional shots that feel like something based on a few reference images.

Well, I have a theory. You want to hear it? 

Yeah, let's go. 

Okay. So this is my guess at what happened. When Ben Affleck and Matt Damon were, uh, you know, doing the Artist Equity deal with Netflix, with, uh, the profit sharing from the revenue numbers,

mm-hmm, which is unheard of, right. I think what happened is, it's like, hey, we have this model. It's, uh, fundamentally different from the other models. It's commercially safe, and it's cinema-oriented and highly specific, but we need more data. And guess who has a lot of data, video data? Netflix. Like, well, what if you buy this company, and then you retrain it with the Netflix data, and then it's just ready to go and use for Netflix.

Right. So I think it's not yet ready, but maybe it's strategically getting there. 

Yeah. And also, I don't remember if it was their statement or the Variety article, but it was, uh, noted that they're not looking to turn this into a consumer-facing product, or even, like, a pro product. It's gonna be strictly for internal Netflix productions and uses inside Netflix.

So, right, to your point of training more on Netflix productions, if they have the rights to train on the raw data, that makes sense.

Yeah. I didn't think Netflix was in the model game, but it sounds like now they have one. So I think some portions of the internal Netflix team, as well as the InterPositive team, will now be dedicated to making this model usable and better over time, which is quite the effort.

I mean, I'm sure the teams at Alibaba that are working on Wan, or the teams at OpenAI that are working on Sora, are probably 50 to a hundred people, I'm guessing.

Yeah, yeah. I mean, it is a small team. It also reminds me of, and seems like more of the right approach than, the Lionsgate-Runway deal and the promise there, when it was, like, the partnership where we're gonna train a Lionsgate model that's gonna, you know, kind of spit out stuff for different Lionsgate films.

And it was like, but there is no Lionsgate look. They're all so different. Like, how would that even work? And then it didn't really work out. Uh, but this is the right approach, where it's not a Netflix model, it is a model for each production, based on what you're doing with that production. Right.

It just lives in that box.

And just to kind of make the comparison between Lionsgate and Netflix, I think Netflix is much more tech-savvy in the AI space than Lionsgate was at the time. Yeah. Perhaps that's now changed. But you know, Netflix has been in the AI/ML space for over a decade, right? Oh yeah. Like, all of the algorithms, all of their search recommendations, everything is driven by, uh, neural networks and very early forms of ML.

So as a company, it's built into their DNA to be frontier adopters. And, um, I don't really see Lionsgate like that. Lionsgate to me feels more like a traditional Hollywood studio, which is more about IP retention and financing and backing movies and making distribution deals and things like that.

Anything else with this story? 

I mean, no, there's not really much. It's still really vague. Look, we have not seen a single pixel of an output from InterPositive's models. But the fact that, you know, Netflix now has their own capability to build and train models internally, I think this is gonna open the floodgates internally for using AI-generated content to augment a lot of their productions. Like, that's a no-brainer, right?

Yeah. And I mean, in his, uh, video and stuff, he was talking about, like, you know, what if we could just kind of shoot more? It obviously sounded like a pitch for virtual production. He's like, what if we could just shoot with real actors and stuff in smaller spaces, and not have to deal with the logistical issues of bigger productions, but still have that?

That we can do. Yeah, I mean, that sounded like the overall kind of pitch of what he was trying to do. I dunno, did you see this video?

No. Uh, can you,

does it have a Gemini overview? No. 

Um, yeah, look, Comfy. Got some Comfy shots. Uh, yeah. Well, this was them talking; this was, like, some stuff that they shot before. But I remember they had some demo shots in here of, like, wire erasing and generating other angles. So that was sort of the initial pitch in this, uh, promo video.

But to your point, I think everything's just been experimental. We haven't seen anything actually ship in production with that.

I mean, shout out to Bryn and Moonvalley, right, like, they actually have a model out that you can use today. And again, I don't think this would be public-facing; we probably won't be able to use it. But just to see some of the outputs on Netflix shows six months from now, that would be, I think, ideal. Also, the Netflix bar is really high for content, and I would imagine that, um, you know, if they weren't seeing initial promise of a high-quality model, this conversation would've not gotten very far. So I'm being optimistic here.

No, if they didn't see this was useful in actual productions, whether it's just helping with the effects or color timing, um, yeah, I wouldn't see 'em getting it that far.

So, yeah. I'm curious to see where this goes in the future. And, uh, 

we'll be covering it. 

You know, this could just be, like, another, um, turns-out thing, where it just turns out, like, three months later they make an announcement, like, oh, by the way, Stranger Things season five, we, uh, we actually used this in half the shots.

Right? 

They just tell you after the fact.

That's the best way to announce AI news, 'cause you never wanna do it during the week of the movie release. Then it's gonna affect it, because everyone's gonna be like, oh yeah, I can see what the AI shot is. I see what it is. Yeah. And I was like, no, you're wrong. Actually, it was a different shot.

Next one: AI and chroma key. Corridor Crew, uh, built their own custom chroma key model.

Corridor Crew, who you interviewed, I think just last week. Um, yeah, we all love them. Uh, you know, I watch all their YouTube videos. They have been testing and playing around with Comfy, with AI, for a while now. And them being super highly VFX-oriented, one of the biggest challenges in VFX is background replacement, and one of the challenges in background replacement is getting a really clean key. Um, so if you have somebody with frizzy hair, you know, and they're on a green screen, how do you keep the hair detail but then get rid of the green behind them?

And historically, this has been done, uh, mathematically. So you take the value of the pixel, and then you sort of build edges and detect edges, feather things out. And it's very mushy, and, uh, it takes a lot of hand painting, hand work, to get it right. Well, especially when you see that he's holding this glass of water. Like, is the glass of water gonna be a glass of green water or clear water? Because it should be clear water. So these are all real-world challenges for VFX crews today. So what they did was, uh, they built a custom model that's specifically built to tackle this problem of keying.

And from the results and the videos I've seen, and I've seen other people use their open-source model, it looks really good. Like, it does a really good job keying, where you don't need as much hand input.
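
To make that "mathematical" approach concrete, here's a minimal sketch of a classic distance-to-key-color matte in Python with NumPy. This is illustrative only, not Corridor's model; the key color and thresholds are made-up values you'd tune per shot.

```python
import numpy as np

def naive_chroma_key(frame, key_rgb=(0.1, 0.8, 0.2), tol=0.3, soft=0.2):
    """Classic distance-based keying: alpha comes from each pixel's
    distance to the key color, with a soft falloff band at the edges.
    frame: float32 array of shape (H, W, 3), values in 0..1."""
    dist = np.linalg.norm(frame - np.asarray(key_rgb, dtype=np.float32), axis=-1)
    # Within `tol` of the key color -> transparent; past tol + soft -> opaque.
    alpha = np.clip((dist - tol) / soft, 0.0, 1.0)
    return alpha[..., None]  # (H, W, 1) matte

```

A single global rule like this is exactly why frizzy hair, motion blur, and that glass of water get mushy: those pixels are only partially the key color, so the matte lands somewhere in the soft band and needs the hand painting described above.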

Yeah. And so they trained it on, like, terabytes of data, of keying out

green screen footage,

right?

Green screen stuff, yeah. And so you input a raw green screen frame, and the neural network completely separates the foreground object from the green screen, including highly transparent parts, like motion blur, out-of-focus edges. The model predicts the true unpremultiplied color of the foreground element, alongside a clean, linear alpha channel.

A linear alpha channel, yeah.

Yeah.

And if you go down to the tech specs, it is very, very much VFX-friendly. Like, it outputs, uh, EXR, you know, high-dynamic-range EXR, I think 16-bit. Yeah. The resolution is also VFX-friendly. Like, we're not looking at 1080p images. It's gonna be much higher res than that.
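
Worth spelling out why "unpremultiplied foreground plus a linear alpha" is the VFX-friendly output: it means downstream compositing is just the standard "over" operation, with no green baked into the edges. A minimal NumPy sketch, with hypothetical variable names:

```python
import numpy as np

def over(fg_unpremult, alpha, bg):
    """Standard 'over' composite in linear light.
    fg_unpremult: (H, W, 3) un-premultiplied foreground color
    alpha:        (H, W, 1) linear alpha matte
    bg:           (H, W, 3) replacement background plate"""
    fg_premult = fg_unpremult * alpha        # premultiply explicitly, once
    return fg_premult + bg * (1.0 - alpha)   # foreground over new background

```

Because the model predicts the foreground's true color rather than green-contaminated pixels, the semi-transparent regions, the glass of water, motion-blurred hair, pick up the new background instead of a green haze.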

Yeah. I mean, this is super cool. They mentioned it 'cause I asked them about it. There are two parts of the Corridor thing. If you watched their video, like, a year or two ago on, um, the sodium vapor process, that's right, from Disney, Mary Poppins, with, uh,

Paul Debevec.

Yeah. And, um, the process they used on Mary Poppins to achieve sort of similar chroma keying.

Mary Poppins was shot, whatever, 60, 70 years ago, with the penguin sequence, and she was wearing, like, um, she had the veil, the kinda meshy veil, and the key was great on that, with a lot of motion blur. And that was a different process; great video about that, uh, to go watch.

But it used a very specific prism that, like, Disney only ever had, like, three of them made. And then a few weeks ago, Disney came out with a video that was just like, oh, hey, we actually still have this prism, which, uh, the Corridor Crew thought was lost to time. So I asked them about that, but then I was like, oh, hey, have you guys been doing gray screen?

And then they were like, well, actually, we're training this model on green screen. I guess because what this model does is it also kinda removes some of the tint of green spill.

Yeah, the spill and the tint

on the subject. That solves that issue. 'Cause, um, yeah, we've been experimenting with gray screen, because the AI roto has gotten pretty good. It can just recognize people. And then with gray screen, you don't have the spill issue as much.

Right. 

I was surprised they went with green screen. But I guess with this model cleaning it all up, then it doesn't matter as much.

Yeah. Like, what do you, did you see an advantage with green screen versus gray screen?

Oh yeah, absolutely. You don't want to introduce, um, a saturation in a different tone, you know. Like, avoid green screen as much as possible. Uh, and,

well, no, I mean, so are you surprised they went with green screen, then?

Oh, for this?

Versus, I mean, just in general, versus, like, sticking with a traditional chroma key, a model for a chroma key background, versus going with something that's, like, a gray screen.

Well, yeah, more neutral. Well, I think, um, there's a lot of green screen infrastructure in the world. We're gonna continue to shoot green screen for a long time. If you go to any cyc, you know, it's either painted green or blue. It costs extra to paint it gray, and production's not gonna pay for that. They're just gonna have AI take it out. Um, so yeah, I think, uh, this is the transition move into AI mattes and AI roto chroma keying. So maybe the next version they release is gonna be more suited for white, gray, and green and blue. But for now, because we're coming from the legacy world of blue and green screen, I think we're gonna continue to see some of that momentum.

Yeah, that's a good point. And then this model's built for that, so you don't have to shift your production around to do something gray if that's not available.

Yeah, like, if you go to Film Tools, one of my favorite stores, if you're in the LA area, you know what I'm talking about. It's in Burbank. I could spend hours there. And you go to the green screen section, they have the Rosco green paint and the Rosco blue paint, but they don't have a Rosco gray paint yet. Like, it's not standardized yet?

Not yet. Not yet.

Joey, time to make some money off paint.

Yeah. Going into the millennial-gray chroma key business.

Dude, we should just buy paint from Home Depot, slap our Denoised gray paint label on it, and make millions. It's film quality, and just mark it up by four times.

Yeah, exactly. Well, that's why the Rosco paints are so freaking expensive.

Is there better science behind that paint that makes it more conducive to

yeah, like

filming?

Okay.

I'm sure it is super diffuse. It's not just Home Depot paint.

No, but I think like, uh, with modern tools, if you used Home Depot paint, it could totally work just fine.

I mean, yeah, most of these things would work just fine. That's also why we're doing gray screen. Like, you're not doing straight-up chroma key with chroma key plugins that are looking for green and need that contrast; you're relying on the AI rotoscope tools, and they're all pretty good, by the way.

Why are we talking about green and blue screen? Like, you and I, we come from the world of LED volumes, and this was, like, yesterday's stuff, right? And we're, like, full-circling back into that again.

I think now it's like, you look at something like Beeble, uh, if you can get a lot of the benefits of LED volumes, but without the cost, then it becomes appealing.

No, you're absolutely right. Kind of going back to some of the older technologies, when they work as well as LED technologies.

If you look at this, right, if you look at the Corridor key model, you look at Beeble, and then you combine that with where Ben Affleck and Netflix are going,

mm-hmm, like, it's all heading back into a physical sound stage with minimal setup, minimal lighting. And that's gonna be the future for a while, until fully synthetic people and backgrounds are good enough, which may or may not happen.

Yeah. We may or may not want that to an extent. 

Right. 

For, you know, especially with actors.

Yeah. I mean, I think there is still something lacking when you aren't able to see what you're filming. Like, we did a shoot yesterday, and a lot of it was gray and green screen. It was a weird production, because once it was lit, it was just literally like, okay, we'll just move the camera here. Okay, like, action, go. Okay, we're gonna move the camera here. Action, go. And it's just, like, you didn't have that kind of beat in between setups where you can kind of regroup and think. It was like, oh, we just gotta keep going and keep filming. But also not being able to see what we were filming.

Yes.

Like, I had to be like, hey everyone, okay, just believe that there is, like, a factory behind you. It'll be there, don't worry. It's challenging, and obviously there are workflows, if you level up, where you can kind of build a gray-box 3D world, but then you get into a more complicated pipeline than we had a budget for on this production.

Sure.

But there's ways to do it. So, anyways, I think there are still hybrid-ish workflows with the AI gen stuff, but still having something so your whole production isn't shooting in a blue or gray box all day and, you know, going crazy not knowing. Like, yeah,



I think that's where stuff like what Krea is doing with real-time video comes in, where it's near-real-time enough that you can get, uh, enough of a sense. I don't know, maybe Beeble will have, like, a, you know, 10-frames-per-second real-time version, you know?

So, but we definitely need something like that, something like World Labs, where you can kind of spin up a 3D space pretty quickly. Right. And so, like, at least if you're filming and moving your camera, you're moving around in a 3D space for reference.

Yeah. And, uh, with that stuff, I'm seeing one of the problems that is gonna happen, and I saw it in some of the, um, previews of it online: when you just arbitrarily place a generated 3D background into a composite, and you look at it in real time, because the background is just completely isolated from the people, the perspective is so wrong, the scale is so wrong. And that's what virtual production did from the get-go: get that right. Because the camera is calibrated to the volume, and you know the X, Y, Z coordinates, and you know the units. And so even if you place a shitty Unreal background in there, at least scale-wise and perspective-wise it looks correct, versus putting any generative AI back there. You're like, wait, my eyes. It's breaking. It's so distracting.

You're like, wait, my eyes is. Like it's break. It's breaking like and so distracting. 

Yeah. Like you just don't know if that your scale and everything is correct in there. 'cause it could be completely off. 

Right. 

You're not in a real 3D space where it understands everything. 

Exactly. 
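
A toy illustration of what "calibrated to the volume" buys you: when the camera and set are in known, real-world units, projecting any point is simple pinhole math, so scale and parallax come out physically correct as the camera moves. The numbers below are hypothetical, and the camera is assumed to look straight down +Z with no rotation, purely for simplicity.

```python
import numpy as np

def project(point_m, cam_pos_m, focal_px=1500.0, cx=960.0, cy=540.0):
    """Minimal pinhole projection: world point (meters) -> pixel coordinates,
    for a camera at cam_pos_m looking down +Z with no rotation.
    Known units are what keep scale and parallax correct in a volume."""
    x, y, z = np.asarray(point_m, float) - np.asarray(cam_pos_m, float)
    return focal_px * x / z + cx, focal_px * y / z + cy

# A set piece 10 m away shifts and scales correctly as the camera dollies:
print(project((1.0, 0.0, 10.0), (0.0, 0.0, 0.0)))  # camera at origin
print(project((1.0, 0.0, 10.0), (0.0, 0.0, 2.0)))  # camera pushed in 2 m

```

An arbitrarily placed generated background has no real z behind it, so there's nothing for this math to act on, which is why the perspective and scale drift the way described above.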

All right. That was a tangent.

What were we talking about? The Corridor key model. So, anyways, it's free and open source.

Yeah. 

And they are gonna keep improving it, and they're asking for contributions. So yeah, I'll try it out, but I'm curious to see where that goes, what other models and stuff it leads to. Also, I feel like there's a rise in just kind of people building their own models too.

Like, they built a model, and then, um, what did I see? Oh, PewDiePie built, like, his own LLM.

Oh, I didn't know that. 

Yeah, I think it apparently did as well on a coding test as, like, GPT-5 or something. He just sort of built his own model; not quite sure how.

Yeah, so I don't know about on the LLM side, but, uh, certainly on the video side, what they probably did was a full-weight tune, and there are toolkits for you to do that.

So one of the toolkits is AI Toolkit, a very generic name, where it handles a lot of the housekeeping tasks for you. So you just give it a shit-ton of data, label it correctly, and then you're doing a full-weight tune on an open-source video model.
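
To unpack "full-weight tune": unlike LoRA-style fine-tuning, where most of the network stays frozen and only small adapter layers train, a full-weight tune updates every parameter of the pretrained model. A generic PyTorch sketch of a single step, not AI Toolkit's actual API; the model, batch format, and loss here are placeholders:

```python
import torch

def full_weight_step(model, batch, optimizer, loss_fn):
    """One full-weight tuning step: every parameter of the pretrained
    model receives gradients (nothing is frozen, unlike a LoRA)."""
    inputs, targets = batch          # e.g. video latents and their targets
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()                  # gradients flow into *all* weights
    optimizer.step()
    return loss.item()

# Handing *all* parameters to the optimizer is what makes it "full weight",
# and also why it needs far more VRAM and data than adapter-style tuning:
# optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

```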

Yeah. You just need to have the data, the training data that you want.

Yeah. And it looks like Corridor Crew certainly has it. And it's not just about having it in quantity, but also labeling it correctly, formatting it in the right, uh, resolution, color space, all that stuff. Like, it's a lot of work to prep the data before it goes into training.

And what would it basically need for this?

You would need, like, hey, here's stuff we shot on green screen, and then here's the alpha, the mattes. We need those kinds of pairings. You just need a bunch of that.

Um, I believe so. And then, yeah, broken down into, like, five-second clips, ten-second clips, whatever, of different green screen, uh, scenarios. I think it would just be more green-screen-heavy than not. So 90% of it could be green and blue screen footage, and then 10% of it could be, you know, post-key footage.
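
What those pairings might look like on disk, as a sketch. The directory layout and names here are hypothetical; the point is just that each training example pairs a raw green screen plate with its ground-truth foreground color and alpha matte:

```python
from dataclasses import dataclass
from pathlib import Path

@dataclass
class KeyingExample:
    """One supervised pair: green screen plate in, clean foreground + alpha out."""
    plate: Path  # raw green screen clip (the model's input)
    fg: Path     # ground-truth unpremultiplied foreground color
    alpha: Path  # matching linear alpha matte

def collect_pairs(root: Path) -> list[KeyingExample]:
    """Walk a hypothetical <root>/<take_id>/{plate,fg,alpha}/ layout."""
    pairs = []
    for take in sorted(p for p in root.iterdir() if p.is_dir()):
        plate, fg, alpha = take / "plate", take / "fg", take / "alpha"
        if plate.is_dir() and fg.is_dir() and alpha.is_dir():
            pairs.append(KeyingExample(plate, fg, alpha))
    return pairs

```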

Yeah. Okay. All right, last one: LTX 2.3. Yes, sir. An update to the, uh, open-source model. What have you seen from this?

Yeah, so LTX-2 is, um, doing really well in the open-source community, because you can kind of use it in Comfy, and you can train on it, or you can fine-tune it and stuff like that.

So, you know, let's say we want to do a Denoised podcast that's fully AI generated. You can just train it on all of our previous episodes, and then it'll have a pretty good understanding of where Addy sits and where Joey sits and what they do and what they sound like, all that stuff. LTX-2 was that, and I think it just came out a couple months ago.

2.3 is an update, and it's updating specifically on detail. So LTX-2 was a little bit soft on certain use cases. To me, visually, this does look a little bit sharper. And then also, I guess there are audio improvements, because it's a multimodal model, as well as, uh, vertical aspect ratio delivery.

So you're thinking, like, TikTok delivery, so this covers that now?

Yeah. Okay. And this demo, look, I mean, this cat with a fisheye lens looks pretty sharp.

Yeah. Like, if you look at the whiskers and stuff, as the whiskers move in and out, see, it doesn't get lost. So that's a good test, actually.

Yeah. Cat whiskers are hard.

What else we got? Animation. I mean, I feel like it's used a lot in, uh, animation, right?

Sure, yeah. Because, like, you can train it on a certain style, so I would imagine animation would be up for grabs. Yeah, a lot of anime stuff. There are so many anime examples, it just inundates the other examples.

And then they also launched another open-source tool, and this one caught my attention: uh, LTX Desktop. And so this is basically an open-source desktop app to generate clips with LTX, but also it

looks

like an editor. A video editor. Yeah. So it's like an all-in-one open-source non-linear editor with the ability to instantly generate clips, uh, on your timeline.

Dude, that's crazy that they're just giving this away. This is, like, good IP.

Yeah. I mean, and also I'm wondering, if this is a good foundation of, um, just, like, an open-source non-linear editor, you could just kind of work off it and build off of it, right?

Exactly. That's awesome. Shout-out to Lightricks, man, for giving this out.

This is cool. And I think this is, you know, where more non-linear editors would go, where you can kind of work with your footage and then just generate on the fly as you, uh, are building it out and need more shots, right? So, you know, less of that back-and-forth gap of generating clips and then downloading and bringing them into your editor, and then figuring out what you're missing and going back and redoing it. So yeah, this is cool as well. I think it's cool that they, uh, open-sourced this.

Yeah, I agree. Um, and we need more of this. I don't think you would ever see, like, uh, OpenAI or Google open-source something of this caliber. Like, an application-level ability, yeah.

Google, like, uh, open-sourcing Flow? Exactly. Okay. I mean, they should open-source Flow, and then you'd just still have to have an API key for, like, actually using Nano Banana or Veo. But, ah, I dunno.

Yeah, no, you wanna be behind that walled garden.

Yeah.

Google did just make a, uh, command-line agent interface for Google Workspace, yeah, so that, like, more agents could interact with your email and calendar and stuff, and Google Drive, which was, like, an issue with OpenClaw. So they are leaning into, you know, building connections and stuff for AI agents, which is

interesting.

Anything, uh, anything new on your OpenClaw stuff?

No, not really. It's kind of been,

shut it down,

a busy week. We,

shut it down,

kind of, dormant. Uh, there was a tweet I saved that really kind of, uh, captured it. Oh, here it is. It definitely summarized what the experience has been. This tweet from Craig Hewitt: my current reality with OpenClaw. I want to use it more. I know it's the future, but it's so much less productive than just using Claude Code and Codex. Doesn't mean I'm not using it, and more importantly, I'm trying to build things with it. But yeah, that's been my impression, where it's like, I know this will be the way things go, but it's clunky, and there are more polished products out there that I could do the same thing with. So that's been my experience lately.

Yeah. Just put a pin in it for six months and come back to it. It'll be a whole new thing.

Yeah. 

We're not gonna talk about the MacBook Neo? Like it just didn't happen?

Uh, I was busy.

Come on, man. I didn't really check in on that. Yeah. I mean, from what it looks like, uh, basically Apple built a Chromebook.

Yeah. And it, it's using the iPhone chip as its main core, which is insane. 

Yeah. I saw people complaining about that, but then it was like, that's not for you. It's a Chromebook.

It's an Apple Chromebook. It's for students, and yeah.

It's smart. Yeah, if basically, like, 90% of the stuff you're doing is on the web, you don't need, I mean, is it more of, like, a glorified iPad?

No, it's like a MacBook Air, like, in terms of build quality. The keyboard, I mean, it looks like a laptop, but is this backlit? Yeah, this is backlit. There's no Touch ID, like, yeah, they cost-engineered a couple of things, but it's a solid Mac. Like, it has the,

what is it? 600 bucks? And then with the student discount, you can get it for 500.

So wait, if it has a, what is it? An A16 something?

An A18, which is in the iPhone 16.

Can you run anything that works with Apple silicon, or are you limited? Like, could I run Resolve? No, it would kill it. But are you limited on what apps you could run on it, then?

I'm sure you can install anything you want. Running it is a whole different equation. I mean, MKBHD,

no, I mean, like, could you run Final Cut Pro on it, with its A18 chip, A19 chip, whatever it is?

So it's macOS, but not an M-series, not an M1 or M4 chip. Yeah. So it's macOS. It'll probably let you install it, and then when you load a couple of videos and try to edit, you're gonna run into some performance issues real hard,

I think.

Look, performance, different story. Yeah. I just wanna know, if I actually try to install Resolve, is it gonna be like, this is not compatible with this computer?

Oh, I don't think they're gonna gatekeep like that. I hope not. That would be weird.

No, I just mean, like, you know when they switched over to Apple silicon, and then you tried to spin up the newer software on the Intel machines, and it was like, this cannot run on an Intel machine? Or you try to load a thing somebody built for Apple silicon and you're on an Intel computer, and it's like, you know, we cannot run this. That's more my question, like, is it,

got it,

are you gonna hit those issues? You got me doing a stretch here; I throw my hands up. Yeah. Okay.

I don't know. 

Okay. 

But it's cute, it's cheap, and, uh, I think, uh, we'll see. This is really smart. It's gonna take a lot of the younger folks into the Apple walled garden much quicker than anticipated. Like, from the moment they're high school students, they're already in the walled garden, and, uh, they're just in it for the rest of their lives.

Yeah. And I think a lot of people forget, like, 99% of what people do on the computer now is something in Safari or Chrome.

Something in Chrome, yeah. It's all a window to the world.

Yeah. And Chromebooks have done phenomenally, so, like, yeah. It's a smart move.

Okay. 

Glad we covered that. Yeah, we talked about it. I mean, I'm not gonna buy one. Are you gonna buy one?

Yeah, I'm looking for a laptop, remember?

Uh, I just use Chrome for stuff. I mean, I don't really,

but you, but you need more powerful stuff.

That's why I have the desktop. You got the desktop for that. Okay, okay. Alright. Yeah. Okay. Makes sense for that.

It's good for the podcast. 

Yeah. All right. I can see that. All right, let's wrap it up here before we keep rambling forever.

Can you just edit all that out, the whole MacBook Neo stuff? 

Yeah. 

We're gonna keep, 

oh, come on. We're gonna keep that. We're gonna keep that. I wanted to derail us with MacBook Neo talk. 

Well, 'cause uh, I remember last year we did the whole Apple event coverage. Right. 

Yeah, we did. And then no one cared. And yeah, I think it's only good if they do something Mac Studio or pro-video related.

Yeah. I mean, uh, our viewers, I would love to hear your thoughts on why you come to Denoised. Like, what's your number one thing? It sounds like we curate the AI filmmaking news really, really well, and that's why you come here. But if you come here for general tech news, let us know too.

Yeah, I'd be curious if there's other stuff besides AI that you care about. I, I would be very curious to know what that is. 

Also, um, we wanna know what you think about the new Netflix acquisition. Let us know.

Yeah, let us know. Everything we've talked about is at denoisedpodcast.com.

Thanks for watching. We'll catch you in the next episode.