
Denoised
When it comes to AI and the film industry, noise is everywhere. We cut through it.
Denoised is your twice-weekly deep dive into the most interesting and relevant topics in media, entertainment, and creative technology.
Hosted by Addy Ghani (Media Industry Analyst) and Joey Daoud (media producer and founder of VP Land), this podcast unpacks the latest trends shaping the industry, from Generative AI and Virtual Production to hardware and software innovations, cloud workflows, filmmaking, TV, and Hollywood industry news.
Each episode delivers a fast-paced, no-BS breakdown of the biggest developments, featuring sharp analysis, under-the-radar insights, and practical takeaways for filmmakers, content creators, and M&E professionals. Whether you're pushing pixels in post, managing a production pipeline, or just trying to keep up with the future of storytelling, Denoised keeps you ahead of the curve.
New episodes every Tuesday and Friday.
Listen in, stay informed, and cut through the noise.
Produced by VP Land. Get the free VP Land newsletter in your inbox to stay on top of the latest news and tools in creative technology: https://ntm.link/l45xWQ
Denoised
Odyssey Shoots Entirely on IMAX, AI-First Companies Walk Back, and Microsoft's AI Fact-Checker
Christopher Nolan's upcoming Odyssey film will be shot entirely on IMAX with newly developed cameras, companies are reassessing their AI investments after disappointing returns, and Microsoft Research has created a new AI model that can identify when it might be providing inaccurate information.
In this episode of Denoised: Christopher Nolan's new Odyssey film is going to shoot entirely on IMAX, companies that went all in on AI are now rolling back, and Microsoft Research's new AI model tells you when it might be BS-ing. Let's get into it.

All right, what's up, Addy? Hey, hey, hey. You see that? HBO, two eights, baby. Not even that. That wasn't HBO, though. That's Apple TV. Oh, you're trying to make a Studio joke, which I get. I'm gonna make a Studio joke after every episode till the show ends. I know. Old-school Hollywood-style buffet? No, the HBO Max thing: Max is now back to being HBO Max. Oh, they've gone full circle with the naming. Yeah, that was a good move, because the HBO brand holds such prestige. It's such a good brand. And I kind of get it, 'cause they were also trying to roll in the Cooking Channel and the other stuff. No, I don't get it at all. The thing that never made sense was when they had an HBO show and then they had a Max Original. Yeah, I still don't understand the difference. I mean, they're different companies, right? Two different organizations. HBO is in New York, and I think Max might be elsewhere. LA, probably, I'm guessing. But I do wanna say their social media team gets an A-plus on the name rollback. They've been posting memes galore, making fun of themselves for rolling back to the original name. That's a Duolingo move.

All right, on to actual news stories. Just got an alert: Christopher Nolan's upcoming film, The Odyssey, which we've known about. Very excited about that one. We knew he was gonna film on IMAX, and he had IMAX develop new cameras for him, but now it came out that The Odyssey is gonna be the first blockbuster film shot entirely on IMAX. That's crazy. It's awesome. And remind me, Oppenheimer was not entirely IMAX? Right, normally, I mean, Nolan has shot IMAX for pretty much all of his movies, but it's usually key scenes. It's a mixture of shooting scenes on IMAX and shooting scenes on more traditional cameras. And then if you see the film in IMAX, it switches from the four-by-three square IMAX frame for the big IMAX scenes, and then it goes to widescreen for the other scenes. So this should be entirely IMAX. That's incredible, because it's such a big uptick in shifting back into film. But this is film 2.0, or rather film 3.0. This is go-big-or-go-home film, now using modern technology to really make film easier to live with and shoot with. I think in the IMAX article I read that it's not only making the camera lighter and less noisy. Yeah, those were two of the big issues with IMAX: they're huge, massive, bulky cameras, and so loud you couldn't really use any of the dialogue being recorded. Right. It's also improvements to the scanning pipeline. And that's interesting. I don't know what the turnaround time is, but I guess it was an issue where he couldn't watch dailies the next day of what he shot. Yeah, I'd imagine a 70-millimeter film frame is probably, you know, 16K in resolution, something super heavy. How long would that take, just to scan a frame? Right, exactly. Even at, like, a second a frame, I'm sure it's longer than that.
And then encoding that, the math, and then uploading it to a server, and then putting it up on a projector or whatever the workflow is. That must take a long time. So now I'm curious what the scanning improvement is. Is it, like, scan it at a lower res and maybe upres it a bit? Or is it AI? Dude, I'm joking. No, but I am wondering: can you scan it at a lower resolution and upres it or denoise it with a computer? Yeah, I would imagine maybe there's almost a structure where you have the 16K scan, 8K scan, 4K scan, 2K scan. Mm-hmm. And the 2K goes to dailies immediately, the 4K will go somewhere, and the 16K will go to VFX, something like that. Right, and the VFX one they don't need right away. That's something that can wait. He wants his dailies, obviously, to see what they're filming the same day or the next day. But it's interesting, 'cause that's a purely digital pipeline, I'm guessing, in a world that is, you know, celluloid film. Right? Yeah. I mean, he's not screening his dailies on a projector in a projection room, correct? Oh, maybe he is. Maybe, and it's loud in there. He's got that old-school little UHF TV thing that he uses as his director's monitor. Oh, I didn't know that. You ever see those behind-the-scenes photos of him where he has it hanging from a necklace? It looks like an old UHF set with the antenna on it, the kind you would use to pick up, you know, whatever's being broadcast on UHF. It's like a tiny CRT monitor. Yeah, a tiny little screen, and that's what he uses as his director's viewfinder. There are some interviews with him about it. I think it's because he just really cares about the framing and the performance. He has a cinematographer; he's not trying to check lighting or anything else. It's just to check the performance and keep things pretty basic. But he's used that thing forever. For somebody at that level of filmmaking, it seems like he's not that into the latest tech, couldn't care less about Sony's new OLED monitor. Don't think he cares about that. But he does care about IMAX making a lighter, portable camera. I mean, that also just shows his power: he's able to get IMAX to develop a new camera for this type of filming. Right. And did you catch the comment from the CEO of IMAX? Yeah, what did he say, that this is gonna be a record year for them? Yeah, in production. Exactly. 2025, I think, is one of their biggest years, and there's a, quote, record number of films releasing in 2025 that shot at least some scenes with IMAX cameras. Or, quote, Filmed for IMAX, which is their other designation, which I think is if you shoot on, like, a digital 65, or on specific cameras that are big enough to project on an IMAX screen. Yeah, so it's not technically shot with IMAX cameras, but it's sanctioned, shot for IMAX, for IMAX screens, and they kind of follow IMAX's pipeline and their quality standards. It's such a significant shift in moviemaking over the last decade, right? Because there's a clear differentiator here. And you saw, you know, Sinners in IMAX, right? Mm-hmm.
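A quick back-of-the-envelope sketch of the scan math being guessed at above. Every figure here (the frame resolution, bit depth, and scanner throughput) is an assumption for illustration, not a number from IMAX or from the episode:

```python
# Rough numbers for scanning large-format film, all assumed for illustration.
FRAME_W, FRAME_H = 16_000, 12_000   # assumed ~16K scan of a 15-perf 65/70mm frame
BYTES_PER_PIXEL = 6                  # assumed 16-bit RGB (3 channels x 2 bytes)
FPS = 24
SCANNER_MBPS = 500                   # assumed sustained scanner throughput, MB/s

frame_bytes = FRAME_W * FRAME_H * BYTES_PER_PIXEL
frame_gb = frame_bytes / 1e9
scan_seconds_per_frame = frame_bytes / (SCANNER_MBPS * 1e6)

print(f"One frame: {frame_gb:.2f} GB uncompressed")
print(f"One second of footage: {frame_gb * FPS:.1f} GB")
print(f"Scan time per frame: {scan_seconds_per_frame:.1f} s")
print(f"Scan time per minute of footage: {scan_seconds_per_frame * FPS * 60 / 3600:.1f} h")
```

Under these assumptions a single frame is over a gigabyte uncompressed, and a minute of footage takes on the order of an hour to scan at full resolution, which fits the hosts' hunch and makes the tiered 2K-dailies / 16K-VFX routing floated above a sensible design.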
Like, you know for a fact that that is worth going to a movie theater for. Yeah, that one specifically I sought out: a dedicated IMAX screen with 70-millimeter projection. Which is hard to find. There were only about ten in the world that were playing it. So it's pretty cool. I think it's good for Hollywood in general, and big moviemaking in general, to have that coveted window of theatrical release actually have a premium look and feel. Mm-hmm. Yeah, it's good for the theatrical experience, 'cause it gives you a reason to go see this in the theater, to have an experience that you just couldn't match at home. Yeah. I mean, Sinners is a big example. Then you've got Mission: Impossible coming out, Filmed for IMAX, kind of the same experience. And then also the Formula 1 movie coming out. Oh, yeah. It has Filmed-for-IMAX scenes. Yeah. I've been wondering: Christopher Nolan's been shooting IMAX for years, but then Ryan Coogler makes a ten-minute video explaining IMAX and suddenly everybody's obsessed with IMAX. And he's just like, wait, is that all I had to do for everyone to get excited about IMAX, make a ten-minute video explaining what IMAX is? Yeah, dude. I want the Christopher Nolan video. But, you know, a rising IMAX tide lifts all boats. Lifts all filmmaker boats.

Yes, and there is other news on the film front. What do we got? So M. Night Shyamalan, one of my favorite directors, for better or worse. Look, you gotta give the guy credit that his movies are unique and don't follow formulaic approaches. Some of them may have been misses, but some of them certainly have been hits. I think Unbreakable is one of his most underrated films, for sure. Even the last one, with Josh Hartnett. Which one? Yeah, the stadium one. Shoot, I'm blanking on the name, but it's a one-word name, isn't it? Here? There? No, something like that. Is it Here? It was a pretty well-thought-out plot. I think it kind of fell apart in act three, with the twist that he's famously known for. I think he became a prisoner of his own success, because there became this expectation of, what's the twist gonna be at the end? Yeah. So Shyamalan is shooting his next movie, called Remain, on VistaVision, eh? Yeah, VistaVision is making a comeback too. My personal preference: if you're gonna shoot on film, shoot it on VistaVision. It's way cheaper than IMAX, and it gives you a more exaggerated film look than IMAX does. Cheaper because you can just shoot on 35-millimeter stock. Yeah, you don't have to have the special 70-millimeter stock. But because, I guess, of the chemistry and some of the internal mechanics of shutter angles and all those things, it really gives you an OG Hollywood vibe in your end result. I feel like I should know this: is VistaVision still an existing company, or is it just a matter of having to source these old camera bodies? That's a great question. I don't know if they're still an active company, or if it's like finding a Bolex camera or something that still functions. Yeah. Did you know that Alfred Hitchcock shot a bunch of stuff on VistaVision? I'm not surprised. It was a big format in its day. Right, the '50s, '60s. Yeah.
I'm super happy that all of this is making a comeback, with the added benefit of modern VFX, which makes this so magical. Yeah. And it kind of shows that there can be multiple lanes. You can have these big tentpole blockbuster films, throw everything at them, and bring on these cool celluloid cinematic extravaganzas. And then also, you know, we have digital, so we can make a lot more stuff and stories and films. You can have both. It doesn't have to be one or the other. Yeah. And you can have somebody like Zack Snyder, who only uses REDs. Mm-hmm. And his movies have, you know, high-frame-rate sensors, a big RED. Yeah, high-frame-rate action sequences, which are perfect for digital cameras. Or slow motion. Or Cameron, who pretty much has to shoot digital for the crazy stuff they're doing. Yeah. So you just have more tools, more options. Obviously we're shooting a lot less film, but it doesn't mean it has to go obsolete. No, absolutely. If anything, I think this is giving film a second life, and this is something we've covered on the show in the past and a space we're gonna watch closely, because I think it directly ties into the health of the film industry going forward. Like, how well celluloid film does, and how well those movies perform, will naturally dictate how well the whole industry performs. Yeah, for sure. I've seen other stories, I don't know which film specifically did it, of a film shooting digital, then printing it to film and re-scanning it, to get basically the ultimate film emulation. Yeah, sure. I don't entirely know why. I mean, I kind of get why you would do it, 'cause you could take your final edit and just print that to film, and you're not spending a bunch on film stock. Yeah. At some point, though, the grain and the film artifacts start to be too much, and you're like, okay, this is distracting from the imagery I need to follow the story. Oh, like, it takes you out of it? Yeah, exactly. Have you had a movie like that? I've been on a binge of older '70s and '80s movies, and have noticed, you know, you can see the film grain. A lot of those films were shot with faster stocks, so the grain is bigger, and it's cool, you see it. It didn't really bother me. It kind of ties into the feeling of the period when it was shot. But if I saw it now, I feel like it would take me out of it, unless it was intentional, like it's supposed to be a '70s film, or a film that takes place in the '70s and has that feel. Okay, yeah, I agree with you. Have you seen any? I can't think of anything specific I've seen lately that was shot in modern times but has a noticeable amount of film grain. Nothing that immediately pops to mind. But it just goes to show the visual language that we associate with film, because it's been over a hundred years of it. If you don't conform to it, you're not making a film, which is so important for AI filmmakers to understand. As much stuff as we can do in AI, ultimately it has to be a visual language that we're familiar with.
That's not just the film grain and all the artifacts, but also the composition, the lighting tricks, and all of that. If you look at what Roger Deakins does when he lights a set, or what Greig Fraser does when he frames up with a particular lens: they're doing that because that's how it's been done for a hundred years, right? These guys are classical cinema filmmakers, and if somebody just comes in and makes an AI film, it has the potential to look too much like a video and not like a film. That's the danger. Yeah, or it's just the average of all the stuff it's trained on, giving you the average output. Yeah. And one of the things I notice a lot with AI, quote-unquote, films is they just can't get shutter angle right. The motion blur is typically missing, or it's exaggerated, or it's not physically accurate. Or even just the speed at which characters move: sometimes it's normal, sometimes it just gives you slow motion all the time. It gives you that 300 look, where everything is slow motion but the dialogue is real time, so your brain doesn't know how to comprehend it: this person's moving in slow motion but talking in real time. Mm-hmm. So I think those are some of the big things to watch out for if you're making a film with AI.

Last thing about film, 'cause it just reminded me of something else: Technicolor. Yes, the original film stock, not the company. Not the company. Because the original Technicolor was three separate film strips, right? Like Wizard of Oz style, three color processes. I saw someone recreate a DaVinci Resolve node pipeline where they basically take the image, split it, and process it with three separate colors, to kind of recreate the Technicolor look using Resolve. I was like, that's pretty neat. And it gives you that very vibrant, saturated color look. Right, exactly. That's what it was known for, because I think at the time the film projection technology just didn't give you that saturation and punch, so they almost had to exaggerate it with the film stock. That's my guess. Yeah. Okay, enough about film. Next story. We could talk about it forever.

All right, next one: AI. So there are a couple of headlines going around about how Klarna, the buy-now-pay-later company, a few years ago made an announcement that they were gonna be an AI-first company, which is what we just talked about with Duolingo, but they did this a number of years ago. And now they're kind of rolling that back, saying the AI didn't deliver on what they thought it would, and they're going back to hiring people. I think a lot of the AI they were trying to implement was for their customer-support chat-agent-type workflows. It was also very early on. Yeah, that's what I feel like when I see these headlines saying, ah, it doesn't work. I'm thinking, yeah, if you tried to jump into this two years ago, it was pretty crappy. Like, the difference between ChatGPT 3 and 4 is enormous. I mean, 3 was kind of cool-ish, but it still needed a lot of hand-holding, and to say, we're gonna go all in on this... yeah. And I think OpenAI just launched 4.1, which is supposed to be even better. Yeah.
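A side note on that shutter-angle point: the "film look" motion blur the hosts describe falls out of one small formula. The sketch below just walks the standard cinematography rule of thumb; none of it is specific to any AI tool mentioned here:

```python
# Shutter angle -> per-frame exposure time (the motion blur baked into a frame).
# The classic "film look" is a 180-degree shutter at 24 fps, i.e. 1/48 s of blur.

def exposure_time(shutter_angle_deg: float, fps: float) -> float:
    """Seconds of motion blur captured in each frame."""
    return (shutter_angle_deg / 360.0) / fps

for angle in (45, 90, 180, 360):
    t = exposure_time(angle, 24)
    print(f"{angle:3d} deg shutter at 24 fps -> 1/{round(1 / t)} s exposure")
```

AI video that implies no consistent per-frame exposure time, or mixes slow-motion movement with real-time dialogue, breaks exactly this relationship, which is part of why it reads as "video-ish" or physically off.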
Which, also, I could not tell you how that's different from, like, 4o. 4o, the image generation one? No, 4o is also just a model. Right, it's a model, yeah. They have, like, six different models, and one's the reasoning model. They've gotta clean up their naming. Yeah, they know this, they're aware of this. I could not tell you what the differences are. I mean, 4o is kind of the general one, but yeah, 4.1.

And then there was another study that came out. IBM did a sort of AI study on CEOs that have been using AI, and the study said that only 25% of CEOs report that their AI initiatives have delivered the expected ROI over the last few years, and only 16% have scaled enterprise-wide. So that's really interesting: one in four CEOs who implemented AI in their company thought it delivered, which means 75% of these CEOs thought AI did not deliver. Yeah. This is the AI bubble contracting, right? Yeah. I mean, again, this is over a few years, so if you were trying to jump in all-in a few years ago, I don't think it was there. It wasn't there. It's getting there now. I think now it's pretty good for some stuff. Like, maybe you've got a problem you're trying to solve, and you see if you can solve it with AI first before you bring on a contractor or hire it out. Yeah. And also, I'll just say this with a caveat: IBM, as an AI company, my guess is a lot of these are IBM customers, B2B customers. And IBM hasn't really delivered on really outstanding AI products, not that I'm aware of. I mean, I'm thinking of Watson, way before OpenAI. Right, and they were pushing that when it was on Jeopardy and stuff, in the supercomputer era. Like it was beating Magnus Carlsen at chess and all these things. But as far as an LLM that everybody's using, IBM hasn't delivered on that, and I don't know if that's even their market. Yeah, I don't know what they're doing. They never come up in conversation. So looking at this through an IBM lens is, of course, a little skewed. I would say that, yes, in general, the early, early adopters of AI probably haven't seen the results they were expecting. Having said that, if you repeat that same experiment today, with today's capabilities and people who know how to implement it, I bet the results would be very different. Yeah, looking at the rise of agentic AI, connecting all the dots, and more tools. I think if you started this today... So the headlines being like, oh, AI is not delivering: yes, if you started this a few years ago, you're gonna have problems. Also, if you look at the big trends at companies like Meta and Google, if you look at the AI use cases across the board, the number one use case seems to be coding and software development. Uh-huh. So if this study was done with that lens, that number would probably shoot up higher. You have so many AI-powered IDEs and software development tools now. I mean, that stuff is so good, Joey. Yeah, it's scary good. I'm not a coder, so it's hard for me to judge those tools, but I keep hearing about 'em, seeing things about 'em, as we've talked about.
I think a while ago I tried to dabble and make some apps, and I kind of, sort of did it, but I didn't know enough to debug it, or even how to deploy it. Well, I think I was sharing with you, I'm trying to build an iOS app using ARKit, which is Apple's native SDK for tracking humans and things like that. You still have to give it an Xcode environment and a test iPhone, and connect the two things together. You have to know how to do that. So yeah, once you're past the basic level of requirements, then you let the AI kind of take over. Yeah, and I can for sure see that if you're already a programmer, or know this stuff, and you harness it correctly, you can supercharge your abilities. Exactly, and you just do stuff quicker. Yeah. But you need to have that foundational knowledge. Mm-hmm. The scary thing is, if this is all you know, or all you learned on, what's the future gonna be like when you don't know how to troubleshoot it, or how this is actually working under the hood? Well, it's gonna be an aging population of senior developers who have the traditional knowledge, and as they start to die out, we lose chunks of human knowledge. Maybe at that point we'll have a way to, like, download brains, right? I think we will, and then we can just download their knowledge. Yeah, I hate to go off the deep end here, but I think I'll do it. Supposedly, if you're in your 30s, 40s, or 50s now, the chances of you living a hundred-plus are extremely high, because of all the biomedical stuff that's happening to reverse aging. Right, thanks to AI, and thanks to Bryan Johnson. Well, he's just a guinea pig, really. Thanks to all the research scientists that are working on it. I have mad respect for Bryan Johnson. He gets a lot of hate, but this guy wants to spend all his money figuring out how to live longer and share it with the rest of us. Like, thank you. My counter to that is he's only been doing this for a few years. Let's let him live to a hundred. Right. And he did tweet out something like, I'm probably gonna die in the most ironic way possible, and I hope you all enjoy it. Yeah, he knows it too. Like, he could die in a car accident, and all this would be no proof against that. Right. Anyway: AI is powering a lot of protein synthesis and molecular-structure fitting. Jensen talks about this. Disease detection, yeah. Not just curing cancer, but also reversing aging. Mm-hmm. So we could live to be over a hundred, and in a healthy way. And be healthy, right, not in a wheelchair. So we're gonna be safe, because we'll just live longer, so we'll still be around. The senior developers will be around for a while, so we don't need to train up new developers. Okay, so our safety net is we're just gonna live longer. Yeah, we're gonna correct a negative with a negative. You thought about retiring? Tough luck, you're gonna stick around for another, what, seventy years, guys. This podcast is gonna be on for a long time, so brace yourselves.

And speaking of AI and accuracy, we've got... so what is this? Yeah.
So let me reframe this for you. We use ChatGPT every day of our lives, you and me, and sometimes you ask it really difficult questions. Mm-hmm. Does it ever come back to you with, I'm sorry, I don't know that one, Joey? No. I've said that it's confidently incorrect. Cocky, almost. It will tell you something and sound very certain without ever being quite sure. I mean, things have gotten better since the models could access the web, but before, especially in the 3.5 days, it would just tell you stuff, and people would use it as a search engine. We have the stories of those attorneys including references to cases that don't exist. Yeah. So no, it does not verify itself, and it doesn't tell you when it's unsure, either.

Microsoft Research released a new model called Claimify. It's LLM-based, you know, just like ChatGPT, but it has a couple of added features that make it a little more accurate. One of the things it's really good at is called disambiguation: when the answer is vague, it'll actually force the LLM into a more precise answer. The other thing it does is decomposition: if it's blurting out a bunch of quote-unquote facts, it'll break them down and verify each one independently, and it creates context around the answer as well, so it'll probably ask the user for more context to shape its answers. And in its results, it shows that Claimify strikes the best balance between including verifiable content and excluding unverifiable content, and that Claimify is least likely to omit context critical to the fact-checking verdict. Yeah, I'm looking at the paper, and the thing that stands out to me is they had ChatGPT, the 4o model, create an article, something like "provide an overview of challenges in emerging markets," and then they ran it through Claimify. And it seems sort of like a high-end fact-checker, because it analyzed each statement, flagged what the issues are with each one, and then verified it. Is it too vague? Is it just factually incorrect? That's exactly it. So it has a sequential block within the algorithm: it goes through, splits up each of the claims, and then for each claim runs a disambiguation process and a decomposition process, and then gives you what it thinks is the right answer. Yeah. It's really pertinent in the world we live in, with a lot of fake news and just a lot of inaccurate information, especially since there's a lot of AI output out there, and who knows if it's true or not. And then the problem is it gets published, and then it gets cited by something else. It's the self-reinforcing circle of inaccuracy. This is definitely needed, especially if you're looking at Microsoft in terms of Bing, you know, a search engine result. There, you would have to have correct data. If you're having a personal conversation with ChatGPT, it's maybe not as important. But if you're saying, write me something about this, or I'm trying to research something... Right. I think Bing is probably what this is for and why they developed it. That's a good point, 'cause there are also so many examples of Google's AI search summaries providing, like, the craziest inaccurate summaries from the results. A hundred percent.
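To make those stages concrete, here's a schematic sketch of that flow. This is not Microsoft's Claimify code or API; the stage names follow the paper's description, but every function below, including the stubbed LLM call, is a hypothetical stand-in so the script runs on its own:

```python
# Schematic claim-checking pipeline: extract -> disambiguate -> decompose -> check.
# All functions are illustrative stand-ins, not the real Claimify implementation.

def llm(prompt: str) -> str:
    """Stand-in for a real LLM API call."""
    return f"<model output for: {prompt[:48]}...>"

def extract_claims(answer: str) -> list[str]:
    # Naive sentence split; the real system uses an LLM to select only
    # verifiable statements and skip opinions or hedges.
    return [s.strip() for s in answer.split(".") if s.strip()]

def disambiguate(claim: str, context: str) -> str:
    # Force vague wording into a precise, self-contained statement.
    return llm(f"Rewrite '{claim}' unambiguously, given the context: {context}")

def decompose(claim: str) -> list[str]:
    # Break a compound statement into independently checkable facts.
    # Stubbed here; a real pass would return several atomic sub-claims.
    return [claim]

answer = ("Emerging markets face currency risk. "
          "Several grew faster than developed economies last year.")
for claim in extract_claims(answer):
    precise = disambiguate(claim, context="overview of emerging-market challenges")
    for atomic in decompose(precise):
        print("CHECK:", atomic)
```

A real pipeline would make all three stages LLM calls and hand the atomic claims to a retrieval-backed verifier; the point is only the shape: extract, disambiguate, decompose, then check each piece independently.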
So I could definitely see this being used for, and it's probably why they developed it, running on Bing searches and summaries to make sure they're not putting out crazy stuff. Yeah. And it's really nice that, you know, we've gone through the last few years of the basic building blocks, right? We got introduced to ChatGPT, and ChatGPT improved, and now we're seeing a level of refinement. We covered Sakana AI the other episode, which is adding a layer of synchronization and time to generative models, and now Microsoft Research is adding a layer of verification, a blue checkmark for an LLM. I think this is the most interesting time, because now we're seeing this stuff really come to fruition in a useful way. I'm wondering if you can give it the sources you want it to check or verify against. Because the issue is, I don't want it to go find some other webpage to check against, since maybe that webpage is from a disreputable source. Can you just verify this against our internal documents, or this PDF report, or verified news sites like CNN or The New York Times, sites that already have their own internal fact-checkers? Yeah. This could be a huge time-saver and also just help verify stuff. I mean, I would use this on stuff to double-check. You use ChatGPT's deep research a lot, right? Is it accurate? Most of the time. I mostly use Perplexity, because Perplexity does really well with the answers it provides: it has citation links to where it found things. So if I need to double-check, is this accurate, or where did you find this, the link's right there. I can click it, and it takes me to the website where it's getting that information. And so it's pretty accurate, on the dot. Yeah. But I'm also not doing the most intensive market-research-type stuff. Yeah, if you're writing a paper for the United Nations... And also stuff where, legally, you cannot be incorrect. You cannot make stuff up or be wrong. Or if you're an actual journalist at the Associated Press and you're using AI, you've gotta use something like this. Yeah, and it can also definitely speed up the fact-checking process as well. I don't think people are aware of how that works, 'cause I've been on the other end, working with large media outlets, and there are people, fact-checkers, whose dedicated job is to run through all the articles, and they'll flag hundreds of lines. They're really good, very thorough. They'll flag everything and ask you, you know, prove this source, prove this fact. Yeah. A news outlet depends on the accuracy of its reports, both for being a legitimate news organization and legally. Right, it's good business practice not to put out inaccurate information. So yeah, this is a great find. I'm glad to see this. And I'm guessing maybe other models can just roll it in.
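On that wish to pin verification to sources you choose, here's a minimal sketch of the idea. It assumes nothing about how Claimify actually handles sources; the overlap scoring, the threshold, and the document names are all made up for illustration:

```python
# Minimal sketch of source-restricted verification: a claim is checked only
# against documents the user explicitly trusts (internal docs, a PDF's text,
# a vetted outlet). Plain word overlap keeps it dependency-free; a real
# system would use embeddings or an LLM judge instead.
import string

def words(text: str) -> set[str]:
    """Lowercased words with surrounding punctuation stripped."""
    return {w.strip(string.punctuation).lower() for w in text.split()}

def support_score(claim: str, document: str) -> float:
    """Fraction of the claim's words that appear in the document."""
    claim_words = words(claim)
    return len(claim_words & words(document)) / max(len(claim_words), 1)

def check_against_trusted(claim: str, trusted_docs: dict[str, str],
                          threshold: float = 0.6) -> str:
    best_name, best_score = None, 0.0
    for name, text in trusted_docs.items():
        score = support_score(claim, text)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return f"supported by {best_name} (overlap {best_score:.0%})"
    return "not supported by any trusted source; flag for human review"

docs = {
    "internal_report.pdf": "Q3 revenue grew 12 percent on strong ad sales.",
    "ap_article.txt": "The company reported 12 percent revenue growth in Q3.",
}
print(check_against_trusted("Revenue grew 12 percent in Q3.", docs))
```

Word overlap is a deliberately crude stand-in, but the design point survives any smarter scoring: the trusted-source list is an input the user supplies, not something the checker goes and finds on its own.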
We're seeing a lot more of what I call cross-membrane stuff. Like, for example, you saw Adobe just started to offer ChatGPT as a generator within its ecosystem. We're gonna see more models being used across products, 'cause I think right now the race is, how can we actually use this at an application level? So, for example, just giving an example, in Photoshop you're trying to do this thing with AI, but Firefly is just not high-quality enough. Can you just check a box and use Flux? Yeah, and I think people are getting better at knowing what models or what tools are better at specific tasks. So are you gonna need to use Claimify for everything you run? No, probably not. But if you're writing a paper or something, then yeah, you'd wanna run it through that. Yeah. What I worry about, though, as the AI industry progresses: if the money is not in the models, then why spend billions of dollars on the models? Yeah, I know you've talked about this before, but I still think the money's gonna be in, maybe not the $20-a-month products these companies offer, but the API calls, right? All of the backend, everything that's running through these models. Maybe it's that. Yeah. I just hope we don't get to a world where the models stagnate and we're not incrementally making them better because the economics aren't there. Or maybe, you know, it's a DeepSeek situation and you can just do more with less. Or maybe it's training a model toward more of a specific task. You know, if I'm trying to do general stuff, ChatGPT's pretty good at it. If I'm trying to throw a bunch of data at something, Gemini has a big context window, so it's pretty good at that. If I'm trying to do more writing-type stuff, Claude is good at writing. So there's already a specialization. Right, and it's probably gonna go further: more highly specialized going forward, and less general-purpose. Yeah. I mean, everything changes so fast, but I think we're getting to where all the large models are a pretty good baseline right now, and it's about figuring out the specialized tools and applications for them.

All right, good place to wrap it up. Yes, sir. All links for everything we talked about, as usual, at denoisedpodcast.com. We have one comment on Apple Podcasts. We'd love it if you could throw us a comment on Apple Podcasts; a positive one would be highly appreciated. Yeah, leave a five-star review, that's the easiest way. And if you feel like leaving a review, please leave one. You could ask ChatGPT to write your review and just copy and paste it. Something nice. It helps us a lot with the algorithms. All right, thanks, everyone. We'll catch you in the next episode.