Denoised

Michael Keaton's Google AI Film + The Company Producing 5000 AI Podcasts a Week

VP Land Season 4 Episode 61

Addy and Joey explore Google's collaboration with Michael Keaton on 'Sweetwater,' an AI-integrated short film, and examine a company planning to produce 5,000 AI podcasts at just $1 per episode. They also assess Meta's new Ray-Ban smart glasses and discuss the plateauing intelligence of large language models.


--

The views and opinions expressed in this podcast are the personal views of the hosts and do not necessarily reflect the views or positions of their respective employers or organizations. This show is independently produced by VP Land without the use of any outside company resources, confidential information, or affiliations.

Welcome back to Denoised. We are back in the studio. We're back. Good to be here. See you again. Yeah. In person, not virtually. Yeah. It's been months. I know. Just weeks. Yeah. It's been a few weeks, but it feels like months. Yeah. All right. Got a couple stories today. Mm-hmm. So, uh, yeah, let's get into it. First off, I got an important question for you. Have you been following the, uh, Amazon Prime saga of The Summer I Turned Pretty? Please enlighten me. Wow. This is already interesting. Well, the grand finale was last week, and now, you know, you gotta figure out if you were Team Jeremiah or Team Conrad, who got the girl at the end. Okay. Oh, okay. No, no. So you've not been following along for the past, like, three months? I'm not the demographic for that show. Let's just say that I'm the, uh, the husband demographic, where my wife starts watching it, and then it's on in the background, and then, like, I get sucked into it too. And now we're, yeah. Yeah. Now it's over. It's finished. Okay. Yeah. So which team are you or your wife? Yeah, we were Team Conrad. And, you know, not to spoil it, well, you gotta watch the show, but you're just watching, you're just like, it's like a 12-hour version of The Notebook. Oh shit. I mean, look, congrats to Amazon for finally having a show that is in the pop culture sphere, right? Like, this is huge for them. Yeah. And that's actually a good point too, because usually the shows that have done really well on Amazon are, like, a big, strong dude with a gun. Like Reacher. Yeah. Reacher, The Terminal List. Yeah. I mean, Fallout does well, but that is also an action show. Yeah. Yeah. This is, yeah, that's a good point. It's probably the first one that's just a regular contemporary drama. Okay, I got a show recommendation for you. I still haven't seen Alien: Earth. But Alien: Earth is so good. Yeah, I hear great things.
Keeps getting better. Gotta watch Black Rabbit on Netflix. I tried to watch it last night. I fell asleep. Dude, Jason Bateman kills it on that show. All right, I'll give it another shot. Okay. I was really outta it yesterday. Did you say you fell asleep? I fell asleep. All right. Maybe don't watch it. All right. Let's talk about AI stuff. First up, we got another update from Google's AI program, mm-hmm, to embed themselves with Hollywood filmmakers. They have another short film, uh, I guess they produced it, co-produced it, uh, but it's part of their AI on Screen program. It's called Sweetwater, directed by and starring Michael Keaton. Mm-hmm. And written by his son, Sean Douglas. Yeah. Yes. And it also stars Kyra Sedgwick. Yes. This is a really interesting story. What's interesting about it for you? What I find interesting is that, you know, it's like how Fallout used the LED volume as an LED volume. Okay. There was no, like, fakery per se. It was an artificial screen, and it's meant to be that. In this example, AI is written into it as AI. So it's literally used. Uh, so his late mother's holographic AI version has this connection with Michael Keaton's current-day character. And, um, you know, instead of, like, shot replacement and replacing VFX and all of those things that we're concerned with, it just kind of plays into what AI will eventually be in the future, which is some sort of avatar for either late people or existing people. You're talking about, like, AI as a whole, like its cultural impact. Yeah. Like, I mean, nobody really knows what AI will be in five to 10 years. Except for Sam Altman, he seems to know everything. I don't know if that's the mentally healthiest thing for us to progress to, right, to be able to, um, speak with past loved ones. But that aside. Yeah.
That aside, I think the only way to recreate someone pretty convincingly is with AI, right? Like, if you can train a model on the likeness of a person's, mm-hmm, personality, character traits, all that stuff, then AI can replicate it to some extent. Yeah. I mean, that was literally a Black Mirror episode, right? Yeah. So they play that use of AI into this AI-powered film, right? Because he's speaking with, like, an AI holograph of his late mother. Yes. In the film, as part of the storyline of the film. How many times have I said AI just now? Sorry, folks. I'm glad it's not a drinking game. Right. That's interesting. The other part is, obviously, this is made with their AI film collaboration program. Yep. AI on Screen. The details of the actual production of the film are light right now. Basically they said they made the film, they premiered it, it's gonna do a festival run. Mm-hmm. And that was really it. We didn't really get any details of, is this all generated? Is this something like Ancestra? Ancestra, that's the film that was part live action and then part AI baby. Yeah. AI effects. And it was a very nice blend of AI effects and traditional effects. Yeah, it's a composite of an AI baby onto a live-action plate. Yeah. Which both of us were really impressed by. Like, it was well done. Yeah, it looked great. Yeah. And the bulk of it was live-action actors. Um, you know, this one I'm curious about. Like, in this one picture of the film, Sweetwater, I mean, they kind of look AI-y, like they look a little synthetic. Like you can tell it's Michael Keaton, but, uh, nah. No, you think it's real? No, that's real. That's just heavy compression, color grading. Yeah. Okay. So then, that aside, I'm curious what it is. Yeah.
What they used, I mean, I'm gonna guess they used the Google tools, like Veo 3, and or some crazy custom version of Veo 3 that we don't have, uh-huh, and they have Veo 4. Yes, exactly. No, it's really, um, I think a proven pathway for technology adoption in film and TV. When a new tech comes along, you generally don't wanna throw a lot of money into it. So a short film format with a, you know, few-million-dollar budget? Great. And then, um, it also goes into the film festival circuit, which in and of itself, if it's successful, has a pretty good runway into getting the film acquired and then eventually made into a feature. Yeah, sort of. I mean, I think, uh, a few million is a crazy number to say for a short film. Yeah. I mean, with Michael Keaton, right? I mean, uh, yeah, I still think that's a crazy number. Okay. I feel like, even for Google, a few million for a short film is a crazy number. And this is, I mean, look, it is a marketing thing. Like, you get a legit actor, a name, involved, he makes an AI short film. Yeah. It just, you know, helps legitimize the tool set that they're trying to get embedded into more Hollywood productions. Yeah. When we say, you know, AI short film, what sort of budgets are you thinking typically would make it successful? I mean, for this thing, I would still picture it to be in the traditional realm of, like, if you were making a short film with a decent budget. So a hundred thousand to 250,000 for, let's say, a 30-minute... 10-minute? Uh, 30, 10. Yeah. I mean, if it's 30, yeah, maybe a bit more. Yeah. It also depends. Obviously this depends, what, is it one location? Is it a set? Is it a complicated thing? You know, the idea of AI is supposed to be, well, it's supposed to reduce budget or make things more possible. So, you know, were they using it for exteriors? Mm-hmm. B-roll. Yeah. VFX. I don't know.
I feel like a hundred thousand is nothing, especially if you take physical production into consideration. Uh, yeah. I mean, if it was, like, a two-day shoot, then maybe. Oh, okay. That's small. I mean, I don't know. I don't know anything. Yeah, the details are light on this. We don't know how long it is. We don't know what the mom is, besides the AI thing. Just gut instinct, just having produced, uh, a short film? I'm talking making the film. I'm not talking about his, like, fees for being in it. No, no, no, of course. Yeah. I'm just talking pure production costs. Yeah. And, um, I think a couple million, easy. You think so? Oh yeah. Oh yeah. I mean, just to be comfortable. And these are the two different worlds we come from, I'd add. Yeah. A couple million bucks? I'd make a feature for that. Well, just renting, like, an ARRI ALEXA for a week is, what, $20,000 or something? You don't think they shot this on the new Nikon RED, uh, $2,200 camera? Then you're playing in a... you're not gonna win TIFF or, you know, Cannes with an iPhone shot. I get what you're saying. Yeah. Okay. Agree to disagree. Yeah, I mean, I still think a couple million would be crazy just to produce the film, excluding the marketing and the promotion they're gonna be doing for, yeah, pushing this film. Yeah. But like we're saying, we're speculating, 'cause, uh, I would love more details about what this actually entails. I'm not sure. It's very cryptic, just a Google blog release. It is just a Google blog post about the Q&A at their premiere screening in New York. Which also, I mean, I guess he lives in New York, but I was like, interesting, that's New York and not LA. The other one, the Darren Aronofsky film, was also New York-based. That's true. That's true. Yeah. I guess maybe Google, they were all there. Yeah, Google does have a giant office in Manhattan.
Yeah. I drive by it, on the water, right? Yeah, yeah. Another converted old industrial space, like the one they have here. I'm sure the lunch in there is great, so you never leave, you know. But speaking of budgets, though, we didn't get into details because I think we were outta town, but, um, there was that other OpenAI-backed animated feature, Critterz. Yes, yes. And that budget is like $30 million. So I have some notes on that movie. Okay. Maybe we can cover it in another episode. But it sounds like a ton of CG stuff went into it. I'm sure. I mean, to still have the $30 million budget, and seeing people be like, I thought you were supposed to make this stuff, like, cheap, with 30 million bucks? It was like, well, 30 million compared to 200 million, yeah, it is a lot cheaper. But yes, 30 million is still, uh, high. With the exception of Pixar, Disney, and DreamWorks, nobody's making hundred-million-plus animated movies anymore. Yeah. For the average animated film, as of 2020, which is when I last budgeted one, a $15 million budget will get you a really nice animated film. Mostly made in India. Over what timeframe? Uh, one and a half to two years. Right. And they're trying to do it, yeah, by Cannes next year. Oh, okay. Sure. Yeah. I mean, 'cause I think, uh, you know, Flow is a great example. And that was 4 million. Yes. But over, like, four or five years. Yeah. So the more you squeeze the timeline, the more the cost expands, 'cause you just need more people on. Yeah. Yeah. 30 still does feel high. Yeah, it does. And I'm curious, yeah, like, how much is just gonna be traditional CG, build it out, and where the AI lift is gonna come into play. You think they made the whole movie with AI, and that was, like, 10 million, then they had to delete the whole thing and make it in CG, and that was another 20 million?
Or just, when they made the first version, all of the models have improved so much, it's just like, we just gotta redo it all, 'cause everything we had before doesn't work anymore. Yeah, something's off about it. So I do wanna investigate a little bit more, and, yeah, I'm curious how the numbers break down. Yeah. But I mean, it could also just be because they're trying to do this very fast. Yeah. And also OpenAI is generally not a Hollywood player, so they're probably spinning up new infrastructure to do, yeah, production management. Yeah. Yeah. And I'm curious, 'cause, like, Sora has kind of not been in the conversation as much, uh, since all the other models have come out. So, like, are they trying to get back in the game? Or is it also like, they're producing it, ish, and maybe they'll use Sora, but they'll probably just use whatever tools are available. Uh, I saw a lot of DALL-E stuff, uh, in the articles. Like, I guess, a new version of DALL-E being used. Is it? I mean, I'm still wondering too, 'cause also, depending how the articles were written or researched, uh, if you have AI research stuff, it will bring up DALL-E. I've noticed that from trying to research OpenAI stuff. I mean, they haven't called a product DALL-E, uh-huh, I don't think, since, like, the beginning of the year. Yeah. It's just shifted to ChatGPT image. Yeah, and if you call up the APIs, or if you go into the other tools, it's ChatGPT image, not, like, DALL-E as a model name. Yeah. Maybe it's still internally referred to as that, but I feel like publicly it has not been called DALL-E for, like, six months. I have an OpenAI comment I'd like to make. Okay. Uh, I was, uh.
Watching a YouTuber, like an AI-oriented YouTuber, explain it way more elegantly than I will, uh-huh. Essentially, uh, large transformer models, like ChatGPT and, you know, Claude and all that stuff, um, they've hit, like, the upper ceiling of how intelligent they can be. ChatGPT-5 was, I think, trained on, like, 4 trillion parameters, uh-huh, and, uh, the, quote-unquote, intelligence gain was, like, diminishing returns. Right? Like, it was barely, yeah, it came out and it was like, okay, cool. Right. Nice. And so they've fundamentally hit a limit, where they bet big on scale. Mm-hmm. And now that bet is not paying off. So internally, within the AGI-ASI community, there's, like, a bit of a scramble. And it's not just OpenAI. It's, like, a scramble of what to do next, or how to train next. Because, I mean, I also imagine, like, 4 trillion parameters, that's, like, the Earth. That's all of the internet. Yeah. What else do you throw at these things? Exactly. So, um, now, you know, there's some Chinese models, I'm actually not sure if they're Chinese, but they use a distributed system architecture, and it's, like, uh, a few hundred million parameters per node. So imagine, like, a hierarchy of models working together, and that's supposedly more intelligent than ChatGPT-5. So it's not about scale. And if it's not about scale, then that puts Nvidia's big AI bet at risk. Right? Because they were betting that they would build city-size infrastructure for not 4 trillion, but for whatever the next number is. Yeah. Well, I don't think that'll slow down, 'cause I just saw something this morning, they announced that Nvidia's investing in OpenAI. Yeah. I saw a post, uh, that it's a hundred-gigawatt commitment. Like, they're.
Going all in. "Nvidia to invest $100 billion in OpenAI as AI data center competition intensifies," Reuters. Well, that doesn't stop them. Doesn't stop them. Yeah. And that is essentially the theory. They need to keep building bigger and bigger centers and more chips. They're not gonna stop. Yeah. Because, I think, the share price and all of the valuation depends on it, right? Mm-hmm. They're just gonna have to keep, yeah, like, making more chips. Yeah, exactly. But at the same time, the whole scramble... remember when we were laughing when Zuckerberg's, uh, poaching, like, backfired, and, mm-hmm, all the folks were just leaving? A lot of that had to do with the fact that, uh, Llama just couldn't hit any level of, uh, proper AGI metrics. Mm. No matter how hard they tried, no matter who they brought in or how much they trained. We're just hitting this, like, very upper limit in the research community. Um, I mean, it doesn't affect us day to day. I'm very happy with ChatGPT the way it is now. It's totally useful for certain things. But the companies are betting so big, and on such a long timeline, that if they don't show progress along that slope, then it just dismantles the entire business model. Yeah. And that is a big... well, I mean, that was also the fear, you know, when DeepSeek came out, like, beginning of the year. Yeah. And it was like, oh, you could do ChatGPT for a fraction of the cost, you don't need all these crazy data centers. Exactly. Didn't really seem to have an effect on how things were going. And there's also the theory that you're saying, where it's like, oh, you need the big models first to distill them. Yeah. I think where it will affect us is, uh, the entire Bay Area, Silicon Valley economics is entirely reliant on this AI thing working out and not being a bubble.
And if that bubble comes crashing, that's gonna have a huge impact, not just on the US economy, but the world economy as a whole. Yeah, I mean, look, I've heard about the bubble thing. I mean, and also the comparisons to, like, crypto and stuff. Sure, sure, sure. You know, this feels different. This feels like, oh, you can very tangibly see benefits that come out of this, and ways that things can change. Crypto was always, like, how does this make things better? Yeah. Besides pumping up a coin that you made up? I got a neighbor that's a crypto bro, got a brand new car. There are some things I get about crypto that do seem overall, like, positive, but there was a big lift. Yeah. But, um, AI, I mean, yeah, there are immediate benefits, but does it support the amount of data centers and chips that are being built? I don't know. I think for the near term, we don't have enough data centers and chips. Mm-hmm. Like, I think for the next few years, absolutely, the demand is higher than supply. So Nvidia will continue to see a rise, but then, I mean, eventually it'll plateau, or something will happen, I think. Like, underneath, the actual transformer architecture will have to be replaced by a new architecture. And I'm guessing that new one will be far more efficient, to where you won't even need trillions of parameters. Mm-hmm. And at that point, instead of, like, using a hundred megawatts of data center power, you need one, and then it'll significantly reduce... like, the supply-demand curve will completely flip on its head. The same way the film industry has, right? It's like, in the past, uh, we couldn't give the audiences enough film. Mm-hmm. And so there was such a giant economy around this limited number of films that were released. Fast forward to today, we have more entertainment than we know what to do with.
So it's flipped, and now each piece of entertainment has less value attached to it. Yeah. Yeah. I get what you're saying. So I think the same will apply for AI models, where you have to build a big model and it gets distilled or shrunk into a smaller, faster model. Yeah. Like, I mean, 'cause I'm thinking, even the training of the large language models, that's one thing, but also running the video generation stuff takes a lot of compute. And if you're also going with, like, Luma's vision, where, you know, it's personalized videos for everyone, and that partnership they have with, um, Humain, in, uh, Dubai or Saudi Arabia or wherever. Yes. Like, you know, they're building big data centers, and their kind of vision was like, yeah, everyone's just generating their own personal videos, and videos everywhere. And it's like, true, that'll take a lot of compute and a lot of power. Yeah. Until the models get distilled and shrunk and reduced. Like, you can even see with Veo 3 Fast, where that got so much smaller that Google's like, just generate it, like, unlimited Veo 3 Fast, you know, which came from Veo 3, the full model. So, just stepping outside of our M&E bubble, though. Mm-hmm. Like, we think, uh, image and video generation is a big chunk of AI usage. It's actually really not. Right. Like, I would say it's probably a few percent of the overall data center utilization. If you look at, like, finance and law and all the stuff that ChatGPT is doing really well, insurance and real estate. You think that's a bigger compute? Oh, that's probably, like, 90% of what Nvidia is spinning up, that kind of stuff for those industries. Yeah. Also, what about, not even just M&E specific, but, like, advertising? Like, image campaigns for e-commerce, like, a backpack. Like, it's big, but, uh, it pales in comparison to, like.
Just take, like, real estate. There are a million transactions happening today just around the US, probably, and each one of those million transactions is probably gonna have some type of ChatGPT API call. So you're looking at, like, millions of calls just for that industry alone. Add finance, add legal. Mm-hmm. Add, um, whatever, software co-development, all that stuff. Those, I think, are the big buckets of usage. And if you don't see general intelligence there, those markets will not use it. Yeah. I mean, I'm also thinking that that's forward-looking. Like, I'm wondering, what law firms and, uh, insurance agencies have, like, adopted AI at the edge of the workflow yet? I think, uh, as an enterprise, maybe not too many, but at an individual level, like, you and I use it all day long, and I have friends who are lawyers, they use it all the time. Yeah. I get what you're saying. Yeah. I just feel like one push of the video generation is gonna be, like, equivalent to my text ChatGPT use for the day. Oh, gotcha. Yeah. But overall, I can say we're building a lot more data centers. We'll probably need them in the near future. Eventually we won't need 'em as much, and then what happens? It's like when China builds those empty cities. Yeah. We can just convert the data centers into, uh, affordable housing. It'll be warm. Yeah. A big tangent. But, uh, let's come back to... well, this sort of ties into what we were talking about. So this company, Inception Point AI, built by former execs of Wondery, the podcast company. They're launching this new company, and their plan is to basically just churn out AI podcasts at scale. 5,000 podcasts, 3,000 episodes a week, $1 cost per episode. That was The Hollywood Reporter's headline on this. You know, when we spun up our podcast, this one, the first thing I was thinking is, does the world need another podcast? No.
The world needs 5,000 AI-generated podcasts. So, all right. I mean, the business plan here is they're gonna spin up a bunch of these podcasts, push 'em out, and, uh, here: Inception Point AI already has more than 5,000 shows across its Quiet Please podcast network, and produces more than 3,000 episodes a week. Collectively, the network has seen 10 million downloads since September 2023. It takes about an hour to create an episode, from coming up with the idea to getting it out in the world. And I think the play here is bust out a bunch of quantity, and then get hits with dynamic ad insertions and make some money that way. The fact that the entire storyline of the podcast and all of the content is AI-generated, that just really gives me the ick. Yeah. It's like, this is the AI slop thing. Yes. That we're, you know, trying to battle against, and, yeah, you know, dissuade fears of, just like, oh, you could just produce the equivalent of, like, podcast brain rot en masse. Absolutely. And this is the podcast equivalent of that Meta AI post with the artificial family that's posting about their vacation. Oh yeah. That, or the chatbot, like, talk to the stepmom or whatever. Like, that weird stuff. I think it's gimmicky. I think it'll probably have a short-term little spike, but in the long run, I don't see any of this working. Yeah. Right here: the company's able to produce each episode for a dollar or less, blah, blah, blah. This generally means that if about 20 people listened to that episode, the company made a profit on that episode. I see. So, right, it's a volume play. I mean, all it takes is 20 people. For us, legit podcasters, like, launching a podcast and building a podcast, putting it out in the world and, like, building up the audience, is a pain in the ass, and one of the hardest things to grow. Thank you for watching. Yeah.
Thank you for watching and subscribing, and the reviews. But it's a pain. Like, excluding YouTube, 'cause they have a much better model. But, like, Apple Podcasts and Spotify? Yeah. Pain to grow and develop there. And then if the market just gets flooded with all these other podcasts, with these AI automation podcasts, it just, like, drowns everything out. I think it just makes the noise floor bigger. But the noise floor is still a floor. Yeah. But it's like, how do you... okay, maybe you got a couple hits on this, but, like, are you building a brand? Are you building... It's a click-through-rate thing. Uh, yeah. It's like, what is the longevity here, if you're just like, let's just keep churning out stuff en masse? What's the long-term play? Also, I think the hyperscalers, like Azure or AWS, are probably, like, funding a lot of this, 'cause it's just helping them spin up their virtual machines. Yeah, I could see that. Here: the company produces different levels of podcasts. The lowest level involves weather reports for various geographic areas. I could get how that, I mean, it's like an SEO play. Uh, or simple biographies. And higher levels involve subject-area podcasts hosted by one of about 50 AI personalities they've created, including food expert Claire Delish, garden and nature expert Nigel Thistledown, and Oly Bennett, who covers offbeat sports. Did they prompt ChatGPT to have witty names? ChatGPT's like, why is there a whole company? There should just be one person just, like, spinning up AI agents to produce this stuff. I think, oh man, I think their overhead... I want the conspiracy weird guy. Something like Mr. Tinfoil. Yeah. Mr. Tinfoil. Yeah. Yeah. So this, yeah, like you said, this kinda gives me the ick. It's just, like, the AI model that's not the best. It also reminds me... you ever heard of, um, dead internet theory? No. Basically, it was.
After, you know, forums and other boards and everything started growing, to the point of the internet turning into, uh, today, where most of the interactions and comments and stuff on the internet are, like, bots. Bots commenting and bots interacting. Sure. And so the actual, like, comment ecosystem on the internet is not real people, and it's just sort of this, like, dead internet, where most of it is not real. Yeah. That is statistically true. It, like, feeds into the dead internet theory, where, just like, the bulk of action happening on the internet is not people to people, it's people to robots, or just robots to robots, going back and forth with each other. Which is concerning, because, um, future AI models will train on all this data. I always wondered that from the start. Like, WTF, man. Yeah, that was always my big question when they started training, like, the LLMs, and they were pulling stuff from the internet. And it's like, okay, you pull stuff from the internet to begin with, and then you start writing blog posts, and then you start putting the blog posts that were generated up there, right? And now you're just gonna keep training on that. Yeah. It's like, oh, our knowledge point is gonna stop at, like, 2020 for, like, original stuff. And then, yeah, everything after that, the new training is gonna be incorporating synthetic data, for sure. I mean, the lens that I sort of see this through every day: I work with image models. Mm-hmm. I work with AI. The minute you introduce synthetic AI images into a training set, it really messes it up. Yeah. Like, if you're not careful about it, you're throwing off the whole training. Yeah. I see the Reddit comments and stuff, where people are like, I got a dataset here, it's got synthetic stuff, and they're like, get outta here, that's crap. Yeah. No, seriously. It's gonna be crap.
And you could tell which AI models use a lot of synthetic data, and, uh, the output is quite synthetic. Mm-hmm. Because of it. So I would imagine the same thing would apply for LLMs and stuff. If it's reading through the comments of all the bots, it's gonna sound like a bot. Yeah. Oh, here we go. Okay. They are using agents here. The episodes themselves are built using AI, powered by 184 custom AI agents, who work with several large language models, including OpenAI, Perplexity, Claude, Gemini, and more, to build out the content. That's what I would've used. I mean, also, like, Google. Why not 185? 184 is very specific. Yeah. This also feels like something that Google would just, like, blow up in a second. Like, if they just turned NotebookLM into a product or a service, which I'm sure eventually, yeah, will happen. Like, they just turn it into an API, and then Riverside can grab that API and then turn it into a whole platform. I mean, I think Google would just do it all themselves, full stack. Yes. Uh, I mean, and the NotebookLM podcast, it's really good for learning information. Yeah. All right. Last one. Meta Ray-Ban glasses. So, the good and the bad. I mean, overall it sounds like pretty good. Overall, it sounds like they got Apple scared. Uh, like, everybody's kind of, um, this is the internet reaction that I'm seeing: wow, those are great, damn, Apple missed the boat on that one. I mean, or they're just like, well, like we said before, they're never first, they're just usually the best. Yeah. So, like, they've had the Meta Ray-Bans for a bit, you know, with the Ray-Ban eyewear and the camera inside. They're really good. But these are the first ones that have a screen built in, uh, that you can see inside the glasses. I believe it's a tiny projector that's projecting on the right eye only. Okay. Just one eye. Yeah. Yeah.
And it's small. It's like 800 by 800 pixels, I think. Yeah. It reminds me more of the, uh, Snap Spectacles, yes, that I tested at, uh, at AWE, that are way chunkier in the build. I think they're supposed to be a lot smaller next year, but, um, similar, where it's, like, a very kind of narrow-ish field of view. Mm-hmm. And then you kind of see alerts and information and stuff. It's chunkier, but it's also super light. No, I mean, these Ray-Bans look good. I mean, it's got, like, a little chunkiness in the arms for the battery? Yeah, that's all battery stuff. Yeah, the bridges or whatever. Um, I think it's only 130 grams or something like that, whereas, like, a real Ray-Ban Wayfarer is like 70 grams. Oh, okay. So it's, like, just twice as heavy, but it's got compute right on it, and batteries and all that stuff. Yeah. I think it was The Verge, or someone, who said that this is the first thing that felt like what Google Glass promised. Everybody was crapping on Google at first. Yeah. Well, they promised it, like, 10, 15 years ago. Yeah. This is, like, the first thing that actually delivers on that vision. Yeah. So, yeah, I mean, what do you think about uses and stuff for this? I think a daily personal assistant thing. Uh, just notifications and managing calendars. Um, if you're walking around the street, you know, directions, like, all those obvious things. Great. What most people don't think of glasses as is a listening device, and a personal one. Mm-hmm. Uh, the speakers are, like, right above your ear, and they're really good. Mm. Um, so I wore the first-gen Meta Ray-Bans, and I was just listening to some music, and then I asked, like, the guy was literally this far away, and I was like, can you hear any of it? Like, no. Oh, wow. Oh, that's cool. Yeah. So, like, they could replace your AirPods.
You know, whatever you wear your AirPods for, you could just wear a pair of glasses, and now you have audio and vision. It's got the built-in Meta AI chatbot. It can show you pictures and text answers to your questions. So it also feels like it has a bit of that smart AI element. Um, I think you can do translations too, like what they showed with the AirPods. That would be nice. But with this one you can see things and sort of interact with the world. Yeah, I mean, it feels like the best first step toward actually useful AR glasses. I have an interesting use case. Uh, you might wanna run this by Lightcraft. Okay. AI glasses vcam. Oh, so you kind of see, yeah, so you're crafting the shot, but instead of holding a device, you're just kind of doing this with your head. Sure. Doing a chicken head, like a director's viewfinder thing, where you're, yeah, you're like doing this with your glasses. You look like you're singing. Well, um, I thought you were saying you could see the camera feed or something in your glasses. Sure, that could be, because I remember Strada did that test when the Apple Vision Pro came out. Yeah. They rigged it up so that the AC could have the viewfinder in their Apple Vision Pro and pull focus using that. That's super useful, especially if it's a large set and the guy's way the hell out there. I mean, I don't think you would wanna pull focus on an 800 by 800 pixel screen. No, you pinch to zoom, man. You do one of these. The other thing: I don't think it can do, um, spatial mapping. Oh, no, right, because that was the other thing the Snap Spectacles could do. Yeah, they could identify surfaces and stuff, and click. Yeah, doing SLAM tracking, anchoring animation on a physical surface.
I don't think these can do that yet. I'm sure that's obviously planned. The other thing they announced, which didn't really get as much coverage, was a partnership with Oakley; they released smart Oakley sports glasses. Oh, I didn't know that. Okay. Yeah, display-free. Oh wait, I thought those had displays in them. Never mind, I thought those were the HUD thing. The other thing I would cover is they have this smart wristband thing that is essentially reading all your muscle contractions. Is that how you're then, yeah, figuring out control? Dude, that thing has been in research land for like 20 years. I remember in the early 2010s we were trying to use a lot of that for motion capture, 'cause we could just eliminate finger gloves and all that. Right, right. Just capture, and it was not quite there yet. That's also why, when Apple finally does release something like this, it's probably just gonna blow everything else out, because they already have the groundwork. 'Cause Apple... spoken like a true fanboy. They're gonna blow everything out, man. Are you sure, Joey? I mean, I'm not a hundred percent sure, but they have a pretty good shot, because they have the Apple Watch. The Apple Watch can already track pinching and gesture control. Yes. They'll have the glasses or whatever. They'll have the phone with the crazy processors built in, connected to it. Because a lot of this stuff only works with your phone nearby; the phone's doing the processing for the Meta glasses. Yeah. So look, Apple has the groundwork. So what would make you want to get one of these right now? Like, what would make Joey go spend $800 on this? Yeah, that's a good question.
I dunno. I think it would need to have some sort of, like, AR, like spatial mapping or something. Like, it would have to be something like... I don't need a screen in the corner of my face to see a text message. Like, I don't care. The weather? Yeah, I don't really care enough about that. It would need to have something where it could, you know, plop displays or graphics on the wall or on the table, or if I'm walking around the city, actually show arrows and map directions and stuff. It would need something to interact with the world. Would you use the cameras at all? Maybe. I mean, the camera thing has been kind of appealing for just, like, oh, it'd be cool to grab a quick shot of this or something. Mm-hmm. Like a POV kind of shot. I can't think of any professional use. Memories. Professional use, yeah. I mean, aside from virtual location scouting or something. I imagine a police officer would have great use for this. Like, instead of the body cam, or in addition to the body cam, they have this camera, and then, um, whatever, license plate information. Yeah, it could pull up information, like the stuff they would normally go back to their car's computer for. But you'd need heavy AI integration and all that included with it. Yeah, and you would need fast processing and data connections and stuff like that, and a battery. Or if you're working at, like, an Amazon warehouse, or any kind of big industrial facility, then you know you'd have relevant information pulled up. Mm-hmm. With vision and everything. I don't know.
Yeah, it's also a little tough for me too, 'cause I don't wear glasses. So beyond just the tech decision, it would be a new thing, a personal style slash, you know, changing my day-to-day wear to wear glasses. Yeah. I hate glasses. That's a good question. I mean, and these are heavy. Yeah. The pros would have to significantly outweigh the nuisance of wearing glasses every day. Yeah. I mean, the Apple Vision Pro, I've used it a few times, but that's such a different thing. That's something where you could theoretically be traveling, or bring up a bunch of screens and change the way you work a little bit. Yeah, as smart glasses are. Yeah. The other thing is a bit different: I've worn smartwatches for, I don't know, a long time now, since the gen-one Apple Watch, and now I can't live without 'em. Two main reasons. One is all of the watches that I used to have would drift in time over time. Yeah. So even a minute off is, you know, too much. Yeah. These things are obviously all synced. Everything's synced. Yeah. The second is I kept missing meetings 'cause I never would get the calendar invites and stuff. And now I get the meeting invites and all that. Yeah. So now I feel like I look like a professional person, and this thing's helping me do that. Yeah. So I think if those two things were put on the glasses, then I could maybe not wear the watch. Yeah. Uh, I mean, the fitness thing, I could see that being useful for glasses. Um, yeah, so I could see stuff without having to pull my phone out, or see metrics and stuff. Yeah, that would be useful. Okay. It is such a big lift where it feels like a very dedicated, full-time athlete kind of purchase. Yeah.
We're just like, eh, it's a nice to have. I mean, the tech companies want us to wear the watch, the AirPods, the glasses. Yeah. And then the last thing is they jack us in with Neuralink and we're just, ugh. Maybe that's the thing with The Matrix: what The Matrix got wrong was that the robots forced everyone to plug into a battery. Yeah. In reality, we're gonna voluntarily plug ourselves in. A hundred percent. Like, that world is way better, you know? Yeah, you're just gonna have more fun. It's gonna be more like the Ready Player One Oasis. Exactly. We're voluntarily gonna go there. Yep. All right. So yeah, that's the roundup for today. All right, we'll be back with the AI roundup later in the week. This week I have some shout-outs. Uh, Spotify's been getting some cool comments. Those of you who are listening or viewing on Spotify, thank you for doing that. Paul Tarpon, thank you for your comment. Ola Lee 92, thank you for your comment. And again, if you're new to this, that little rating: if you give us a five-star rating, it goes such a long way, so please consider doing that. And yeah, also shout out to Robert Flowers TV, who was a commenter on YouTube. Thanks for the action. Thank you. He commented about Comfy Cloud, so, oh yeah, I'm excited to see that coming too. Joey and I were just talking about having more ComfyUI episodes. So yeah, if you've got anything specific you wanna see, we will try to break it down. Links for everything we talked about at denoisedpodcast.com. Thanks for watching. We'll catch you at the roundup this Friday.
