Denoised

RenderCon, Higgsfield AI, and Vimeo Streaming

VP Land Season 4 Episode 24

Addy Ghani and Joey Daoud dive into RenderCon's standout moments, including OTOY's Star Trek Unification project that digitally recreates Nimoy and Shatner, analyze Higgsfield's impressive new AI video generation capabilities with its 49 pre-built camera moves, and examine Vimeo's strategic pivot with their new streaming service that allows creators to launch their own Netflix-style platforms.

In this episode of Denoised, we're gonna recap RenderCon, the new AI video model called Higgsfield, and Vimeo's new product to help creators launch their own Netflix. Let's get into it. All right, Addy. We're back at the round table in person. That's it. Good to see you again. Good to be back. As much as I love Riverside for remote recordings, I don't miss doing this podcast remotely. Yeah, I think we set a really important rule in the beginning that we'll do most of it in person. Mm-hmm. Now that you've moved to LA, this is really good for a natural back and forth, a natural analysis. Lemme just freeze here so I can pretend the, uh, connection is interrupting again. Did you get what I said? I was trying to catch up. Alright, lemme repeat that line. I think it's good for a natural interaction. Yeah. So much of the conversation the last few episodes was kind of hearing what you're saying, but just nodding as if I heard everything. Yeah, it's also really interesting because we talk about cloud tools, we talk about remote collaboration and decentralization. Riverside, I think, is the most popular podcast platform for remote stuff. Mm-hmm. I love Riverside. It's still not a hundred percent there. In their defense, too, a lot of it was that at NAB I was on my phone's hotspot, so it wasn't the best internet connection. Yes. Of the tools, they're the best. And the other thing is, even if the connection in real time is crappy, I know that it's recording locally, so I know the files are gonna be as good as they can be. Yeah. All things considered, it ended up being better than Zoom or Teams or something. Yeah. You know, there's still some stuff left to be desired, but that leaves room for a new player to come in and dominate. Maybe, maybe. Or it could be, uh, you know, holographic or spatial or something. Like, they did shoot that James Cameron interview that we talked about last week. That was a spatial video. Well, it's Meta, so I don't think they call it spatial video, but whatever their flavor of spatial video is. It was 4D, like a volumetric capture. It was a volumetric capture version of the interview, yeah. Okay. Which I did try to watch on my Quest 2 headset. Oh. And it made me kind of nauseous. James Cameron in full 360 volumetric. All right. So I attended RenderCon a few days ago, which, I was a little bit fuzzy on it after NAB. It was on my calendar and I was like, wait, what is this thing I signed up for? But yeah, it was put on by the Render Network. Can you gimme some background on the Render Network and OTOY? Yeah. I'd never heard of a RenderCon before this year. This seems to be the first in-person conference, so I think this is sort of a complement to something like GTC or even SIGGRAPH. Mm-hmm. It's very computer graphics oriented, a technical audience. A lot of the big graphics vendors are there. You know, NVIDIA's there, OTOY is there. And you even have graphics celebrities like Beeple. Uh, we all love Beeple. Yeah. And one of the guys from Corridor Crew was there. Yeah. Oh, awesome. Which one? Niko? Wren. Oh, Wren, okay. Yeah. Because I think the common thread was a lot of people who use the Render Network, which is their other product that's distributed rendering, like GPUs on demand when you've got a huge thing to render. Oh, okay. A good example is, uh, Alex Pearce of Sim-Plates. Yeah.
He gave a talk on virtual production, uh, because he's a user. When he needs to render out his 16K, crazy-large driving plates, Render Network is what he uses. When either he just doesn't have enough GPUs to do it himself, or it would take a crazy amount of time. You have a thing you need to render out? Send it, you know, spin up your GPUs, send it to the Render Network. I think it's way easier than setting up a render farm on AWS or something, where you still have to dial in the machines and the applications and everything. This is like a render farm on demand. Mm-hmm. For anybody. Yeah. That's pretty cool. Yeah. That's the background of the company, but the event itself, as you said, is kind of like GTC. Yeah. Or SIGGRAPH. It was at Nya. Wait, uh, is that a studio? Nya Studios. Small studio. Yeah. Yeah. In Hollywood. And you said that it was rather extravagant. I mean, I don't know, extravagant maybe is the wrong word. It was extremely well run and put together. Uh, very nice event. At some points I was wondering if there were more event people running it than actual attendees. There were just people everywhere. Even at breakfast, someone was walking around with a platter of little yogurt things, like appetizers on a platter. Nice. And baristas and bars and stuff everywhere. Just really nice. At one point there was a gift shop where you could kind of customize your own t-shirt. The main event space, super nice. And then there's a movie theater that was part of Nya Studios. Okay. A super nice theater with super plush, comfy chairs. Yeah. And they would serve you popcorn when you went in. It was a fun event. Really nice, very nicely put together. Yeah. I wish I would've been able to join you. I was on that shoot that whole week. Yeah. I didn't see much sunlight. But it seems like the people organizing RenderCon are well versed in organizing enterprise trade shows. Mm-hmm. So the funding was clearly there, you know, and I'm pretty sure we're gonna see one next year. Yeah. So RenderCon was put on by Jules Urbach, CEO of OTOY. So yeah, can you give us some background on OTOY? Yeah. OTOY is really well loved by the computer graphics and VFX community. You know, long before Unity and Unreal were really big players in our space in M&E, it was really the OG renderers: Octane, which is an OTOY product, V-Ray, Arnold. These were traditional path-tracing renderers, the stuff you hear about where one frame takes 10 hours to render. Mm-hmm. This is what OTOY has been doing for a long time. And it's a really high quality renderer used for final pixel. It competes with all of the other renderers in its space. And it's interesting to see they've diversified their business model with the Render Network. Mm-hmm. And there's still a very strong market for heavyweight offline rendering that takes hours and hours and hours. Yeah. And a lot of the talks, too, were about what the future is gonna be, especially with AI. It's rendering versus, yeah, generating. I think, uh, my speculation here is that all of the renderers know that the future is in neural rendering. Mm-hmm.
So using AI in some form or fashion to simulate a path trace, or create shaders on the fly, or compute the lighting calculations with AI, whatever gives you the most efficiency. The market will instantly absorb it. What I mean by that is, if the rendering is better and it takes half as much time, they'll just render twice as much. Like, put twice as much complexity in the scene, and so on. Yeah. That was a thing that came up too, where as the cards get more powerful and you get more RAM in them, it's not, oh, this is gonna be faster. It's, well, what else can I throw at this thing? That's it. And I don't think we're anywhere near the level of fully simulating reality. Right. Like, a basic water simulation or hair simulation can completely overload a render job. It could take it from one hour to a hundred hours or whatever. So there's a lot of progress to be made. Neural rendering, I think, is a really promising new method that I hope they're all exploring. Yeah. And that came up a lot. Another big common thread was modeling the real world and leading into, basically, the holodeck. The real-time Doom video game generation example was brought up, where it was a version of Doom generating in real time. So as you're moving through the world, on the fly, it's just generating. That's a very low-fi example, but take that to the future, mm-hmm, of being able to create a holodeck world. Yeah. Where you're kind of living in or experiencing a 3D world. Yeah. But it's generated in real time. Yeah. And whatever the story, or your will, or method, it kind of just starts to build it on the fly. Mm-hmm. That's really interesting. Uh, I heard the Star Trek folks were there. So yeah, Jules is a big Star Trek fan. Oh. And they had been doing work in the past with the Roddenberry Estate, digitizing some of the Enterprise models from the original movies and creating digital versions of them. And they created, I think, a full-scale 3D version of the Enterprise that you could navigate. Yeah. Like a one-to-one scale model, based on all the maps and stuff that they had, of what it would actually be like to explore it in a real virtual space. And they had a Vision Pro experience where you can kind of navigate through the Enterprise. That's cool. And then they did a Star Trek short film called Star Trek: Unification. Okay. And it was sort of a film to give an ending to Kirk and Spock. Kirk did have an on-screen death in Star Trek Generations, but Spock never really had a death scene in the canon of the movies and the TV show. I thought in the one with Chris Pine, the newer one, Leonard Nimoy's Spock does pass away. I think they're going off the original. Oh, okay. Okay. That timeline. Got it. Um, but it was a short film that was sort of a recap, closure for the Kirk and Spock story. Yeah. A goodbye for those two characters. The issue in making this is that William Shatner is a bit old, and Leonard Nimoy, who played Spock, has passed away. Yeah. And so they worked with the estates and Rod Roddenberry, who they've been collaborating with for a long time. Yeah. Who is Gene Roddenberry's son. Yep. And he kind of manages the Roddenberry Estate and Star Trek
and, you know, continues the legacy. They worked with all of them, got Shatner on board and the Nimoy estate on board, to basically shoot with other actors and deepfake, for lack of a better word, digitally alter, do face replacement. Yeah. With different-age versions of William Shatner as Kirk. Yeah. And Leonard Nimoy as Spock throughout the years, for this short film. So OTOY has built a solution that is essentially like what Metaphysic AI is doing. Yeah. They were able to build a real-time face-altering solution similar to what we've seen with Metaphysic, where, in real time, looking in the director's monitor, they're able to film with their actor, who physically and kind of performance-wise matches Kirk, yeah, but doesn't exactly look like Kirk, right, and replace the face on set so everyone can see what it's looking like. Wow. That's really interesting. You know, it just makes me think about all of the golden IPs from the last century. You have, obviously, Star Trek, for what, 60, 70 years now. James Bond going through an exchange of hands right now; Amazon MGM finally has it for full creative control. Mm-hmm. And Star Wars having kind of a full reset with Kathleen Kennedy leaving. Mm-hmm. So the question is, are they gonna put in a new cast, or are they gonna digitally swap out and keep the OG cast? Right. Or do you blend it, like we had in The Mandalorian with Luke coming back. Yeah. You know, young Luke from that timeline. Which, also, the face swap on that was questionable. And then there was that fan version that came out where the face swap was way better. It was, yeah, a lot better looking. It opens up a lot of possibilities, and it also opens up a lot of questions too. 'Cause you look at someone like Leonard Nimoy where, you know, he's passed away, so he didn't really have any say in this. It was up to his estate to determine if they wanted to go forward with it. Yeah. And then also just the process of creating the digital version of him. Yeah. Um, talk me through that. Yeah. I believe they have a lot more resources online; this is just some of the highlights from the keynote. But they built a clay version of his head based on existing images and models that they had. Mm-hmm. And then took the physical model of his head that they built, mm-hmm, and scanned that using the photo-scanning 360 rigs that I believe they developed for cases like this. For digitizing people. Yeah. Yeah. And so then they used that as the digital model that they were then able to face swap onto the actor. Yeah. There's a lot that goes into digital face replication, because once you have the model, that's just the first entry point. Then you have to retopo the face to have blend shapes. Mm-hmm. Then the blend shapes have to be driven by FACS, the Facial Action Coding System, and you also need rigs for secondary motion. So, bones within the face that jiggle the fat a little bit here and there, cause a wrinkle that otherwise wouldn't occur, but is exactly what Leonard Nimoy has on his face. Mm-hmm. So my guess is that portion is still very much a high-end VFX artist pass at building the facial rig, and then finally inputting that into an AI solver. Yeah.
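For a concrete picture of what that blend-shape setup means in practice, here's a toy Python sketch of the general idea: each FACS-style action unit stores a sculpted vertex offset from the neutral mesh, and a per-frame weight blends them together. This is a generic illustration with invented names and numbers, not OTOY's actual rig or solver.

```python
import numpy as np

def apply_blend_shapes(neutral, deltas, weights):
    """Blend a neutral face mesh with weighted per-action-unit vertex offsets."""
    mesh = neutral.copy()
    for unit, delta in deltas.items():
        mesh += weights.get(unit, 0.0) * delta  # classic linear blend-shape sum
    return mesh

# Hypothetical action units; a FACS performance solver would output these
# weights per frame of the actor's performance.
neutral = np.zeros((5000, 3))  # stand-in for a scanned head mesh (verts, xyz)
deltas = {
    "AU12_lip_corner_puller": np.random.randn(5000, 3) * 0.001,
    "AU01_inner_brow_raiser": np.random.randn(5000, 3) * 0.001,
}
frame_weights = {"AU12_lip_corner_puller": 0.7, "AU01_inner_brow_raiser": 0.2}
mesh = apply_blend_shapes(neutral, deltas, frame_weights)
```

The secondary-motion rigs mentioned above (fat jiggle, wrinkles) would layer on top of this basic weighted sum, which is why the artist pass is still so involved.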
Yeah, I'd be really curious what their pipeline is and how it differs from what we've been seeing with Metaphysic and, yeah, a few different companies developing their own pipelines that are getting to a similar end goal. I'm curious how theirs differs, how they do that. The other issue, or kind of topic, that came up a lot, specifically with this example, is using the likeness and digital versions of actors. You know, somewhere it's just, William Shatner has aged out of acting, but he's still on board with it. Yeah. The rights. Yes. You know, the actors having the rights. And Ari Emanuel was part of the keynote. He's an investor. The Ari Emanuel, super agent, also the inspiration for Entourage. Yes, Ari? Yes. And this is a thing that's coming up now. I believe he was an advisor and investor in OTOY; they've been collaborators for years. But yeah, with who he's representing as an agent, the actors, yeah, having and controlling the rights to their digital likeness, and also how to monetize that in the future. Yeah. Especially for, you know, your family, or for an estate like the Nimoy estate, where it's like, okay, he has passed, but his likeness and digital, whatever the package of those rights would be called, still has value. Yeah. It's a goldmine for somebody like Ari Emanuel or any agent, right? Because they can essentially create a never-ending flow of income for the estate if they do everything right. Mm-hmm. And then somebody like the Roddenberrys still continues to develop new Star Trek products. Mm-hmm. And then those things are developed and successful, and that drives more use of digital human rights for the agents. Uh, I think it's an interesting ecosystem to watch, especially because you think about where we are today in Hollywood, right? All of the big stars that we talk about are really the stars from the last decade, two decades, right? Your George Clooney, or Brad Pitt, Sandra Bullock. These are not stars of today. These are stars of yesterday that still have a massive audience and hold a big sort of prestige over their names. And the agencies are probably thinking, well, how do we keep this income revenue going, for both them and us? And so they're probably super interested in all of this work. Yeah, yeah, for sure. And I think from the actor's perspective too, there was another example of having to scan the same people multiple times. Yeah. Like re-scanning them for a new project. And I think they said The Rock was one example, where they've scanned him three or four times for different projects. And from his perspective, he's like, why do you have to keep re-scanning me? Like, you've done this before. Yes, of course. And yeah, so is it a matter of who owns the rights to those scans, or is it tied to the project? The technique changes. Yeah. Is there a baseline where the actor could go in and be like, I'm just gonna capture myself now, I own this, and then if I'm on a production and they need these scans, I can license it out? I don't know. I'm kind of curious, from a technical end. I think the scan quality has gotten a lot better over the last few years. You know, it's not something you could track like the resolution of the camera. Mm-hmm.
The camera resolution is probably more or less the same, but the actual solve, you know, how you take 2D images and create something 3D, mm-hmm, that has a lot of computational algorithms behind it, and maybe now there's a better version of that that is more accurate. 'Cause you gotta get down to a level of accuracy that's like your pores, right? You're talking sub-millimeter accuracy. And maybe that wasn't achievable five years ago, but it is now. Mm-hmm. And the other thing is, it's not just about the geometry, it's about the material. Our faces are essentially subsurface scattering models, shader models, and so you're capturing reflections. You're capturing light that is still left in the skin, and all of the little imperfections on our faces. So I don't think we'll see the end of The Rock being scanned over and over again. I think five years from now, The Rock will be scanned once again. And I imagine, too, if you're doing it for that project, you want them scanned to match the pipeline, but also what you were shooting in production with them. Yeah. 'Cause, you know, we change, we age. Yeah. So I could see one version where the actor would wanna own it: let me just capture me now, in my thirties, in my forties. Yeah. Like, I have this 20 years from now if we ever need to come back or do a younger version of me. Yeah. A 20-year-old model version of me that I can monetize. Yeah. You know, the future. The future. I just hope that The Rock has some sort of crazy backup hard drive where he's storing all this. I'm just kidding. He probably has no idea. No, but it's somewhere with his team at, uh, what, Seven Bucks? Whatever his production company's called. Yes. Yes. Uh, also, you know, Fast and Furious is probably an IP that's just gonna keep going, beyond the lifetimes of Vin Diesel and The Rock. That was one of the early cases, too, when this whole thing came up, with the early passing of Paul Walker. Yeah. And then there's another branch, Hobbs and Shaw, that NBCU has. Mm-hmm. I mean, as The Rock sort of ages out, they're gonna continue to keep putting him in the movies, I'm guessing, even though new actors will come in and take on the primary roles. And what if they need this time capsule video of The Rock when he was 25 years old? Yeah. And Star Wars is another good example of this. Yeah. They keep making new series and bounce around the whole timeline, and it's like, oh, we wanna bring back a character from A New Hope, a 50-year-old film, and the actor's not around anymore. It just gives the writers so many more options if they wanna bring a character back. Yeah. Just because the actual actor passes away or ages out doesn't mean the character has to disappear. Yeah. But at the same time, the antidote, I would say the counterargument, is never underestimate the power of good artistry. Mm. An artist can completely make The Rock from scratch at any age. It's just a matter of time and money. Mm-hmm. Yeah. Yeah, for sure. And that's another one of those clips that keeps going around the internet: Pirates of the Caribbean 2 with Davy Jones. Yeah. With the, um, face. Yeah.
And the octopus, the squid face, and the detail and stuff. It's like, why does CGI not look like this anymore? It's like, well, that's because they had a lot of time and money. Yes, exactly. And stuff now is a lot more rushed and budgets get cut. Yeah. Yeah. I mean, RenderCon, overall, just a great event. These were just two of the talks, but it was talks all day. Yeah. Presentations all day. So yeah, I thought it was an excellent event. Hopefully they do it next year. Also, one side thing: because of the Star Trek project, they had a whole separate Star Trek museum exhibition. So they had a bunch of the costumes that they used from the film and from other Star Trek projects. Mm-hmm. So as a Star Trek fan, it was fun to, ah man, to see that. I love Star Trek. There were a good handful of people there, too, who seemed to have known that it was gonna be a very Star Trek focused conference, so they came fully dressed up in various Star Trek eras. Really? Yeah. Uh, my favorite Star Trek was J.J. Abrams' Star Trek. I don't know, is that blasphemous? Is that bad to say? Are we gonna lose some viewers over this? No. Come on, man. It's gotta be the original series. Okay. Or Next Generation. All right. Next up, Higgsfield. Yes. Higgsfield is a new video generation model. We see one every week, right? So what's so special, what's so cool about this one? Well, this one has the backing of Alex Mashrabov, mm-hmm, who was previously the AI lead at Snap. He sold a company, AI Factory, for 166 million, uh, back in the day. And now, because he has that leadership and the technical capability, he built a new video generation model from scratch that is truly built for, mm-hmm, cinematic control and cinema. We talked about Asteria and Moonvalley a little bit, how that model was purpose-built for our industry to have the highest quality output. Higgsfield feels like a competitor, not so much to a Luma or, you know, a Pika, but rather to Asteria and some of those quote-unquote cinematic models. Yeah. I haven't had a chance to mess around with it too much yet, but the stuff I've been seeing online, the clips look very good. Very realistic physics. Yeah, very realistic looking. And that was what they said their differentiator was: they trained their model more on cinematic lighting and camera movements. It has 49 different prebuilt camera shots built into it to kind of queue up. Amazing. And underneath, the basic architecture is really interesting, because they're calling it a diffusion transformer architecture, and those are two completely different things. Transformers are typically associated with LLMs, like ChatGPT. Uh-huh. Diffusion models are associated with image generation, like Stable Diffusion. Right? So they're combining those two things in a hybrid architecture. I would love to see a little bit of a white paper on some of the under-the-hood stuff. Yeah. Maybe we'll break that down in the future. I'm curious, yeah, what that unlocks and how that changes how this is able to perform better. My guess is, when you combine it into a hybrid architecture, purely speculating here, the prompting will give you a much more accurate result, because it's so tied into LLM transformers underneath that it's better able to understand the direction and the language. Yeah.
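As a rough illustration of what a "diffusion transformer" hybrid means, here's a minimal PyTorch-style sketch of one block: transformer self-attention over noisy video-latent patches, plus cross-attention into prompt embeddings from a language model. This is speculative, generic DiT-style pseudocode; Higgsfield hasn't published their architecture, and every name and size here is invented.

```python
import torch
import torch.nn as nn

class DiTBlock(nn.Module):
    """One generic diffusion-transformer block (illustrative only)."""
    def __init__(self, dim=512, heads=8):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.cross = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm3 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, x, prompt, t_emb):
        x = x + t_emb.unsqueeze(1)  # condition on the diffusion timestep
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]  # patches attend to each other
        h = self.norm2(x)
        # Cross-attention is where the "transformer" half conditions the
        # "diffusion" half on language: the denoiser attends to the prompt.
        x = x + self.cross(h, prompt, prompt, need_weights=False)[0]
        return x + self.mlp(self.norm3(x))

# Toy usage: 8 frames x 16 patches of a video latent, 12-token prompt.
block = DiTBlock()
latents = torch.randn(1, 8 * 16, 512)
prompt = torch.randn(1, 12, 512)
t_emb = torch.randn(1, 512)
out = block(latents, prompt, t_emb)  # same shape as latents
```

The intuition from the conversation maps onto that cross-attention step: the denoising half of the model gets a direct, token-level view of the prompt, which is one plausible reason prompt adherence would improve.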
Yeah. I believe in the training process they had the clips analyzed by Google's video analysis, okay, to tag all the clips that they then fed into their training model. Okay. I don't know if that's better. Yeah. There isn't as much transparency about how the other models are trained. 'Cause you have to analyze a bunch of footage and tag all the footage, to say, this is what this is. Yeah. And then train it. I don't know if that unlocked better control, by training it on different language around lighting quality, color quality, shot design. Yeah. Stuff like that. I'm not sure. But we're both seeing a lot of stuff out of Higgsfield online. Yeah. And it's not anything you should be ignoring. It's good stuff, from what I've been hearing. Yeah. A lot of people have been switching to it. So the thing that came out recently, like yesterday, that caught my attention, that I thought was interesting, is what they're calling Pulse. Pre-programmed camera moves have been in a bunch of different AI models now. Higgsfield had the 49 different camera moves that we talked about. Yeah. But Runway has camera moves. Ray, uh, Luma AI, has camera moves, you know, orbit or pan or zoom, moves that it understands. Stability has camera moves. Yeah. Virtual camera. Yeah. Yeah. What they came out with with Pulse is more about human movement. Mm-hmm. So it's no more, like, handheld; it's, what is the subject doing? Understanding that movement, and it's trained on that movement. So, for example, baseball: hitting a ball with a bat. Yeah. It has that movement already programmed in. So if you give it an image of someone with a bat and you want them to hit the ball, which can sometimes fail if it doesn't understand the physics of how that's supposed to play out, yeah, now it's specifically trained on that movement. And you can call it up: this is what that movement's supposed to be. Okay. You get a better output. Oh, that's awesome. There are some other examples: skateboard glide, skateboard ollie, ski carving, ski powder, basketball dunk. Things where we as humans know what this should be, but if you give it an image, like in older AI models, you give it an image of someone with a basketball approaching the basket, it doesn't always know it's supposed to, you know, complete the movement and dunk it in the basket. Yeah. And also just holding the basketball with fingers; object interaction, mm-hmm, is still lacking. So this is really interesting. I think it's a big deal; this is the first one I've seen where it has training specifically for human movement. Yeah. We've seen a lot of the camera movement. Yeah. But now to be like, oh, this is how this should interact, the physics of this. I wonder if, under the hood, they're running some kind of motion capture model, or a character rig model, where it's simulating an actual skeleton, or they're just able to give it, maybe based on better tagging data
from their training, all of the snowboarding footage, and say, this is what a snowboarder should do. Yeah. And that goes back to being a transformer and not just a, mm-hmm, diffusion model. 'Cause transformers can correlate between action and image much, much better, I think. Mm-hmm. Yeah. I'll have to test Higgsfield. But one issue I had last year that no model could do was a finger pressing a button. I could not get a finger to press a button. And I tried, like, every AI model. This was six months, seven months ago. Yeah. I had a big shot of a red button, and I'm like, I just want a finger to press it, like slam that thing down. And it would rub it, it would caress it, it would do everything but actually press the button. I tried Runway. I tried Pika. I tried every single model. Could not do it. Yeah. I need to do an updated test on the button press. Uh, and I posted about this, and some other people were like, yeah, the button press, they won't do it. They don't know what that means. That's great. That's like the ultimate AI test to see if it's real or not. Um, so I'm gonna have to do an update on this test with the newer models and see if that got fixed. Actually, I'm gonna test it on, we talked about this a few weeks ago, but Veo 2, the good Google model that was sort of in a closed beta. Now you can access it in Google AI Studio if you have a Gemini subscription, I believe. Yeah. Which, if you're on Google Workspace, I think they forced everyone to have a Gemini subscription. Right. I'm on it. Yeah. I messed around with it yesterday. Yeah. And I will say, I gave it a shot that I had tested other AI models with, and it was a person holding a piece of paper, and the paper had text on it. And that's always kind of problematic, yes, with the AI, because it's not very good at maintaining details. Yes. And it'll jumble the text or blur it and make it totally unreadable. Veo 2 kept pretty much everything sharp. Nice. Really? Um, yeah. And that's where ChatGPT-4o image generation excels: text. Mm-hmm. Text has been historically hard. Pushing buttons and text, man. Yeah. Yeah. So Veo 2 is more available now. You can mess with it. So I'm gonna have to do a Veo 2 versus Higgsfield button press showdown. Alright, maybe you can make a YouTube show just on that. Yeah, push the button. Push the button, with some Daft Punk going on. All right. And then the last story. Uh, so Vimeo announced a new service where they'll basically be the backend to help you launch your own Netflix, basically your own streaming service. Yeah. Uh, I think it's really cool on paper. As far as the actual use case and how many customers actually go for this, it'll be interesting to see. But in theory, if you're a big enough content creator that's created your own universe, maybe not Mr. Beast as an example, but someone a little bit smaller, and you have a hundred thousand or a million viewers, technically you can create your own application on, you know, a Roku or whatever, and then have Vimeo's infrastructure host that entire thing.
So, from a navigation menu to individual playlists and everything, outside of YouTube, outside of Vimeo. When I first saw this, it was a bit confusing, because this doesn't feel new for Vimeo. This one's called Vimeo Streaming, but they had a product before called Vimeo OTT. What's OTT? Uh, over-the-top, which is the industry term for, yeah, streaming apps and platforms. Yeah. So yeah, I was a bit head-scratching, like, are you just rebranding to make this clearer? But the same issues from Vimeo OTT apply here. Basically, Vimeo's gonna be the backend to host your videos and do all of the app-building stuff, you know, all the hard stuff. Yeah. Where you could either hire a developer to build this from scratch, expensive, or they've already built the bones for this. Yeah. And, I mean, all the platforms are pretty much the same. You have playlists, you have a home screen, you have your banner art. You just need to upload your stuff and post the videos. The part Vimeo does here that's the hardest is the CDN, the Customer Delivery Network. Mm-hmm. Uh, Content Delivery Network, rather. And everybody major in this game has a very robust CDN. YouTube has their own, Netflix has their own, Verizon has their own, yeah, Vimeo has their own. So you're just leveraging this powerful CDN for your own version of Netflix. Yeah. You let Vimeo do all the backend, all the complicated stuff. Yeah. Network routing, cloud storage, play-on-demand resolution based on quality of service, all that stuff that's difficult. Yeah. But the thing where I get stuck is, how many creators does this make sense for? So basically, you'd have to be a big enough creator to have a big enough audience, correct, to get, you know, one to two percent, yeah, that would convert to a paid subscription, yeah, that would want this extra material. Right. And it would have to be worth it for you to pay whatever. I don't know what the pricing is, but I remember with Vimeo OTT, I think the very base level was at least 500 to a thousand dollars a month. Yeah. Plus probably the two to five to ten thousand dollars initial setup. So you have to have enough money to invest in it, and enough of an audience that converts to pay 10 bucks a month, six bucks a month, whatever that is. Yeah. You know, I think the ones where it maybe makes sense are the lifestyle ones, like Yoga with Adriene. I think if she wasn't on Vimeo, she was a marquee person on one of these other companies that does a similar service. You know, daily yoga videos. Yeah. And workout videos. That makes sense. I think another example is maybe it's not an individual creator the way we're thinking, but a collective. Yeah. You know, that's a good idea. Yeah. The example I can think of is there's a collective of educational, really sharp, really cool content called Nebula. Mm-hmm. Yeah. They're on YouTube. Were. But then you have CuriosityStream, which is the same people that started Discovery Channel, maybe the original Discovery Channel. I think, yeah, it feels like when Discovery was educational. Absolutely. Yeah, like TLC and all those. It's good for kids to watch and learn about space, and, uh-huh, you know, even somebody like me. I love learning about World War II. Lots of great content.
So Nebula probably built their own hosting platform from the ground up. They've been around for a while. But the next Nebula can just leverage Vimeo Streaming. Yeah, you would just have to pool together 10 or 15 high-end YouTubers, and, you know, that's actually a pretty good idea. That's the way a lot of newsletters have gone, where it's a collective of five to ten online writers who have pretty decent audiences, yeah, but maybe not big enough that they can individually charge five, ten bucks a month for the content. But pool all of them together and make it a package deal, so if someone subscribes to the company, yeah, they have the option to subscribe to all the individual newsletters; it pools together more of these resources. Yeah. So it's a more valuable package to sell. Yeah. Then you can justify the six-dollars-a-month deal. Like, I'm getting ten different creators, not just one creator.
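To put rough numbers on the conversion math being described, here's a quick back-of-the-envelope calculation in Python. Every figure is illustrative, not actual Vimeo pricing or real creator data; the "one to two percent" rate and the price points are just the ones floated in the conversation.

```python
# Back-of-the-envelope math for the pooled-creator streaming case above.
audience = 500_000        # assumed total followers across pooled creators
conversion = 0.015        # the "one to two percent" who might pay
price_per_month = 6.00    # subscription price floated above
platform_fee = 1_000      # assumed monthly platform cost (base tier)

subscribers = audience * conversion
gross = subscribers * price_per_month
print(f"{subscribers:,.0f} subs -> ${gross:,.0f}/mo gross, "
      f"${gross - platform_fee:,.0f}/mo after the platform fee")
# 7,500 subs -> $45,000/mo gross, $44,000/mo after the platform fee
```

Under those assumptions the platform fee is trivial; the hard part is assembling an audience big enough that one or two percent of it is a real business.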
Uh, I think another good use case for this is a lot of the smaller podcast networks. Mm-hmm. You know, you can have six comedians with their own podcasts, pull it all together, and host it as, like, a comedy podcast app. A little app. Yeah. And maybe between everyone, you have enough for a weekly special, where it's not one person having to make something original every week, but amongst everyone, it's something original every month or every other month. Yeah. The issue with Vimeo has always been that they are the sort of very distant second to YouTube, right? I don't think they're even that anymore. Yeah. I mean, I think they gave up on that race a while ago. Yeah. I remember in college, you know, when Vimeo video streaming was a thing. Yeah. You'd make your short film, yeah, and it was like, do I put my short film on YouTube, this weird cat-video place, or do I put it on Vimeo, where the real artists and cinema people post their stuff? Yeah, they have the staff picks and stuff, which still exist. And I still think Vimeo's player is much nicer quality. Yeah. But yeah, they lost the online video streaming, the discovery game. Yeah. You know, I mean, now they're a backend. I think a huge part of their business is the enterprise customers, being the backend video platform for huge corporate customers. Yeah. It's more of a B2B business now. Yeah, I mean, I use them for hosting videos on my website 'cause I want a nice clean player. Yeah. As far as video discovery, I mean, even excluding short-form video platforms, even Rumble probably has bigger public video streaming numbers than Vimeo. You know, I think, uh, on this show, we cover a lot of creator economy building blocks. So this is one of the other things that's gonna be cool to watch, because we're looking at Hollywood being sort of dismantled and then rebuilt in a new package, a new format, if you will. And looking at all the little pieces of technology that go into building Hollywood today, and what Hollywood, or sort of the next generation of content, will look like in the future. So Vimeo plays into everything that we've covered on the show. Yeah. And I will say the one thing that's got some good momentum, yeah, is the stat that more people are watching YouTube on their TV. Yes. And so that is a good tailwind to kind of accelerate Vimeo Streaming, 'cause it's like, okay, great, now that's a good spot where content creators can have their content easily watchable on a TV. Yeah. And the other big, crazy fact that blows my mind: Netflix is a giant in our industry, right? Mm-hmm. They're obviously making a lot of revenue, and they claim they still don't spend enough on content; there's still bandwidth for Netflix to spend more money and make more content. That's how crazy this company is, how ambitious it is. Even having said that, YouTube surpassed Netflix in revenue last year, so YouTube is just bigger than Netflix. Yeah. YouTube also has a massive ad business, which Netflix is trying to build up, but yes. Yeah. It just goes to show where the eyeballs really are right now. But also, in the YouTube money camp, YouTube does divvy out a ton of their revenue to creators. Yes. Whereas Netflix, once they buy the stuff, they own the stuff. Yeah. And that also kind of helps proliferate more creators coming online, making more content, and it just feeds the system. Yeah. Of all the publicly accessible platforms like that, YouTube is the best at divvying up the money. Yeah. The ad revenue from YouTube can be substantial, yeah, with a big enough audience. Which kind of plays into, why bother with Vimeo Streaming and building your own application and your own ecosystem if YouTube is just that good? They even sort of have subscription-ish things on YouTube. But yeah, I think it only works if you're at a certain point where you have a big enough audience and you can get, like, one to two percent of them to convert; then having a streaming thing makes sense. 'Cause otherwise it's like, well, why not just do a Patreon or a Beehiiv subscription or something else? Yeah. That's why I'm like, it's a good idea, but it's such a narrow band of creators that fit that bucket, an Apple TV streaming app with enough people to pay. Yeah, that fit that Venn diagram. Oh, I got a good use case for Vimeo. Okay. You're gonna love this. All right, let's go. Okay. Gas station TV. Or hotel TV. So if you pull up to a gas station, that thing that's playing on top of the pump, whatever garbage service that is, that could be hosted on Vimeo Streaming, and you could just curate a bunch of mindless content on it, just 30-second loops. That's more like TikTok. Yeah. But if you license enough content, then you could resell it to gas station networks; you can make some serious money that way. The other thing: every hotel lobby that I walk into, well, not the high-end hotels, but a Holiday Inn or, uh, whatever, a DoubleTree, there's always a TV just playing that same gas-station-type content over and over again. And I think if you were an at-scale supplier of that content, you could use infrastructure like this to do it. I think there is a company that just targets doctors' waiting rooms, and it's a TV streaming feed specifically for the waiting room, and then they sell ad space and other content specifically for that type of use case. How about when you get into an Uber?
In the back of the Uber, there's an iPad playing that stuff. There's all types of ways to get eyeballs on content. I don't know if that's the kind of eyeballs that YouTube content creators want for building a long-lasting, sustaining audience, unless maybe it's Mr. Beast and he's trying to sell Feastables bars. But this is not the creator economy. This is, uh, the taxi cab economy. Yes. This is somebody that just wants to come into the game, resell content, and be the middleman. All right. This sounds like a good place to wrap it. Next time you're at a gas station, you'll think of us, I'm sure. Alright, well, thanks for watching again, everyone. Uh, links to everything we talked about are available at denopodcast.com, and thank you for your support. Uh, Joey, we got a lot of love at NAB. We really appreciate that. It means a lot to us. Thank you. Yeah. Thanks for all the podcast love. Thanks for watching, everyone. We'll catch you in the next episode.