The Tech Strategy Podcast

3 Lessons from Alibaba Cloud’s AI and Big Data Summit (196)

December 22, 2023 Jeffrey Towson Season 1 Episode 196

This week’s podcast is a deep dive into Alibaba Cloud, a leading provider of cloud and intelligence services. They recently had an AI and Big Data Summit in Singapore.

You can listen to this podcast here, which has the slides and graphics mentioned. Also available at iTunes and Google Podcasts.

Here is the link to TechMoat Consulting.

———–

I write, speak and consult about how to win (and not lose) in digital strategy and transformation.

I am the founder of TechMoat Consulting, a boutique consulting firm that helps retailers, brands, and technology companies exploit digital change to grow faster, innovate better and build digital moats. Get in touch here.

My book series Moats and Marathons is a one-of-a-kind framework for building and measuring competitive advantages in digital businesses.

This content (articles, podcasts, website info) is not investment, legal or tax advice. The information and opinions from me and any guests may be incorrect. The numbers and information may be wrong. The views expressed may no longer be relevant or accurate. Investing is risky. Do your own research.

Support the Show.

Welcome. Welcome, everybody. My name is Jeff Towson, and this is the Tech Strategy Podcast from Techmoat Consulting. And the topic for today: three lessons from Alibaba Cloud's AI and Big Data Summit. Now, Alibaba Cloud, I haven't talked that much about them. Really important company. I mean, right up there in the top three to five cloud companies in the world, depending on which service you look at.

 

Infrastructure as a service, they're very big, very big in China and Asia. But also very well positioned for generative AI, for the new AI tech stack. When most companies start to implement generative AI, they generally start by using their cloud service provider's tools. And then maybe they build their own stuff over time.

 

That's usually step one. So I've been keeping a close eye on them and other cloud companies. And they had their AI and Big Data Summit a week or two ago, which was in Singapore. I watched the live stream, and there was some important stuff. It's a good way to figure out what's actually happening on the ground.

 

So I thought I'd talk about my takeaways from that and how to think about the AI tech stack, which is a pretty big deal. Early days, but super important. So that will be the topic for today. Housekeeping stuff: yep, we have the Beijing tech tour, which I've mentioned before, coming up May 22nd to May 26th, only in Beijing.

 

This is our, let's call it, economy version of our full tech tour, which is usually multiple cities over five, six, seven days. This one is more a couple of days, one city. We hit four-plus companies in one city and have some fun. That's going to be in May. If you're interested in that, we have some spots open, and you can go over to TechmoatConsulting.com and see it there, and the prices are there as well, which I won't mention here. But it's a pretty good deal. As I've mentioned before, we're doing this, I think, one time, this sort of lighter version of our tour. So that's kind of it. If you're curious, go over to Techmoat Consulting or send us a note at info@towsongroup.com.

 

Okay, standard disclaimer: nothing in this podcast or my writing or website is investment advice. The numbers and information from me and any guests may be incorrect. The views and opinions expressed may no longer be relevant or accurate. Overall, investing is risky. This is not investment, legal, or tax advice.

 

Do your own research. And with that, let's get into the topic. Now, no real tech news for this week. The only thing that got my attention was Google has released Gemini. Really, they just upgraded and renamed Bard. But yeah, I mean, they're going after generative AI in a big way, obviously. And, you know,

 

they should be winning this because they have what nobody else has, which is access to all the world's information, because they're Google. So you would think they would be just rocking it in generative AI, but they're really not. Well, we'll see. There's a difference between who gets there first and who wins.

 

So yeah, anyways, Gemini is available. It's pretty impressive; I've been starting to play around with it. I didn't think Bard was awesome; I didn't use it very much. But anyways, that's kind of the news for the week. Okay, let's get into the topic now. Alibaba Cloud, it's a really important company.

 

I mean, when you look at Alibaba, and I know a lot of you follow Alibaba, it's one of their big growth engines. After domestic e-commerce, number two is cloud, right? But Alibaba actually had their earnings about a week ago, and cloud growth has not been huge. In terms of revenue, it's still in the $3 to 4 billion range.

 

It's not been a huge business. It's a good business, but it's not significantly bigger than, let's say, international commerce or some of their others. And it's not showing the growth we see in something like international commerce; AliExpress had a really good quarter. I mean, they really kind of rocked their numbers.

 

So I think the international division of Alibaba is the one that I suspect will go public first, because it's showing pretty good numbers. Okay, but Alibaba Cloud is the one that people pay a lot of attention to. They had a management change not too long ago. It was Daniel Zhang; when he stepped down as group CEO, he took over as cloud CEO. But then he stepped out quite quickly a couple months ago, which was never really explained as far as I can tell, and now Eddie Wu is running it. So Eddie Wu, who's

 

the group CEO, and Joe Tsai is the group chairman. Well, Eddie Wu also became cloud CEO, and he has recently become e-commerce CEO. So the major businesses, it's that dude. And Eddie Wu, if you're not familiar, was CTO of Alibaba for a long time. He was one of the founding members. So he and Joe Tsai

 

are kind of the old founders that are back in charge, which is interesting. Okay. So Alibaba Cloud, very, very important. Established 2009. It's Amazon Web Services for China, right? It's always been the sort of digital tech, and now they're calling it the digital technology and intelligence

 

backbone of Alibaba Group. That's their language. So you'll notice they've added the word intelligence all over the place in the last six months. But okay, big suite of cloud services. The standard cloud services that most of these companies start with are what? Computing, storage, networking, security.

 

Those tend to be the core services for most cloud providers, AWS, Azure. And then on top of that, they start to build other things like databases, applications, management, other things. And you'll see that the AI tech stack pretty much fits on this quite nicely. Although the difference, which is important, is it's turning out you really need an alternative tech stack all the way down to compute.

 

You can't just take the servers that have been running the cloud services and start using them for AI. No, you need different servers. It turns out you need different chips, which, you know, OpenAI's Sam Altman made headlines this week by saying he wants to build entirely new semiconductors for AI, and he wants

 

$7 trillion to invest, which is insane. And maybe he's being a little cheeky. But the point is, it looks like you need different chips. You definitely need different servers. On top of that, you need different databases, which means you need different types of connectivity, because so much data is flowing and being processed, because AI is very compute intensive and data intensive.

 

So, really, all the way down, you need a different tech stack. And that's what you'll see Alibaba Cloud leading with this year: new services for computing, new services for data processing, especially for doing inference. That's what they're talking about a lot. Okay, and other facts.

 

Let's see, Alibaba. You can break it into infrastructure as a service, platform as a service, model as a service, which is what they do. When you look at infrastructure as a service, Alibaba's in the top three, depending, you know, by revenue, by usage globally. So right up there against AWS and Azure, and usually number one in China.

 

Okay. So it's a big deal. Let me go through my takeaways from the conference, or the summit, I guess they're calling it. I really have three. Number one: efficient compute and efficient data processing are the big bottleneck right now. We've heard this from other companies, like when I was dealing with Huawei Cloud.

 

They pretty much said the same thing: the bottleneck has been the computing power. Everybody's trying to get these chips, mostly from Nvidia. People are designing separate chips; Sam Altman has made investments in, you know, new types of chips. They call them TPUs, as opposed to CPUs and GPUs.

 

So there's a lot going on in terms of just getting the computing chips and the scale necessary, and doing it in a way that's efficient, that's not crazy. Well, how do you do that? You use a cloud service rather than building the servers yourself. It's more efficient to have a lot of distributed servers running, and you can scale up and down.

 

So in terms of the core offering of a cloud service: computational capacity that works for AI, that can be done at large scale because AI requires a lot, but can also be done efficiently. That's a lot of what Alibaba has been talking about this year, and we hear this from other companies.

 

So you'll hear these words a lot, like computing capacity, and you'll hear efficiency, efficiency, efficiency. So scalability: how do you use GPUs, and now TPUs, efficiently? How do you network at high speed? Because the data has to come in and out of the servers quickly. And usually when people are talking about this, they're talking about inference,

 

as opposed to training, right? You train the models the first time around. You use a lot of training data. You get the models working. You start tweaking them. Okay, but then when you start deploying them, that's when you're feeding in real-time or other sources of data. It's doing the inference and providing the results.

 

So there's a lot more traffic and throughput and volume on the inference side. And the computing power and networking speed become a big question. And the trick is to do it in a way where the costs don't blow out. As you scale up, how do you keep doing it efficiently? And this was one of the questions that was brought up by the organizers and the speakers:

 

How can you scale AI computing power without losing efficiency? And that's kind of a hardware problem. So that's what they were talking about. And they talked about the two announcements they made as a company. One product was what they call their Elastic Algorithm Service, or EAS.

 

They unveiled this as their new serverless version of a platform for AI. So basically a cost-efficient solution for deploying models and doing inference on server capacity. So they're speaking directly to that question: how can users tap into computing resources as needed, without having to run the servers yourself, without having to keep your own physical or virtual servers?

 

No, you tap into their EAS service and use it as needed. And the number they put out was that it results in a 50 percent reduction in inference costs when compared to traditional modes of doing this. So their serverless offering was kind of a big one. You know, they're unveiling products at these events, but they're usually not unveiling ten.

 

There's usually one or two that they're talking about. So this serverless offering was one of the big ones, and it's rolling out right now; they're beta testing it. You can use it for image generation right now, but they're going to try to roll it out for LLMs and other types of models in a couple of months, I think March.
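To make the serverless economics concrete, here's a back-of-the-envelope sketch of why a pay-per-use endpoint can undercut an always-on GPU server. Every number here (the hourly rate, the serverless premium, the utilization) is my own illustrative assumption, not Alibaba Cloud's actual EAS pricing.

```python
# Toy cost model: always-on GPU server vs. pay-per-use serverless endpoint.
# All figures are assumptions for illustration only.

HOURS_PER_MONTH = 730

def dedicated_cost(gpu_hourly_rate: float) -> float:
    """An always-on server bills for every hour, busy or idle."""
    return gpu_hourly_rate * HOURS_PER_MONTH

def serverless_cost(gpu_hourly_rate: float, utilization: float,
                    premium: float = 1.5) -> float:
    """Serverless bills only for busy hours, at a per-hour premium."""
    return gpu_hourly_rate * premium * utilization * HOURS_PER_MONTH

rate = 2.0   # assumed $/GPU-hour
util = 0.25  # inference traffic keeps the GPU busy 25% of the time

dedicated = dedicated_cost(rate)          # 2.0 * 730 = 1460.0
serverless = serverless_cost(rate, util)  # 2.0 * 1.5 * 0.25 * 730 = 547.5
savings = 1 - serverless / dedicated      # 0.625 -> 62.5% cheaper at this load
```

The point of the sketch: at low or spiky utilization, paying a premium only for busy hours beats paying a lower rate around the clock, which is the shape of the 50 percent claim.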

 

So that's one thing they talked about. The other thing they talked about, in terms of products they're rolling out, was what they call MaxCompute, with MaxFrame. Basically, that's a pretty solid upgrade to their big data service. So, problem number one for rolling out generative AI and these sort of foundation models: they require a lot of computing, they require a lot of data, and they also require efficient and effective routing of the data into the model and out.

 

I mean, you have to gather the data, and it requires a ton of it. You have to clean it. You have to tag it. You have to do so with respect to privacy and security; you can't let everyone see it. For most companies, this is their own internal data or their customer data. So it has to flow into the models, and it has to flow out.

 

So this gets into another problem: how do you do that efficiently as the scale goes up and the volume of data goes up like crazy? So, same thing, this is their MaxCompute, which is really a big data service. Basically, here's their pitch: it, quote, "allows users to process massive amounts of data more efficiently and flexibly with launching AI tasks such as LLM training."

 

Okay, so those were the two big products they talked about, and you can see, that's sort of my first point: efficient computing and efficient data processing are the bottleneck right now. And most companies are at that stage of adoption. They're still trying to build out the infrastructure, then taking their first steps in using models in their business, and then starting to train their own models,

 

as opposed to using ones that are not theirs: customized models, industry-specific models, things like that. Now, one last point on the first lesson: there was an interesting short talk by a professor from Nanyang Technological University in Singapore. Computer science. And this woman works with Alibaba, I think, quite a lot.

 

She gave a little talk on, and what was really interesting was, how much energy do these things take? Your average big model: two times ten to the sixth watts. A lot of energy. And she had one interesting point, which is: how much energy does your brain take? So an LLM takes two times ten to the sixth watts to run; your brain takes 20 watts.

 

Isn't that interesting? It's a really interesting way to think about the brain, if you think about it like a foundation model running lots and lots of apps, which is not a terrible way to think about the human brain. What's interesting is it uses very little energy, a remarkably small amount, and it also doesn't require very much data to reach solid conclusions. If LLMs are sort of brute force, big compute, big data, big energy, your brain is a small mobile edge device which uses very little energy and requires very little data to actually get to the answer.
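Putting the two quoted figures side by side makes the gap concrete. These are the speaker's rough numbers taken at face value; the exact wattage of a deployed model varies enormously, but the ratio is the point.

```python
# Rough comparison using the figures quoted in the talk:
# a large model on the order of 2e6 watts vs. a brain at roughly 20 watts.

llm_watts = 2e6
brain_watts = 20.0

efficiency_gap = llm_watts / brain_watts  # 100000.0 -> five orders of magnitude

# Energy over one hour of operation, in kilowatt-hours:
llm_kwh = llm_watts / 1000     # 2000.0 kWh per hour
brain_kwh = brain_watts / 1000 # 0.02 kWh per hour
```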

 

It's really, I thought, kind of a cool point. I'm going to put more on this out for those of you who are subscribers; I'll send out a lot more detail, and I'll put a couple of her slides in there, which were pretty neat. Anyways, okay, let's get to lesson number two. This is two of three.

 

We're going to finish up in 30 minutes today. Okay, lesson number two. The big new model, I'm sorry, the big new service from Alibaba Cloud this year is basically model as a service. Everything I just described in the first point was infrastructure as a service: you're accessing compute, you're accessing storage, data, privacy.

 

And when you go from a traditional tech stack to an AI tech stack, the level above compute is data processing, and it becomes much more complicated once you go to an AI tech stack. Okay. As you move above the data layer, you get to, basically, the models. So you can start with the foundation models, the big mothership models: GPT,

 

and for Alibaba, it's Qianwen, which they've shortened to Qwen. In Chinese, it's Qianwen. But there are several of these. Llama, obviously, is one. Baidu has one. There's a handful of these large foundation models, some of which are proprietary, like GPT.

 

A lot of them are open source. Qianwen is open source. Most of the Chinese players, Baidu, Alibaba Cloud, Huawei, are building open-source models. Okay. You get to the models, and the first level of that stack is the model itself. Usually these are broad foundational models, and then above that you get industry models.

 

And then above that you tend to get application-specific models. So usually you build the second- and third-layer models by customizing and tailoring the foundation models. But within this, most companies aren't building their own models yet. What they're doing is accessing the models of the cloud companies like Alibaba Cloud.

 

So model as a service is kind of their big offering this year. It's not specific to Qianwen; you can run lots of different types of models, and Huawei Cloud is doing the same. It's sort of agnostic. Now, obviously, they will push theirs. But what they're basically doing is redefining Alibaba Cloud as an intelligence capability.

 

That's the word over and over and over. Now, model as a service is a cloud service, so it's what you'd expect. It's pay as you go. It's ready to use. You can plug in immediately. It automatically scales. So, same business model as cloud, generally speaking. And what do you use it for?

 

Well, you can use their foundation models, you can do training, you can do inference, and you can start to use the basic, more standardized functions like voice interactions, video intelligence, natural language processing, customer service apps, industry intelligence, decision intelligence. I mean, there are certain applications that run on this that are fairly standard and can be plugged in.

 

Others, like let's say you're going to automate a factory, that's when you need far more specific and customized foundation models. But some of them are pretty standard. So, Qianwen, which is actually Tongyi Qianwen. Okay, they've got four to five of these models now, different types: multimodal, image generation, LLM.

 

It's kind of a family of open-sourced models. Last time I looked, it was four LLMs and two multimodal. What's interesting is it's multilingual. That seems to be how they're differentiating, because obviously it's based on the Chinese language first. But they've also added other East Asian, Southeast Asian, and Middle Eastern languages.

 

That's kind of an interesting differentiation, and you can see the same when we look at search engines. They are different in China. South Korea has its own search engine. Obviously Russia has its own search engine. Because so much of it is context and understanding, you could see some differentiation based on language, which would be quite interesting.

 

They talked about a Japanese company called rinna, which is building Japanese foundation models, but building on top of open-source models. They started with Llama but switched to Qianwen, basically because, I guess, they found that if you're going to build foundation models in the Japanese language, it worked better building on Alibaba Cloud's models than the others.

 

So that's interesting. Okay, so you've got model as a service. You can plug in and do all this, but really, you kind of need three things at the same time. You need the foundation models; they have them. But what you also need is kind of a library and a community. You need people to start building on these, customizing them, making applications, things like that.

 

Now, in the West, the one everyone always talks about is Hugging Face, which is kind of a model community, a model library, mostly based on open-source models. Well, Alibaba has ModelScope, which is kind of like Hugging Face a little bit. Interesting. So you're going to have that community aspect.

 

The other thing you need is a model studio, which is what they have. You need someone to help you adopt these, so that when you provide your own data set to the model, you can do fine-tuning, you can start to get it to work for you, you can adopt it and integrate it into your various business scenarios and applications.

 

And that's what they have as well: they have a model studio. So that's kind of what the business model for this model as a service is looking like. You've got to have your own foundation models, but you can also serve other people's models, which they're doing. You want to have some sort of library and community, which in their case is ModelScope.

 

And then you also want a model studio that works with clients, that helps them adopt, customize,  implement, do the ongoing training and ongoing tuning to keep them working. So that's kind of what model as a service is starting to look like.  Pretty cool.  And then on top of that, people will start to build, hopefully, lots and lots of applications.

 

And lots and lots of industry-customized models. You're going to use a very different set of models for retail than you're going to use for banking. You're not going to use general GPT or general Qianwen. No, you're going to have an industry-tailored version that you use for banking as your foundation.

 

And those are still being built out. Which is interesting to watch, because you've kind of got to get everyone on the same page, and certain industries collaborate better than others. The other thing which is interesting in that is, you also kind of need industry-wide data to be shared.

 

Ideally, you want a foundation model that has been customized for, let's say, all the car manufacturers of Asia. So you have to have some degree of collaboration between all the players to build industry-specific models based on whatever, GPT or whatever. But you also need some degree of industry-wide sharing of data.

 

That's going to be interesting. China, it turns out, is pretty good at this. There are already sort of data-sharing initiatives; Shanghai has one, where Shanghai has basically a government mandate and an institute to share citywide data. So there's an interesting sort of industry collaboration required, and there's a data-sharing collaboration required.

 

Well, it'll be interesting to see how that works out country by country and industry by industry. Okay, so that's number two: model as a service is the place to watch. And we'll get to the last point, and then I'll finish up. Lesson number three: it's all about tech partners. It's all about collaboration.

 

And it's not just about who has more customers and clients. Now, customers and clients, those could be considered business partners. Fine. But no, this is a game of community building. You need lots of cloud service providers. You need lots of AI engineering. You need lots of AI tools.

 

You need lots of foundation models. You need lots of infrastructure like compute and data processing and transmission,  but you also need lots of partners. You need lots of vendors. You need lots of applications built on your model or your system. You need tremendous integration with all the other tech services that are out there.

 

It has to plug into everything and integrate.  You need lots of ongoing model fine tuning and customization,  maintenance and monitoring. How much of all of that He's going to be done by one company.  I would say not too much. I think it's going to be industry wide. Lots and lots of partners. So the phrase you'll hear Alibaba cloud and most of the cloud companies doing this talking about they'll be talking about.

 

We've got to, you know, support our technology partners.  We've got to support developers. We've got to support our business partners, which is usually customers.  And they'll have all sorts of programs to support this. They'll have startup programs, they'll have free classes, they'll have generative AI academies.

 

It's almost like they're building entirely new cities in the desert, and there are like eight cities, and it's all about which of the cities everyone's going to move to. You know, is everyone moving to GPT? Is everyone moving to Llama? Is everyone moving to Microsoft? I mean, are they moving to Alibaba Cloud?

 

Now, a lot of this will be agnostic, and you can, you know, sort of go between a couple of them, but there ain't gonna be 20 of these homes. There ain't gonna be 20 cities. There probably isn't gonna be 10 cities, right? So you're trying to get on that short list. Uh, I don't know. I mean, it could be 5 to 10, something like that.

 

But that's a lot of what it looks like is going on right now. It's just: who has the most tech partners? Who has the most customers? Who has the most developers? What is everybody adopting? Now, interestingly, there's kind of a good JPEG to look at that was put out by Andreessen Horowitz last year, which basically maps out the new tech stack for AI.

 

I've put it in some of the emails before; I'll put it in the emails for this podcast as well. It's a really good look at how the tech stack is going to play out. And even Alibaba Cloud, at the summit, used the Andreessen Horowitz graphic in one of their presentations. As soon as it came up, I'm like, that's totally the Andreessen Horowitz thing.

 

Now, they did credit it. They gave credit where it's due. So this is a good JPEG. If you're going to look at one thing from this podcast, look at this JPEG for how the AI tech stack is evolving, because it seems like this is what everybody in the world is pointing to this year anyways. Okay, those are kind of the three points I wanted to make.

 

I'm going to write a lot more about this, because I think it's super important, and I'm trying to do as much of this as possible. I've been collecting use cases for generative AI going into businesses. But yeah, the three lessons from the summit, which was quite good: efficient computing and efficient data processing are the bottleneck right now.

 

That is mostly an infrastructure question, and it looks like hardware is most of the solution. Lesson number two: model as a service is the big new thing for Alibaba Cloud. That's Qianwen. But you want to break it into the pieces if you can: the foundation models, the data processing, the industry models, the application-specific models, things like that.

 

And then you also generally need model studios and other things. And number three, the last one, was that, you know, I think this year it's going to be all about tech partners, customers, business partners. And where does everyone sort of agree: this is where we build, this is the city we're all moving to? Everyone's trying to be on that shortlist right now.

 

Anyways, okay, that is most of what I want to talk about.  Last little point, just to make this not so high level. There were some interesting use cases mentioned.  And this is kind of what I've been keeping a close eye on.  There was  somebody from Sephora Asia there talking about what they're doing with generative AI.

 

Chris Tung, the president of Alibaba Group, who's really pretty cool. Wait, I got that wrong? Sorry, I'm looking. His title on this sheet is President of Alibaba Group Strategic Development. Now, he's been Chief Marketing Officer; I'm not sure if they've changed his title or if he's picked up another one.

 

He's generally CMO of Alibaba. Really interesting dude. And they had some, what was the other company? They had Halion. Anyways, I looked at Lazada a couple months ago too, at what they're doing with generative AI. And you kind of hear the same story over and over. Okay, this is all cool. What are you doing right now?

 

The number one thing most companies are doing is content creation and content personalization. I thought Chris Tung had a really good point, where he basically said, look, there's too much content in the world, and generative AI is making it cheaper and cheaper. Everyone is just deluged in content. And the only solution to that is you have to personalize the content to individuals or to groups.

 

Because there's too much of it now. So it has to become a lot more personal and a lot more valuable. Virtually every company I've heard talk about generative AI, this is what they talk about: we're going to personalize our marketing, we're going to personalize how the app looks, we're going to personalize the content within the app that you see.

 

If not at the individual level, then at the group level, the demographic. So, generative AI use case number one: personalization of content. The other one they talk about is the supply chain. We're going to use this for better demand projection: how much do we need in our warehouses? And then we're going to use that to make our supply chain smarter.
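The demand-projection idea can be sketched in a few lines. Real deployments would use learned models on far richer data; a moving average keeps the mechanics visible, and every number below is made up for illustration.

```python
# Minimal sketch of "AI is prediction" applied to the supply chain:
# forecast next-period demand from recent sales so inventory can be
# positioned ahead of orders.

def forecast_demand(history, window=3):
    """Predict next-period demand as the mean of the last `window` periods."""
    recent = history[-window:]
    return sum(recent) / len(recent)

weekly_units = [120, 135, 150, 160, 180]    # assumed recent sales history
prediction = forecast_demand(weekly_units)  # (150 + 160 + 180) / 3 = 163.33...

safety_factor = 1.2                         # assumed buffer against forecast error
reorder_quantity = prediction * safety_factor
```

Swap the moving average for a trained model and you have the shape of what these companies describe: predict demand, then drive inventory and ordering from the prediction.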

 

What is AI? AI is prediction. So we're going to do demand prediction, and we're going to feed that in so we get better at our inventory, so we have more of what people want immediately. That's pretty much what you hear. The other one you hear is: we're going to create a chatbot, or whatever we're calling the next iteration of chatbots.

 

You can talk to the company, you can talk about the product. We're going to try and do that. That's mostly what I'm hearing in terms of use cases, but yeah, I'm keeping a close eye on it. Okay, that is kind of the content for today, and I'm right at 30 minutes. That was right on time. I ran long last week.

 

So I felt a little guilty about that. Yeah, I'm going to send out more on this in the next couple of days for those of you who are subscribers. I sent you an article on Automatic Data Processing today; I'm going to send you another one on ADP, which is a human resources, human capital management, enterprise sort of SaaS service in the U.S. Very well positioned to do generative AI. Very well positioned. So I'm going to send an article on that. And then I'm going to send out a quick article on this AI summit and what Alibaba Cloud is doing. I'll probably give you a lot more detailed data than you want, but it's worth looking at the technical architecture of this stuff. I find that really helpful, to get into the details of what is actually being built. Anyways, I'll send that out in the next day or so, and that is it for me for today for content. As for me, it's been a pretty great week.

 

Just working away. I'm heading down to the beach in a couple of days. I'm going to spend the days bouncing around some islands in Southeast Asia; I do that about every two to three weeks just for fun. It's super easy, so it's a fun thing to do. One movie recommendation: I saw a movie the other day. I like science fiction; I used to love Star Wars before it became crap.

 

I mean, I was a huge Star Wars fan. But I saw a movie called The Creator, which someone had said, you have to watch this movie. It was fantastic. I mean, it was one of the best science fiction movies; if you like Star Wars type stuff, it was fantastic. Well, you have to watch it on a big screen or a really big TV, but it was really great.

 

It's probably the best movie I've seen in a year. So if you haven't seen The Creator, and you like science fiction, watch that. Really nice movie. Anyways, that's the only recommendation I got. I'm watching BattleBots Season 9 right now, and I don't know why I'm so into it. I think it's my strategy interest.

 

I think BattleBots is just another form of competitive dynamics and strategy. I'm no fun to watch BattleBots with, because I just keep talking about how, like, a flipper can beat a spinner, but they can't beat a horizontal spinner. I'm just going into which bot's going to win, which I think is fun, but I'm pretty sure it annoys everyone else in the room.

 

Anyways. All right. That's it for me. I hope everyone is doing well, and I will talk to you next week. Bye bye.