Tech Talk Africa

The AI Infrastructure Reckoning Conversation With Co-Founder Michael Michie

Season 2 Episode 1



Guest: Michael Michie, Co-founder of EverseTech and Special Advisor on AI to the Kenyan Government

Africa’s digital future isn't just about the code we write; it’s being laid in the cables, servers, and power grids that most people will never see. In this season premiere, we move past the headlines to ask the hard questions: Who owns the compute? Who controls the data? And who pays the energy bill when ambition outpaces reality? 

We sit down with Michael Michie, a tech founder and a leading voice in Kenya’s AI strategy, to dissect the systems beneath the story. Michael shares his journey from a childhood obsession with hardware to the "perfect pivot" for EverseTech: solving the infrastructure accountability gap in Africa. We explore why the oversimplification of AI scares him and why he believes the hype needs to "die" so the real work can begin.

Key Discussion Points:

  • The Cost of Entry: Michael breaks down the staggering reality of building a data center—starting at roughly 5 billion Kenya Shillings ($38M+) just for the facility, before a single GPU is even plugged in.
  • The Power Paradox: We dive into why "energy" is the most critical infrastructure milestone for 2030. Michael explains why many high-end servers purchased by organizations sit idle because they require specialized three-phase power that standard office power grids don't provide.
  • Data Findability vs. Scarcity: Contrary to popular belief, Michael argues Africa doesn't have a "lack of data" but a findability and storage issue. He explores how to unlock "machine-readable" data from healthcare and traditional recorded information.
  • Sovereignty and "Going Local": His advice for CIOs is bold: "Get off the global cloud." Moving AI workloads to local infrastructure can be roughly 30% cheaper and is essential for data sovereignty.
  • Climate & Context: We tackle the "Animals vs. Machines" debate. Michael contextualizes AI's environmental impact, noting that while local water and power issues are real, industries like agriculture currently have a much larger global footprint.

Is Africa ready to build the foundations that carry our tech ambitions? Hit play to hear Michael Michie’s blueprint for a sustainable, localized AI future. 

Subscribe for a season of deepening the conversation.

Credits
Host:

  • Stella Gichuhi

Producer:

  • James Njoroge

Executive Producers:

  • Harry Hare
  • Agutu Dan
SPEAKER_01

Africa's digital future isn't being written in policy rooms, it's being laid quietly in cables, in servers, in power grids. Every promise of AI, every claim of transformation depends on infrastructure most people will never see. Who owns the compute? Who controls the data? And who pays the energy bill when ambition scales faster than reality? Because on this continent, the future doesn't fail because of ideas. It fails when the foundations aren't built to carry them.

SPEAKER_00

I think there are those who've seen it and taken different approaches. Climate is a big issue globally. And everybody's like, all these data centers popping up are consuming a lot of power, and you know, there's an impact on climate. Not as much as agriculture, shockingly. I would also want to partner with them, but I want to partner maybe a bit differently. I don't want to be a reseller of their tech. Yeah. Yeah. I also want a piece of the IP.

SPEAKER_01

That is from Michael Michie, an amazing tech founder, AI enthusiast, and he is literally the Kenyan government's special advisor on AI. This episode is about the systems beneath the story, the power beneath the progress, and the infrastructure shaping Africa's digital destiny. This is before the rules are even written. This episode is the AI Infrastructure Reckoning. So in this episode, we're going to dissect the infrastructure of accountability and uncover what it really takes to build, scale, and survive in Africa's tech ecosystem. Welcome, Michael.

SPEAKER_00

Thank you, Stella.

SPEAKER_01

I'm so excited. I'm so excited. I'm always excited about my episodes. So before we get into our relationship with AI technology, I want to ask: could you take me back to the moment you realized AI infrastructure, that's compute, data, and energy, was no longer a background concern but the real problem to solve?

SPEAKER_00

I think there were two moments that it occurred. Yeah. The first one I did ignore. You ignored? Yes. So earlier on, when I was a kid, I got introduced to video games. Okay. And part of the access to play those video games involved me disassembling and reassembling the computer hardware. And if I reassembled it badly and the machine didn't start, I wouldn't get to play. But because the games are very flashy, they look good, yeah, I put a lot of focus into video games and coding and all of that. Yeah. And I was around class six at that time. Okay. So that's when I had that choice: fall in love with hardware or fall in love with the software. Okay. But I fell in love with the software, yeah, and really disregarded the hardware for a very long time, even in campus and even when I started working. Yeah, hardware for me was just, it's just there, it's there to support everything else. Then the second time it happened, I actually paid attention. Um, at that time I had just founded EverseTech and we initially wanted to build AI compliance tools. Okay. This was right before ChatGPT 3.5 came out. Yeah. So the idea was, train an AI model to understand compliance and have it flag compliance issues. Okay. Had a demo to show. Um, I ended up attending, I think, an event hosted by an Estonian coalition.

SPEAKER_02

Okay.

SPEAKER_00

And as I was speaking to one of the people over there, they asked me, so you have to send the data to the cloud for it to be processed, and then you bring it back into the country? And I was like, that doesn't make sense, does it? Yeah. So I was like, okay, let me think about it again.

SPEAKER_02

Yeah.

SPEAKER_00

And so I then spent about eight months trying to figure out where to pivot. Because now it was an issue of, this is not gonna work. And I was thinking of something to pivot to. Um, one day, I think that was 2023, the idea then actually hit me properly: infrastructure is gonna be expensive. Yeah. So try and do it. So I first tried to do it through my nonprofit.

SPEAKER_01

Yeah, which one is that, Everse or another one?

SPEAKER_00

At that time it was the Niyazaki Lizetu Foundation. Okay, okay. So I wanted to do it through that, which was, can I gather different infrastructure players? A data center, an ISP, someone to provide the GPUs, and people with use cases, and sort of form a consortium. Okay, and then try and raise funding and manage all of that. But then, speaking to people, doing my ground research, I realized investors wouldn't touch it unless it was a for-profit.

SPEAKER_01

Okay.

SPEAKER_00

So that was like, oh well, I've been trying to find a pivot for EverseTech. This is the perfect pivot.

SPEAKER_01

So let me hold you there. You wanted all this to be non-profit?

SPEAKER_00

Yes. I don't know what I was thinking, but it felt like such a good and big idea that I was like, it can't be done by one company. No, no. And I thought, if it's done by multiple companies... and then, because I had just gotten my licenses for the non-profit, I was like, why not through the non-profit? But then, after speaking to people, getting some advice, you know, the best approach was to go the for-profit route.

SPEAKER_02

Right.

SPEAKER_00

And so I then started trying to figure out what's the cost of me building a data center, and I was not gonna raise that kind of money.

SPEAKER_01

How much what's the typical cost of building a data center right now?

SPEAKER_00

Um, if you're gonna build, it's about five billion Kenya shillings at minimum. Oh. And that's just the data center.

SPEAKER_01

That's just a data center.

SPEAKER_00

Yeah, so there's no compute in that data center yet. Uh yeah. So it's very expensive. So I wasn't going to do a pre-seed or an angel raise for five billion. Yeah, no, you're not. But then I realized, why not just leverage one of the biggest things Africa has going for itself: the partnerships.

SPEAKER_01

Yes.

SPEAKER_00

I have something, someone else has something.

SPEAKER_01

Yes, yes.

SPEAKER_00

So I have... there's the expertise, there's a team of developers, there's understanding how to localize use cases and models so that they fit our economic structures in Africa. Yeah, someone else has already built a data center, someone else has access to GPUs. Okay, someone else is an ISP, because that's also needed. An ISP being the internet service provider. Yes. Okay. So finding the different partners became another 18 months of shopping around data centers, speaking to people, gaining the knowledge. Eventually we had a great moment. We were able to find the right partners, through Atlantis and iXAfrica, the data center.

SPEAKER_02

Yeah.

SPEAKER_00

And then months later, almost a year later, yeah, the GPUs arrived finally.

SPEAKER_01

I remember, I remember you called me. It was May 2020... no, uh, Q3 '25. Yes. You said, guess what? The GPUs are here. It was like Christmas come early. Christmas came in Q2. Yeah, yeah.

SPEAKER_00

Yeah, so that's been a great partnership. Yeah. And so we're rolling out slowly. Obviously, infrastructure for me has moved on from being that boring thing to being both a necessity for sovereignty, yeah, but it also underpins the necessity of ownership. Yes. And also just being a bit more conscious about the mode of delivery. I know there are a lot of plans across Africa to build these mega data centers. Um, I'm not opposed to them. If you're building one that's gonna serve the whole continent or half the continent, I do see value in that. I don't yet see the value in building something as big as, like, the Grok data center. If we are trying to build something similar to that as Africa, yeah, I don't know if we have enough use cases to keep everything running, because remember, we are predominantly SME driven. Yes, we are. So we sort of have to build localized: we can build many small ones that can be sustainable, as opposed to building one big one. Because if it's built in Kenya to serve the rest of Africa, maybe other nations would be like, well, it's still cloud to us, it's not sovereign, it's not in our borders, we also want one of our own. And everyone would want one. So I guess, I don't know, it's been a great journey.

SPEAKER_01

It sounds like it. No, we've walked this journey, we are walking this journey together. Um, it's tough.

SPEAKER_02

Yeah.

SPEAKER_01

Um, I remember calling you, lamenting about a course that I was studying, machine learning. I come from a social sciences background. I said, Michael, this is your world. This is your world, not even proper English, this is your world. It's tough. Um, and then, you know, you've raised a very good point. As Africa, as a continent, our power is in partnerships. And I think you're one of the few who've seen that. Or would you say others see it but they don't know how to go about it?

SPEAKER_00

I think there are those who've seen it and taken different approaches. Right. Um, there are those who said, we'll build it and people will come. And there are those who've said, we'll build it, we'll do the partnerships, but we'll do the partnerships with the big tech companies globally. And obviously, I would also want to partner with them, but I want to partner maybe a bit differently. I don't want to be a reseller.

SPEAKER_01

No, of their tech. No, yeah, you're not, you're not.

SPEAKER_00

Yeah, I also want a piece of the IP because I can also contribute to the IP. Yes. And we should as Africa contribute to IP a a lot more. Right.

SPEAKER_01

When you're looking at this landscape, especially in Africa, what's worrying you more? And I ask that because you log on to LinkedIn, there's AI this, AI that capability. AI is the magic bullet, the magic tool, it's magic.

SPEAKER_00

It's the oversimplification. It scares me.

SPEAKER_01

Okay.

SPEAKER_00

I go to bed every day. Yes. It's a ritual, for maybe close to 15 years now.

SPEAKER_02

Yeah.

SPEAKER_00

Go to bed. Last thing I ask myself, actually, first thing when I get into bed, yeah, is: have I learned anything today? Right. If the answer is no, get out of bed. If the answer is yes, I deserve a good night's sleep. Okay, good, okay. And going down the rabbit hole of AI, there's obviously the machine learning. There's a lot in terms of just building out business use cases, figuring out how to embed it. Yeah. There's the models, there's the training, there's the evaluation, there's the security, there's a lot. And then there's the infrastructure.

SPEAKER_01

There's the infrastructure, yeah.

SPEAKER_00

So there's like so much. If I spend two weeks on infrastructure, I go back to the dev work, there's something new that I haven't touched. I feel like, oh, I need to chase this, I need to chase that. And you keep going down that rabbit hole.

SPEAKER_02

Yeah.

SPEAKER_00

But then you find an oversimplification of AI. And it's not just in Africa, but even globally. Okay, so it's a global problem. I think it's, um, I don't want to say it's a media problem, but I guess there's a sales component where you have to meet numbers. AI promises efficiencies, um, but it's sold as something helping you with your emails.

SPEAKER_02

Yeah.

SPEAKER_00

Helping you take minutes, taking notes in your meetings and all of that. Yeah. Um, so if AI is making my meetings go down from eight meetings a day that were taking, let's say, 30 minutes each, yeah, to them being 10-15 minutes, what do most people do with the extra time? Book more meetings. Yeah, so we become a hamster on a wheel. So we've become efficient at our inefficiencies, because I can summarize everything. Yeah, so I need more things to summarize.

SPEAKER_01

Ah, so uh AI is making us more efficient at our inefficiencies.

SPEAKER_00

Yeah, as a short-term implication of adopting AI without properly thinking it through. Because long-term, you want to show value. Yes. And that also led me to a lot of the things we're working on at EverseTech. Like, a company comes and they want a chatbot to help them with customers so that customers don't queue when they call. Yeah, so there's a voice agent and a text agent on their site. So now you're just churning through calls a lot faster. What's the exact ROI beyond that? So, what happens to your team that's supposed to be receiving the calls? And everyone's like, oh, we just, you know, repurpose, shed some HR costs, all those kinds of things. But as an organization, that's just operational.

SPEAKER_02

Yeah.

SPEAKER_00

What about the decisions themselves? That's what's missing. And so that moved me. Um, I remember last year telling people the biggest thing we'll see in AI in 2026 is gonna be context engineering. Context engineering. Yeah, yeah. I quickly moved away from that during the December holiday. Yeah, I dove deeper into context engineering, and then I realized, oh, actually, context graphs might be a bigger thing than context engineering. I realized that this year. So decisions, where decisions are made, yes, that is missing from AI.

SPEAKER_02

Yes.

SPEAKER_00

So then if AI can't support you in decision making for your organization, it's just gonna be doing the routine jobs. Yeah, and then if your team gets freed up, yeah, you add on more routine work, because now Tuesday's work can come on Monday, Monday's work will move to last Friday, and so on and so forth.

SPEAKER_01

Okay.

SPEAKER_00

We just become hamsters on a wheel.

SPEAKER_01

Two questions, right? So there's the oversimplification of AI, yeah, and then AI making us more efficient at our inefficiencies. We'll get to that later. So, in addition to oversimplification, what are the worries, like access? Um, now I'm talking access to compute, reliable data, energy, from your experience and what we're seeing, yeah.

SPEAKER_00

I think on the infrastructure bit, it's just the other challenge we are facing. There's an access challenge, but it's a global issue around accessing infrastructure. So, for example, I think Meta bought half of the RAM supply from, I think, Samsung or one of the random access memory manufacturers.

SPEAKER_01

You're looking at me, you can see I'm raw. Thank you. Uh-huh.

SPEAKER_00

So when someone buys that much, obviously it affects cost. So actually, one of the things that happened last year was that the cost of RAM shot up, I think, 640%.

SPEAKER_01

640%?

SPEAKER_00

Yeah.

SPEAKER_01

So one major tech company has bought up, what percentage? 40%. 40%. Yeah. So the rest of the world has been left to deal with 60%.

SPEAKER_00

Yeah. So now everyone is in a rush to produce more, but obviously production takes time. Right. There are limited resources in terms of the minerals and everything, the components that go into it.

SPEAKER_02

Yeah, yeah.

SPEAKER_00

So that has... and then there's still the GPU issue, which again, people are booking GPUs and you're being queued up and you're being told, oh, your GPUs will be ready in Q3, Q2. So even accessing some GPUs has become a challenge.

SPEAKER_01

So you're very blessed that yours came in when they did.

SPEAKER_00

Yes, but we also did wait, I think, 18 months or so. There was a long wait in there. A long wait. So yeah. And you would want to see the kind of partnerships through which you could access these GPUs. And I don't think we even sometimes need to access the best and the greatest and the latest. You just need to access what's needed for your use cases, for your economy. We don't need an OpenAI-powered data center in Kenya yet.

SPEAKER_01

Yet, yes.

SPEAKER_00

Yes. We may need... who knows what the future holds. Don't annoy the tech enthusiasts. So yes.

SPEAKER_01

Key word, yeah.

SPEAKER_00

Yeah, keyword yet.

SPEAKER_01

Yeah.

SPEAKER_00

But for now, you sort of have to settle with what you've got and you make the best of it. And that's what we are good at.

SPEAKER_01

Yeah.

SPEAKER_00

Yeah.

SPEAKER_01

Yeah, I agree with that. So on the part of inefficiencies, efficiencies... and you talked about context engineering. Yeah. So now we're not talking about bigger models, we're talking about better systems. Yes. How do you foresee that?

SPEAKER_00

I'm actually hoping that this might be the year where the hype around AI dies. You think it will die? I want the hype to die. So you want it to die? I don't think it will die, but I want the hype to die, because I think we're getting to that state where people are actually now starting to see less exponential growth around LLMs. So an LLM comes along, scores 30% on a test, I don't remember which test, but it's close to an AGI test. You score 30%, you come back next time you score 50%, you come back next time you score 52%. That 2% leap is huge. It may not feel exponential, like it didn't double from 30 and go nearly to 60%, but those small gains in models are gonna show us a need to sort of redesign things. Yes, yes. But it's also going to give people a chance to stop. You know, you wake up today, there's this new thing in AI, you wake up tomorrow, another. So now we'll actually have a calm sort of sea.

SPEAKER_02

Yeah.

SPEAKER_00

People can finally, like, oh, they can see. The skies will clear.

SPEAKER_02

Yeah.

SPEAKER_00

We'll actually know where, like, we'll now actually understand: are we north, south? We're out here without a compass. And we've just been riding these waves. Yeah. Now people can see it and actually measure the volume. And people will actually realize, oh well, this was pointless, this was all hype. AI can't actually do this.

SPEAKER_01

Yeah. Let me go back to my studies, because, being the founder that you are, you have insisted that anyone on your team has to study machine learning. Yeah. We're not in the age of AGI yet. That's artificial general intelligence. We are not. We are in the narrow AI. Yes. So how are we having tests for AGI when we're not... or maybe I didn't quite capture what you're saying?

SPEAKER_00

I think it's called the ARC benchmark or something. I'm trying to remember the name.

SPEAKER_01

Oh, okay, okay.

SPEAKER_00

So it's a benchmark to see if AI can solve novel problems. You see, the thing with AI right now, it's memory. Yes, it is. I don't know how I keep saying this, especially being in that field, but philosophically, to me, it's not intelligence; it lacks a theory of knowledge.

SPEAKER_02

Yeah.

SPEAKER_00

So AI for me has two things going for it. It can compute very fast, and it has access to a lot of memory in terms of what it can remember.

SPEAKER_02

Yeah.

SPEAKER_00

So it's something it's seen, yeah. It can always reference that and give it to you. But when it's given new challenges and puzzles... Most recently, Gemini 3 and ChatGPT-5 were being tested on things that would take humans 20 to 30 minutes, and they were taking two hours to six hours to solve. Oh. Yeah. And they're just simple, like, what's the next item in this sequence? You've seen those kinds. But these are particular tests that are 100% guaranteed not to have existed in the training data at all. So now people have been coming up with all sorts of tests to see how far they can push AI beyond its scope.

SPEAKER_01

Okay.

SPEAKER_00

A good example I usually give: say we were living in the time of Galileo, and somehow someone built an AI model. The scientific knowledge at that time, yeah, is the knowledge that model would have.

SPEAKER_01

Yes.

SPEAKER_00

The model will not argue, as Galileo argued, that things don't go around the earth.

SPEAKER_01

No.

SPEAKER_00

The earth and everything else is going around the sun.

SPEAKER_01

I agree with you.

SPEAKER_00

So the model will just accept that the earth is the center of the solar system and everything goes around the earth. It's not gonna think outside the box. You can tell AI, be creative, think of this, think of that. But it always speaks from what already exists. Because it doesn't know what it doesn't know. That's the theory of knowledge. So for me, that's why AGI is a bit far away. Especially for LLMs.

SPEAKER_02

Yeah, I believe it is.

SPEAKER_00

Maybe with a different type... if we look at maybe physical AI, or maybe diffusion models and all these other types for image, video, yeah. Those ones maybe could show some form of general intelligence. Not in the capacity of answering questions the way we would answer, okay, but being able to perceive the world as we perceive it. Yes, yes. Like actually understanding, you know: okay, this thing here will fall. If I drop this, I can catch it. If I drop this, I know my screen is going to crack if I drop the phone. An understanding of the physical world and how to interact with it without having to be trained. That would be AGI for those models. Yes, it would be. I think those models can get there a lot faster than LLMs. I don't think we can keep brute-forcing our way with LLMs. Remember, it's just a very good guessing engine. But it's very good at it.

SPEAKER_01

Yeah, yeah. I remember. Back to the issue of shortages, yeah. Right? Rising costs. Dependence on a handful of providers. Say Stella today decides to build an AI system. I don't want a demo, and that's what we've been doing at EverseTech, right? Then how does this reality change what they can build and who gets to compete? The reality of... so Meta have bought up the RAM. There's that RAM problem. Yeah. Cost of GPUs. Yes. Things are changing. So of course, when all that is changing, it affects how a team operates. Right. Because also organizations, they don't want a POC. Now we're going into pilot. POCs don't make sense. Pilot. Now with that at the back of their mind, what can they build? And then who gets to compete? In your experience. Don't give away your trade secrets.

SPEAKER_00

I guess those who can compete at this stage, you just need to have the cash. Money, it's just money. It's money. And you know, a lot of people talk about just the GPUs. Yeah. There's still storage. Like companies that are involved in storage. I think there's this very famous SanDisk, these guys who do the flash disks. Their stock has been growing ever since this AI thing, because they also build data center storage.

SPEAKER_01

Okay.

SPEAKER_00

So all this data to train all these models and to do all these things has to be stored somewhere. So they've been benefiting a lot.

SPEAKER_01

So we should buy shares in SanDisk.

SPEAKER_00

I don't want to call that stock advice on SanDisk. There's also Transcend, the Transcend guys who do the same kinds of things, and Western Digital; all those guys also do enterprise kinds of storage, which is what all these companies are buying. So there's obviously the compute, there's the GPU, there's the RAM, processors, storage, yeah, networking, power, yeah, cooling.

SPEAKER_01

Yeah, yeah.

SPEAKER_00

The whole infrastructure layer is so big, but someone says, I'll get the GPUs. You can't just buy a GPU and put it on the table. No. It has to be plugged into something. The GPU needs to sit on a motherboard. That motherboard needs RAM, needs a processor. The motherboard itself has its own components, then there's the networking. You're gonna store things somewhere at some point, because the model is gonna sit in storage. The model weights are what gets transferred to either the GPU or to the RAM. And most people use the RAM for a lot of caching.

SPEAKER_01

Caching, caching. Oh, like caching, C-A-C-H-E.

SPEAKER_00

Yes. Because you don't want the model, on every new inference, to constantly start from scratch. It can always go back and check: how have I done this before?

SPEAKER_02

Yes.

SPEAKER_00

Okay, I've done this before. Let that be my guide. Yeah. As opposed to a looking-for-a-needle-in-the-haystack kind of situation.
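The check-the-cache-before-recomputing pattern Michael describes can be sketched in a few lines of Python. Everything here is illustrative: `expensive_model_call` is a stand-in for a real GPU inference call, and a production system would key on more than the raw prompt.

```python
# Minimal sketch of inference-response caching: before paying for an
# expensive model call, check whether an equivalent prompt was answered before.
import hashlib

cache = {}  # in a real deployment this lives in RAM, e.g. Redis or an in-process LRU

def expensive_model_call(prompt):
    # Stand-in for a real GPU inference call.
    return f"answer to: {prompt}"

def cached_inference(prompt):
    # Hash the prompt so the cache key has a fixed size.
    key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    if key in cache:                        # "Have I done this before?"
        return cache[key]
    result = expensive_model_call(prompt)   # only pay for compute on a miss
    cache[key] = result
    return result

print(cached_inference("What is caching?"))  # computed on first call
print(cached_inference("What is caching?"))  # served from the cache
```

The second call never touches the model; it is a pure memory lookup, which is why RAM sizing matters as much as GPU count.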

SPEAKER_01

Now we all know that you can't do AI without data. Yeah. And there's a whole life cycle of data. First of all, okay, globally speaking, it's not just about the volume of data anymore; it's about unlocking data in the global south. For the global north, they have the data. Yeah. But now it's about data readiness. We're talking clean pipelines, lineage, governance, everything. Yeah. Right? So the question is, and we say this, back to the global south: we don't have enough data.

unknown

Right?

SPEAKER_00

I don't I wouldn't say we don't have enough data. Okay.

SPEAKER_01

What what's what's your standpoint on data? I've gone round and round, but what's your standpoint on data?

SPEAKER_00

I think we have a findability issue with data. What's findability? We have the data, it's just so fragmented at times that we're like, this data doesn't exist. But I'm like, human interactions lead to the creation of data. Communities, even in the past, used song and dance and storytelling, yeah, and then they moved on and people started writing.

SPEAKER_02

Yeah.

SPEAKER_00

Then after writing, we got into the use of wax to store sound. Then magnetic tapes for sound and audio, and other digital formats. So when we're saying there is no data, I'm like, historically humans have been recording data in very fragile systems, yes. Some are memory-based and need to be passed down by word of mouth or through interaction; others are on paper and other fragile media. So when I think of the global south, I'm like, it's not like we live our lives without data, that's almost impossible. It's just that we're not generating the volume of data that is being generated elsewhere, especially on digital platforms. But this data exists. You just need to know where to find it. So, for example, we are saying we don't have enough data in the healthcare space for Africa.

SPEAKER_02

I'm like, that's a treasure trove.

SPEAKER_00

This data exists, but it's so fragmented. And then you're also missing out on traditional data. What's traditional data? Okay, in the health space, my idea of traditional data is all these traditional remedies and traditional prescriptions, you know, like eat this for this and you'll be healthier. Because I probably want to do some work in healthcare.

SPEAKER_01

Yes.

SPEAKER_00

And I'm like, I don't have enough data sets for the African context.

SPEAKER_01

Yep.

SPEAKER_00

Why am I searching online for this data? Because I'll search online. Yeah. I need to visit these hospitals, I need to visit clinics, I need to visit government bodies, I need to visit agencies and do the heavy lifting of collecting some of this data.

SPEAKER_02

Yeah, yeah.

SPEAKER_00

I could collect all that data and then find only, what, maybe 20 to 30 percent is of value. Yes. And scrap the rest. Yes. It's probably still going to be 2,000 times more than what I found on the internet. So I think we just have a findability issue. Okay. Which also to some extent goes to a storage issue. Yes. If everything has been written on paper and thrown in a warehouse, paper is gonna degrade. Yeah. If it's something that's been passed down as a trade secret, that's gonna get lost at some point. Yeah. So there's huge value. I think we have a lot of data. Okay. And even on, especially what Africa is very known for, like on social media, yeah, we're creating a lot of content as Africans, and that content can still be used for different use cases, for different values, yeah, to train models for so many other things.

SPEAKER_01

So, sorry, what does good AI data infrastructure then look like? Based on... so it's not that we don't have enough data, we have a findability issue. Yeah. Good AI data infrastructure. What should it look like? What would you like in an ideal world?

SPEAKER_00

In an ideal world, you'd want machine-readable data. Machine readable, okay. Yeah. That is easy to access and that is stored and versioned properly. I know there's always the quality-of-data versus quantity-of-data discussion. Yeah. And for me it comes down to the use case. So if you're saying we need vast amounts of data to train an AI model, my question is, what's your use case?

SPEAKER_02

Yeah.

SPEAKER_00

If your use case is, for example, credit scoring, people's names have nothing to do with credit scoring.

SPEAKER_01

No, they don't.

SPEAKER_00

You can remove that whole section, and maybe, if you really had that much data, you lose 100-plus GB of data, which are just names. Then you find another field that is irrelevant to your use case. You remove it. And you keep doing that, and you end up with maybe less data, but that data can train a model for your specific use case.
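The pruning Michael describes can be sketched in a few lines of Python. Everything here is hypothetical: the field names, the records, and the choice of which fields are irrelevant are illustrative placeholders for the credit-scoring example, not a real pipeline.

```python
# Hypothetical fields that carry no signal for the credit-scoring use case.
IRRELEVANT_FOR_CREDIT_SCORING = {"name", "national_id"}

def prune_record(record: dict) -> dict:
    """Keep only the fields relevant to the specific use case."""
    return {k: v for k, v in record.items()
            if k not in IRRELEVANT_FOR_CREDIT_SCORING}

# Made-up sample records standing in for a much larger raw dataset.
raw = [
    {"name": "A. Mwangi", "national_id": "123", "income": 52000, "missed_payments": 1},
    {"name": "B. Otieno", "national_id": "456", "income": 31000, "missed_payments": 4},
]

dataset = [prune_record(r) for r in raw]
```

The result is a smaller dataset, but one where every remaining field is relevant to the model you are actually training.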

SPEAKER_01

Yeah.

SPEAKER_00

Because you're giving a model junk, hoping that the quantity of data will make the model better for your use case. As they say: garbage in, garbage out. And, you know, one would argue, well, why are all these big tech companies training their AI models on almost every piece of data they find?

SPEAKER_01

Yes, yeah, why is that?

SPEAKER_00

Well, they're building GPTs. They are building general purpose technology.

SPEAKER_01

This G, is it general purpose technology, or is it generative? Let me tell you. You're again making me study. Uh, generative pre-trained transformer models.

SPEAKER_00

So I'm so proud of myself. So coincidentally, yeah, it also translates to general purpose technologies. Because I'll use it to help me with my emails, I'll use it for shopping, I'll use it for research. Yes, you will. I'll use it to draw; I want to redo my living room, I'll use it for that. I'll also use it to help me with coding and everything. So those are general purpose technologies, and they thrive on that lack of specialization, which also means they need to be trained on just enough of everything to be almost good, or okay, across the board horizontally. They're not specific to a vertical. But if you're building for a specific use case, train it for that use case, get the data for that specific use case, because you want accuracy. You don't want to say, oh, we have a big 10 billion parameter model, and it's failing half of the time. Build a 2 billion parameter model that has a 98% success rate.

SPEAKER_01

And is that possible?

SPEAKER_00

Is it too costly? So the cost issue, again, for me comes back to infrastructure.

SPEAKER_01

Okay.

SPEAKER_00

So there are two things within AI infrastructure that people don't realize are actually uniquely different from traditional, let's say, server setups.

SPEAKER_01

Okay.

SPEAKER_00

You have training and inferencing. So most people who build have a staging environment for their tools. Yep, they have a pre-production environment and a production. Yeah. And then you have your DR, your disaster recovery, which may be a replica of either only your production, or your production plus your pre-production environment. So you have what you have, and you have a replica on the side. With AI, you have your training infrastructure, which is different from your inferencing.

SPEAKER_01

Ah, you have it. Okay, so how many environments are we talking with AI?

SPEAKER_00

For me, you have your pre-production, which is where you do your training. Right. And it's not going to be the exact same. Because, you see, most professionals just say, ah, in pre-production, we'll use 60% of the resources that we have in production, because in production you expect more traffic, more users. Yes, yes. So we can use a smaller server. But training models takes up a lot of GPU. So you'll find you have to build for training and also have an architecture for inferencing, which is when people start prompting your model. Because at the training phase, your network costs are hopefully just your own internal costs, because the data is as close to the GPU as physically possible. It's literally a cable away.
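The asymmetry above can be sketched as rough arithmetic. Every number here (GPU counts, duty cycles, bandwidth figures) is a made-up placeholder to illustrate why training and inferencing need separate capacity plans, not a real sizing.

```python
# Illustrative-only profiles: training is GPU-heavy with data a cable away;
# inference is network-heavy because users hit it from across the world.
ENVIRONMENTS = {
    "training":  {"gpus": 8, "duty_cycle": 1.0, "bandwidth_gbps": 1},
    "inference": {"gpus": 2, "duty_cycle": 0.4, "bandwidth_gbps": 10},
}

def daily_gpu_hours(env: str) -> float:
    """GPU-hours consumed per day for a given environment profile."""
    e = ENVIRONMENTS[env]
    return e["gpus"] * 24 * e["duty_cycle"]

# Training burns roughly ten times the GPU time here, even though
# inference needs ten times the bandwidth: neither can be sized as
# a simple percentage of the other.
print(daily_gpu_hours("training"))   # 192.0
print(daily_gpu_hours("inference"))  # roughly 19.2
```

The point is that scaling one environment down to "60% of production" makes no sense when the two environments stress entirely different resources.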

SPEAKER_01

Oh, yeah. Yeah, what am I saying? Oh, I already know.

SPEAKER_00

Yeah, but then at inferencing, yeah, someone is in Australia hitting a server in Ireland.

unknown

Okay.

SPEAKER_00

So there's a bandwidth thing; there's network to consider. There's RAM, because now the weights are stored either in RAM or on the GPU.

SPEAKER_01

Yeah, they are, yeah.

SPEAKER_00

There's caching. So the infrastructure changes. And most people don't see that when they look at the cost, especially if you want to build your own.

SPEAKER_02

Yeah.

SPEAKER_00

You assume, oh, we have a free server at the office, let's just build on that one. It's not that simple. I've seen organizations buy servers.

SPEAKER_02

Yeah.

SPEAKER_00

About $350,000 worth of servers, across about two to three companies, in 2024.

SPEAKER_01

Yeah. Do they use them?

SPEAKER_00

They're not using them. They just bought them for the sake of it. You buy a fancy server, and you're like, guys, use this cool server to do AI on. And then you realize, one, the server doesn't have GPUs. Another guy bought something that has a GPU. Right. But then realizes they're on two-phase power, because it's a commercial building, an office space.

SPEAKER_02

Right.

SPEAKER_00

And it needs three-phase power. So you plug it in, and it doesn't power on.

SPEAKER_01

Yo, sorry, I'm cutting you short. Now I understand why you really detest the simplification of AI. Because you've gone into the granular details of what it takes to get to that model, and ChatGPT is at 5.2. Anthropic, I've not played with Anthropic yet. Google keeps updating their models. So for us to get to where they are, reality needs to set in, from what I'm hearing.

SPEAKER_00

Yeah.

SPEAKER_01

Whoa. Let's talk energy.

SPEAKER_00

We don't even need to get to where they are.

SPEAKER_01

We don't need to get to where they are.

SPEAKER_00

We actually don't. Everyone has always been like, oh, AI is supposed to make Africa, you know, leapfrog. Oh, we're gonna skip this and this part of the industrial revolution.

SPEAKER_01

Yeah, so we don't need to go through revolution one, two, three, four, where we have arrived. But it's different.

SPEAKER_00

Yeah, like why get there and have the same problem someone else is having.

SPEAKER_01

Thank you.

SPEAKER_00

Find a way to get there without having their problems, but taking advantage of your scarcities and your context. In your context, yeah, which matters a lot. And if you look at energy, I remember. Yes, actually, energy is the next question. The energy thing for me was one of the big ones when we were trying to find somewhere these GPUs would be hosted.

SPEAKER_02

Yeah.

SPEAKER_00

The energy and the cost of energy, first the availability of that energy.

SPEAKER_02

Yes.

SPEAKER_00

So every data center has electricity. That's fine. But how much electricity can they push into a single rack, with your GPUs, with your networking equipment, with your storage, with your cybersecurity, your firewalls, your switches, routers, all the accompanying equipment? Can you fit a single rack and have power distributed to all those devices? In most of the data centers we visited, it wasn't possible. We had to split it into two or three. But then that means they take up more floor space.
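The rack-splitting problem above is a simple budget check. A sketch in Python, with every wattage and the per-rack limit being hypothetical numbers chosen only to illustrate the arithmetic:

```python
import math

# Assumed per-rack power budget (hypothetical).
RACK_POWER_LIMIT_KW = 12.0

# Illustrative draws for the equipment Michael lists.
equipment_kw = {
    "gpu_servers": 18.0,      # the dominant draw
    "storage": 1.5,
    "switches_routers": 0.8,
    "firewalls": 0.4,
}

total_kw = sum(equipment_kw.values())
# If the total exceeds what one rack's feed can carry, the gear must
# be split across racks, and each extra rack is extra floor space paid for.
racks_needed = math.ceil(total_kw / RACK_POWER_LIMIT_KW)
```

With these placeholder numbers the equipment overshoots a single rack's feed, so it lands in two racks, which is exactly the "paying for more space" problem described next.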

SPEAKER_01

Yes, you mentioned that.

SPEAKER_00

I'm paying for more space. By the time you find a data center that could take you at max capacity... Because again, you don't want to hit max capacity and then not be able to pull enough power.

SPEAKER_01

Yes. And then it's not cost-effective, right?

SPEAKER_00

It's not cost-effective. Okay, okay. And then the power thing, again, even for us, came as a shock, because we needed specialized power distribution units, PDUs. And did you, yeah, yeah. So I had to go back to, like, a physics class and learn all these things. And you're there like, it's not that easy. It's not just, you know, buy these things and stick them together. They're not Lego bricks, and some of those things are so granular. Yeah, you check online: does this work with this? And the answer is yes, according to the internet. Yeah, you get your hands on it, and the answer is no. And you have to go back to the drawing board. You have to go back to the drawing board. We re-architected so many times. We bounced back between this type of GPU, that type of GPU, this type of server, this type of cooling. We had conversations with companies that provide on-chip liquid cooling.

SPEAKER_02

Whoa.

SPEAKER_00

We had conversations with companies that were talking about immersion cooling.

SPEAKER_01

Whoa.

SPEAKER_00

Because then again, cooling is part of power. Okay, so one of the things you pay for is obviously the power to run the device. Yes. And you're charged for cooling, and cooling requires power.

SPEAKER_01

Oh, that's so let's just say a double cost. So the electricity bill around AI is is no joke.

SPEAKER_00

It's no it's so locally, yeah, it's no joke.

SPEAKER_01

Right.

SPEAKER_00

On a global scale, yeah, if you look at it industry to industry, AI's impact, especially around the data centers, has been covered well in the media. A data center is set up that can hold maybe 3,000 racks, and then you hear how, for the local community around there, there are power issues, there are water issues, for some of them there's noise, there's pollution, even if the data center is on clean energy, or they're using some of the fossil fuels. So there's that. But if you look at it from a global scale, you're barely doing any damage.

SPEAKER_01

Oh, I feel like you want to go into the climate change and AI infrastructure conversation. Hold that thought. Hold it, because I can see it. I want to ask. In Kenya, we're very blessed in that, I believe, and I hope I'm not making this up, the power issue is one the government had started talking about late last year, if I'm not wrong. What about countries who don't have it as good as we have it? I'm not gonna sugarcoat it. What do you think their AI ambitions will look like, or should they have any ambitions?

SPEAKER_00

They should, but don't build megastructures that you can't sell. Yeah, don't build megastructures you can't. Okay, okay. So work with what you have, or buy power from your neighbor if they are selling. If you want to build a megastructure, buy power from your neighbor if they can send you the power. But again, you don't want your neighbor switching off the lights randomly. If that's a thing. It is a thing. Some people switch off the internet randomly. Yeah.

SPEAKER_01

Have you not been reading the news?

SPEAKER_00

I have. Okay, good. So if you don't want that, and you want your own power, and maybe you don't have enough for it, then you plan differently for your supply to be able to meet your demand. So instead of one big data center, can you build modular units spread across different places?

SPEAKER_01

Is that what you call co-locating, or is that something different?

SPEAKER_00

So it's a bit different. Data centers are also built as co-location centers for enterprises and other organizations. But in this case, we are saying a data center, for example, like IX, currently the largest data center in Eastern Africa.

SPEAKER_01

Okay.

SPEAKER_00

And they're your partner? Yes. Yeah. And we're saying maybe they are too big for a country that doesn't have the kind of power we have, or the kind of power we can get from hydroelectric, geothermal, solar and all the other renewable sources. So they have to rely on fossil fuels, which could be expensive, and there's a climate issue around that. Or work within your means: build smaller IXs, tinier ones, and spread them across, because you just need load distribution.

SPEAKER_01

Okay.

SPEAKER_00

So don't push everything into one place. Yeah.

SPEAKER_01

So what you're saying is, don't go big. Normally the thing is go big or go home. But with AI infrastructure, you don't have to go big.

SPEAKER_00

Work with what you have as you build toward big, but have a distributed network.

SPEAKER_01

Ah, distributed network.

SPEAKER_00

Yeah.

SPEAKER_01

Your favorite topic: AI infrastructure and climate. Let it rip. Let it rip. Let it rip. There was an article on LinkedIn; we had a conversation about it. And there's a certain president who would be very happy if he heard this. So it's almost saying AI infrastructure and climate change, don't touch them. What's going on there?

SPEAKER_00

So obviously, the climate is a big issue globally. Yeah, it is. And everybody's like, all these data centers propping up, consuming a lot of power, and, you know, there's an impact on the climate. Yeah, but not as much as agriculture, shockingly. We have more agriculture as an industry, and we obviously need to feed the whole world. Yeah, we can't eat AI, Michael. Yes, but it does more damage to the climate than AI. The issue with AI is that it's moving so rapidly. If it was to be used everywhere, for everyone, for everything, right? How many of those data centers would we need? How much of that starts actually having a large global footprint? So we set up a data center, let's say somewhere in the US, and I think people have seen those videos where there's a water shortage, the cost of power goes up within those towns. Yes, we've seen it. That's a localized issue.

SPEAKER_01

But that was there before AI.

SPEAKER_00

Some of those issues were there before AI, when people were doing crypto mining. Yeah, but then you bring in a large facility and they need more power. And the cost of power goes up, because again, you have to maintain that grid and feed that grid.

SPEAKER_02

Yeah.

SPEAKER_00

And then the cost is shared across everyone on that grid.

SPEAKER_02

Yeah.

SPEAKER_00

So you end up with costs going up. They are taking in more water. Yes. And because they are taking up more water, there's probably less water going to residential areas. I remember seeing those videos of a family in the US putting water in jerrycans and storing it. They're like, okay, this one is for tomorrow, for washing the dishes. This is for cooking. Yeah. Because their taps are running dry because of a data center near them. Those are local problems, and they will exist. And when I say they're local problems, I'm saying they are small problems globally; they obviously affect those communities a lot. Yes. But then taking that and saying the rest of the whole world will be like that if we build data centers, is that the case? So, CO2 emissions from AI, where AI workloads sit in data centers, where is the CO2 emission coming from? It's coming from the data center. Electricity will still flow to the data center, so the emissions from that electricity are what gets attributed to AI. The internet itself has its own carbon footprint, and people will look at where those servers sit, and the whole value chain. People have tried to argue, well, you also need to look at AI's footprint beyond the data center, look at the mineral extraction and everything. And I'm like, yes, let's look at it that way. But cattle farming still does more.

SPEAKER_01

So you want to argue animals versus machine.

SPEAKER_00

Yeah, animals do more damage.

SPEAKER_01

So we don't need animals.

SPEAKER_00

No, I'm not saying we don't need animals. It's an issue of contextualization. Okay. So the context is, yeah, AI's rapid growth could have a significant climate impact.

SPEAKER_01

Could have, but it's not having it.

SPEAKER_00

Not having it. It's having very localized issues around water and electricity, and when people scale those up, they can see the impact it would have on the climate. But the logistics industry, the travel industry, agriculture, farming, all those are doing way worse than AI. And I think we have this thing, and I don't remember the exact phrase, but as humans, if something could go wrong, we focus on that one thing. It's like a fear of, what's the worst thing that could go wrong? Yeah, and we're like, we need to prevent that, even if it's 30 years from now, we want to prevent it now. And that, I guess, even away from climate, leads us to, oh, let's run off and do regulations on these things. And you're like, why bother?

SPEAKER_01

So you want to you want the issue to materialize before you diagnose it?

SPEAKER_00

No, I want you I want us to be reasonable enough to say the issue is in 30 years.

SPEAKER_01

Okay.

SPEAKER_00

Can we track it, and can we agree that this stage is when we will intervene? We'll intervene. Because if we start intervening now... For example, there was an article that said the data centers in Scotland consume 27 million bottles of water. That's a lot of water. In context, it was around 0.0002 percent of the water for the country. They'd have to build thousands of times the number of data centers just to say data centers are consuming one percent of the water available in Scotland.

SPEAKER_01

But the news sensationalized it and made it look like the world is coming to an end. Because we don't have enough water.

SPEAKER_00

But it's also interesting. I think it was a science documentary I was watching, that nearly 80 percent, 70-something percent, of the water on Earth right now was created even before life.

unknown

Okay.

SPEAKER_00

So maybe we have a water creation problem. Maybe that can be the answer for people who are like, oh no, we don't have enough water. Should we really panic, when we have water that's billions of years old?

SPEAKER_01

Oh my god. Now, away from chemistry and physics.

SPEAKER_02

Yeah.

SPEAKER_01

You've mentioned 30 years from now; we've talked about Scotland. Bringing it back to the continent. The year is 2026 now. Yeah. The year becomes 2030, 2035. Yeah. What is the one infrastructure milestone you think we need to have achieved? The one that me and you will look back on and say, we did not miss that moment.

SPEAKER_00

Cheaper power. That's it. Yeah, because people are building data centers all over. I think Kenya has what, three or four data centers currently under construction. Yeah. So the data center will be there, but the cost... The data center is a real estate business by nature. Yeah. People have to bring in their servers and all that stuff. Yeah. If no one brings them, you've just built a big building that's gonna go to waste. Oh wow. And people won't come because of the cost of power. When I visit a data center and, for every shilling I pay in power, I have to pay 60 cents in cooling, that means every time I look at the cost of power, I have to multiply it by 1.6. Yeah, that's not cost-effective, because then that pushes me to raise my costs and push that on to my consumers, which shouldn't be the case.
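The cooling overhead described here is simple arithmetic: if every shilling of server power carries 60 cents of cooling, the effective power price is the quoted price times 1.6. A minimal sketch, with the overhead ratio and the example bill both being illustrative figures:

```python
# 60 cents of cooling for every shilling of power (the ratio Michael cites).
COOLING_OVERHEAD = 0.6

def effective_power_cost(power_bill_ksh: float) -> float:
    """Quoted power bill plus the cooling charge it drags along."""
    return power_bill_ksh * (1 + COOLING_OVERHEAD)

# A hypothetical monthly bill of 100,000 KSh really costs ~160,000 KSh.
print(effective_power_cost(100_000))
```

That 1.6x multiplier is why the quoted price of electricity understates what a tenant actually pays a data center.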

SPEAKER_01

And do you think it's obvious to say to the government of the day, to entrepreneurs: should they be lobbying and saying, hey guys, deal with the power issue?

SPEAKER_00

Yes. I think people should lobby. Power should not just be a by-the-way. A by-the-way. Like, we know they say, oh, find a special economic zone, make power cheaper there. But the special economic zone should be where the power is being used.

SPEAKER_02

Okay.

SPEAKER_00

So you could imagine if you build a physical special economic zone, and you decide this particular space is where we're gonna put these data centers and all this. And the guys from the data center come, they do the analysis, and they're like, it's not the right place; you have to build somewhere else. Whatever they choose to build, treat that as a special economic zone. Special economic zones don't need to be a physical place where you're told, come do activities here. They can be where the activities are happening. So if the activity happens, for example, in this studio, if we had compute running over here and everything, this is a special economic zone and the power should be cheaper here. You shouldn't need to be forced to relocate, because then again, maybe the environment that was chosen for that special economic zone was meant to favor manufacturing or a different industry. Yes. And now we have compute and data centers becoming an industry that's looking for space, but maybe that space isn't right for them. So instead of freezing them out, turn their spaces, or their use of that space, into a special economic zone.

SPEAKER_01

That makes them more efficient. Okay. For me, it's talent, talent, skills. But from what you're saying, it's power. In the hierarchy of we-need-to-do-this-by-2030, for you, it's purely power.

SPEAKER_00

I think talent can slightly solve for itself over time. I believe that as well. Yeah, I think it's just that people will seek out knowledge and skills.

SPEAKER_02

Yeah.

SPEAKER_00

Um, and there are cases where some knowledge and skills are priced out, yeah, but then people find a way around it.

SPEAKER_02

Yeah.

SPEAKER_00

But if we are saying that, okay, we have all this knowledge, we have all these skills, and we have the data, but we can't have the compute because there's not enough power, or the cost of power is too high.

SPEAKER_01

Yes.

SPEAKER_00

There's a big problem. There's a big problem. It's like if you buy a mobile phone, but then the cost of charging it, or even airtime, is what, 30% of the cost of the phone? Yeah, it's not sustainable. It's like every three times you make a phone call, you've bought a new phone.

SPEAKER_02

Yeah.

SPEAKER_00

So it's not sustainable. So power for me should be treated in the same way. If there's no power, the servers are off.

SPEAKER_01

Got it.

SPEAKER_00

So the AI is gone.

SPEAKER_01

Got it, got it, got it. So you've heard it right here, folks. For Michael, it's power. Two last questions. Yeah. For organizations listening today, particularly the CIOs, yeah: what is one infrastructure decision they should make differently in the next 12 to 18 months if they want to be AI relevant?

SPEAKER_00

I'd say get off the cloud.

SPEAKER_01

Get off the cloud, yeah.

SPEAKER_00

Yeah. Use infrastructure that's locally available in country. It's about 30% cheaper just to move off cloud and have AI workloads hosted locally. And that's just the default cost. Then you factor in whatever benefits you get from how much better you orchestrate your model, and the kinds of things you can now do from a developer perspective. It then makes it worth using AI, because AI can solve problems, but the cost of solving a problem shouldn't be more expensive than the problem itself. And the problem is, you're paying your cloud provider costs that you did not expect.
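Michael's "about 30% cheaper" claim can be framed as a quick comparison helper. A sketch only: the discount rate comes from his estimate, and the monthly figure below is a hypothetical placeholder, not a real quote from any provider.

```python
# Assumed saving from moving AI workloads off the global cloud to
# locally available infrastructure (Michael's rough 30% figure).
LOCAL_HOSTING_DISCOUNT = 0.30

def local_hosting_estimate(global_cloud_monthly_usd: float) -> float:
    """Estimated monthly cost after moving the same workload in country."""
    return global_cloud_monthly_usd * (1 - LOCAL_HOSTING_DISCOUNT)

# A hypothetical $10,000/month global-cloud bill drops to ~$7,000.
print(local_hosting_estimate(10_000))
```

This captures only the default cost difference; the orchestration and sovereignty benefits he mentions would come on top of it.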

SPEAKER_01

I wanna interrupt you there before we wrap up. We had a whole cloud policy that was in cabinet, yeah, January of 2025, or end of December. We all celebrated it. Hey, Kenya has a cloud policy. But now you're saying get off cloud.

SPEAKER_00

Get off the global cloud providers. Go to the local cloud. Go local. Okay, okay.

SPEAKER_01

Yeah, okay, okay. Last one, for real, for real. So we're talking scaling AI. You've touched on the cost to sustain AI infrastructure, right?

SPEAKER_02

Yeah.

SPEAKER_01

This cost is not showing up in funding announcements or product announcements, right? Is it deliberate? Is it vendors locking people in? What's going on?

SPEAKER_00

I think it's more around the financial modeling. So, interestingly enough, if you look at what a lot of the tech companies, especially the global tech companies, are doing: they need to build data centers. Previously, they would obviously raise money for that. Some have decided to use their cash reserves; others have gone into debt financing. So they're picking different avenues, because obviously there's a huge demand for this and they need to find money wherever it can be found. And they are targeting consumers, so they need people in the billions. Like Anthropic, for example, or OpenAI: obviously they have tools and AI models for businesses, yeah, and they have those for consumers. And consumers is where I think they are gonna see the most value. So it's, let's get as much of a share of the market as possible, and eventually everything will get off the spending curve and go into income, into profit. Yeah, there's that model. And there are those who go after the model of, let's target enterprises, like Microsoft will target enterprises with their Copilot, and they'll want to see returns through that. The approach to spending on AI becomes very different. And what some organizations do, and I won't mention names... Please don't.

SPEAKER_01

I don't want to lose you.

SPEAKER_00

Yeah. What some organizations do is put the cost of the data center, and the heavy cost of training the model and everything, off the balance sheet. So you run that through an SPV.

SPEAKER_01

A special purpose vehicle.

SPEAKER_00

Yeah. Okay, okay. So you need a data center, so you form an SPV, get a loan from a bank, and give the loan to the SPV, not to the parent company. The SPV is doing that on its own. So for the parent company, the books look good. But remember, that money is gonna be paid back not by the SPV, but still by the parent company. So there's been that trend going on. It's, I'd say, a gray area in finance. It's legal to have SPVs. Yeah. Is it healthy, though? I'm not an expert in finance. I look at it and sometimes I'm like, it's worked in some places. I've seen SPVs being used for acquisitions and mergers. I've seen them being used in the infrastructure space just to keep debt off your books. And I think one of the hardest things around quantifying the cost of AI is that people just see the $20 they pay for ChatGPT, and then they translate that into the cost of what local models and local developers should be charging.

SPEAKER_01

Yeah, and now I'm getting passionate about this, because OpenAI have been on this journey for years. Yes, so they can afford to charge you $20.

SPEAKER_00

And they've raised enough money.

SPEAKER_01

And they've raised enough money since the 3.5.

SPEAKER_00

Yeah.

SPEAKER_01

Me and you, it's like, why can't we charge like that? I remember a proposal you sent. We're wrapping up, but I need to make this point. There's a proposal we sent somewhere. Yeah. And the response was, no, this is costly, this is too much. Like, huh? We're not gonna be like OpenAI overnight. Yeah, and I think there's that burnout part, because you barely sleep. The last few months, like December, you barely slept.

SPEAKER_02

Yeah.

SPEAKER_01

And then there's now that there's the cost to your health, there's the cost to your mental capacity. Yeah, there's so much.

SPEAKER_00

Yeah, it's expensive to do AI infrastructure, and on top of that, we are doing AI infrastructure and we're also building models, we are fine-tuning, we are helping businesses. Yeah, it's a very bespoke solution. It is a very bespoke solution, and it obviously comes at a cost. And you can't match someone who's decided, I have a hundred billion to spend, you know, to capture a market and realize profits down the line.

SPEAKER_01

Down the line. They have they have a way to go.

SPEAKER_00

They have money to burn. They do. And they know that, oh, our burn rate, we're gonna spend this money over the next four years easily. But we don't have that money. And again, the scarcity goes back to this: we shouldn't say we're not gonna get involved in these projects, whether it's infrastructure or model development. We could. We have to find sustainable ways to do it. Yeah, and if it means building small, or taking your time to build but getting it right, I think that's a lot safer than if, for example, let's say a startup, and I really hope this doesn't happen, a startup in Africa has this huge raise, and an investor puts in billions of dollars, only to find out it was, you know, a wolf-in-sheep's-clothing kind of situation. It was an API to a larger model. And then, you're speaking from experience, but this is gonna be episode two, because I know. It does erode the trust of investors. Yeah, so at some point you're like, well, why have one person raise 10 billion and then find out it was a hoax?

SPEAKER_01

It's gonna affect the rest of us.

SPEAKER_00

It's gonna affect the rest of us.

SPEAKER_01

And we hate it, that's why we are okay.

SPEAKER_00

As we finish, I remember, I think it was 2023 when I sat there and decided EverseTech is gonna get into AI infrastructure and add our development skills on top of that. One of our potential investors I was speaking to actually asked me, ah, I see you've been in banking. Why don't you build a fintech instead? That one we can fund.

SPEAKER_01

I remember this story. Yeah, it crushed him. I know, yeah. But we're in 2026 now. Yes. You barely slept in December. Yes. You took some of my days because I was studying machine learning, algorithms, neural networks. Yeah, it's tough.

SPEAKER_02

Yeah.

SPEAKER_01

But I like where this is taking us. I'm very happy that we work together. Yes. I have so many questions, Michael. I think we're gonna need a part two for this, because we can talk AI today, tomorrow, next week; there's always something new. Yeah, there's always something new. Yeah, but thank you so much. Thank you too. To our listeners, that was The AI Infrastructure Reckoning, a conversation about what actually holds up AI once the hype fades. I'm your host, Stella Kishuhi. Thank you once again, Michael.

SPEAKER_00

Thank you, Stella.

SPEAKER_01

This is Tech Talk Africa, the show where we go beyond the headlines and hype to ask a harder question: who is actually building the future of technology on this continent? This is season two. Not a reset, but a deepening. Because the conversations have shifted, the stakes are higher, and the systems shaping Africa's digital future are no longer abstract, they're operational. In this season, we're going beneath the products, beneath the policies, and beneath the noise to the infrastructure, the people, the power, and everything in between holding it all together. I'm your host, Estella Given.
