Tech Talk Africa
Welcome to Tech Talk Africa, the podcast that highlights the latest developments from Africa's thriving tech ecosystem. Join us as we explore innovations, challenges, and triumphs of African tech entrepreneurs and developers.
In each episode, we discuss trends shaping Africa's future—from fintech and e-commerce to agritech and healthtech—featuring conversations with those on the ground sharing their insights and experiences.
Whether you're a tech enthusiast or curious about Africa's potential, Tech Talk Africa is your guide to the continent's exciting digital revolution. Get ready to be inspired and informed!
Tech Talk Africa
Law, Code, And The African Future With Senior Associate Richard Odongo
Tech Talk Africa – Episode 10: Law, Code, And The African Future
Guest: Richard Odongo, Senior Associate (IP & Technology) at Bowmans
What happens when Africa’s most dynamic innovators meet laws that are still catching up to AI’s speed?
We sit down with Richard Odongo, Senior Associate (IP & Technology) at Bowmans (Law Firm), to map the real terrain: who owns AI-generated work, how data should move across borders, and what “responsible AI” looks like when livelihoods and culture are on the line.
We start with the major shifts: regulators across the continent are shifting from reactive penalties to smarter engagement, and countries like Rwanda, Kenya, and South Africa are creating conditions where innovators can succeed. Richard explains why dialogue-first oversight is better than quick fines, and how policy teams, ESG experts, and lawyers can work together to lower risk without losing momentum. His advice is practical: educate regulators about the technology, classify data by sensitivity, and use targeted obligations to protect what truly matters.
The ownership debate becomes real when AI creates music or drafts inventions. Richard describes how ownership depends on meaningful human input and tight contracts: attribution rules, licensing scopes, revenue sharing, and clear responsibilities among model providers, creators, and platforms. We also explore data sovereignty, from local copies for elections and health data to secure cross-border flows that enable analytics, BPO work, and safe travel systems. Infrastructure is also crucial—energy-efficient data centers, resilient networks, and ESG considerations that help AI stay sustainable long-term.
Ethics shifts from a buzzword to a blueprint: fairness, bias reduction, source credit, and transparency that consumers can see. And because context is key, we discuss localizing frameworks for African languages, cultures, and public services. Richard’s final advice for future legal tech leaders is simple but powerful: choose a niche, learn persistently, stay humble, and build with integrity.
Ready to think beyond hype to the real rules of the road? Hit play, subscribe for more conversations like this, and leave a review with the one change you want to see in Africa’s AI future.
Credits
Host:
- Stella Gichuhi
Producer:
- James Njoroge
Executive Producers:
- Harry Hare
- Agutu Dan
Welcome to Tech Talk Africa, where we explore who's shaping Africa's digital future and for whose benefit. I'm your host, Stella Gichuhi, and today we're diving into one of the most important and often misunderstood intersections of our time: law, technology, and artificial intelligence in Africa. As AI systems become part of how we bank, work, create, and govern, they raise big legal questions about ownership, accountability, and fairness. To help us unpack these, I'm joined by Richard Odongo, an advocate of the High Court who's never seen the inside of a courtroom. He's a registered patent agent and senior associate at Bowmans, where he advises on intellectual property, data privacy, and emerging technologies. Furthermore, he was recently named a Rising Star 2025 by Managing IP and WTR 1000. Welcome, Richard. And also, could you delve into what Managing IP and WTR 1000 are, for those of us who don't know?
SPEAKER_01:Thank you. Thanks, Stella. Thanks for having me today. And thanks for the introduction.
SPEAKER_02:You're welcome.
SPEAKER_01:Yeah, so WTR, which is uh World Trademark Review and Managing IP, are legal directories, right? Focusing on intellectual property rankings. So basically, these are directories that basically conduct research on legal practitioners worldwide, every jurisdiction, they do client interviews as well. Yeah. They gather feedback on each jurisdiction, top lawyers or best lawyers, and then publish uh those results and rankings, which then obviously, first of all, increase our profile or enhance our profile, but also they're a good guide for clients when they're looking for legal counsel in the different countries.
SPEAKER_02:Wow, congratulations, that's a massive feat.
SPEAKER_01:Thank you.
SPEAKER_02:So when I said you've never seen the inside of a courtroom, I meant you can tell us what every IP law says, where every data center is, where every AI policy is.
SPEAKER_03:Am I right to say that?
SPEAKER_01:Yeah, correct. Um, I never liked litigation at all, at all. Um, at least in this context. Yeah. The time it takes, you know, you're in court seven, eight, nine months with no resolution. No, I mean it's a good area of law to practice in, but just not my cup of tea.
SPEAKER_02:Just not your cup of tea.
SPEAKER_01:Yeah.
SPEAKER_02:So for our young budding lawyers, actually, even current lawyers: law doesn't start and stop with litigation.
SPEAKER_01:Yeah, 100%.
SPEAKER_02:Yeah. Okay.
SPEAKER_01:And of course, my training needed me to at least appear in court once or twice with one of the leading lawyers on on those matters. But yeah, that was just about it.
SPEAKER_02:Oh, yeah, yeah. Oh, okay, okay. Let's talk about the stuff that excites you. So now before we get into AI, I want to be broad about it. You work daily at the crossroads of law and tech. So, how would you describe the current legal landscape for tech innovation across the continent? I.e., what's working and what's still catching up?
SPEAKER_01:Yeah, I think it's really shifting in in Africa, and that's across the board in the different, I'd say, subsectors. So you're looking at whether it's data protection, data privacy, telecommunications, whether it's fintech you're looking at, whether it's IP you're looking at, and cybersecurity, it's really shifting and changing every day. Of course, we're now even more of a global village. So obviously a lot of thought and thinking around um data sovereignty, data security, a lot of thought around, of course, AI and automation. So we're seeing now a lot of policies, um, strategies being drafted, a lot of thinking around that in Africa, of course, borrowing from what we're seeing in EU and of course um US with uh Trump's executive order on AI. So it's always shifting, shifting and changing.
SPEAKER_02:So because of this shift, there's always this narrative that tech tends to sprint ahead of or overtake regulation, i.e., regulation is lagging behind. Yeah. And there's a little bit of risk there, opportunity there. Can you expound on what is risky, where the opportunities lie?
SPEAKER_01:I tend to be an optimist. So for me, it's more of an opportunity than a risk. And I think, from where I sit, it's a chance for regulators to actually try and understand the tech, and even sort of have a crystal ball on where we think this tech is going, and then try and anticipate, okay, what regulation will apply to where we think tech is going. Of course, it's very hard to regulate tech and hard to regulate innovation.
SPEAKER_04:Yeah.
SPEAKER_01:So I'm not sure how far that can go. Okay. But I think for me it's an opportunity just in terms of getting ahead of the curve. Granted, of course, a lot of regulators and countries have been caught flat-footed, so to speak, in terms of trying to react to innovation and tech, which then stifles, right? And Africa, of course, we're a consumer market. We're now becoming a bit more of a producer market, but for the longest time we've been a consumer market, so consuming what the multinational companies are.
SPEAKER_02:Are churning out, yeah, and providers.
SPEAKER_01:So I think yeah, there's an opportunity to also create and innovate, but of course, regulators should be a bit more proactive in terms of actually talking to the innovators, talking to the guys on ground, right?
SPEAKER_02:Yeah. So I guess you're the one who bridges that gap in conversation. So you can wear the regulator hat, you can wear the consumer hat, you can wear the techie hat. So you manage those stakeholders, being able to pivot around them, right?
SPEAKER_01:Yeah, yeah. I think also for professionals, there's that room, that opportunity, to actually become a policy advisor or policy lead. I think actually, in my view, policy leads are becoming a bit more prominent in that regard, just trying to marry the two worlds of regulation and innovation, trying to be that middleman.
SPEAKER_02:Okay.
SPEAKER_01:So yeah, yeah.
SPEAKER_02:Oh, that sounds exciting. Yeah. Now on AI. Yo, so we're seeing AI tools everywhere: finance, education, agriculture, but they're also raising new legal and ethical dilemmas, isn't it? From your vantage point, how prepared are African legal systems to handle the implications of AI? You know, we're talking algorithmic bias, liability, data ownership. The list goes on and on and on.
SPEAKER_01:Yeah, yeah. Yeah. I think the perfect answer for this would be, you know, the readiness assessment matrices that have been done. But of course, that's a bit too technical.
SPEAKER_02:Being done where? And what's a readiness matrix? So speak to me like a five-year-old. Okay, one of those.
SPEAKER_01:Yeah, so I mean, at a high level, it's basically an assessment um done by the various, I would say, development partners.
SPEAKER_02:Okay.
SPEAKER_01:Um in Africa, without naming specifics. Yes. But they do carry out basically an assessment of a country's laws and ecosystem in terms of innovation, tech adoption, all those various factors, to then gauge and assess how ready that country is for tech adoption or AI adoption. So, of course, the different countries in Africa are basically at different levels.
SPEAKER_04:Yeah.
SPEAKER_01:Obviously, in terms of readiness, purely because of things like mobile penetration, infrastructure, the normal things exactly. Yeah. Compute. But I think I would say from my point of view, I'd say we're at maybe 60-70%. Again, purely from the standpoint of since basically mobile use became widespread in the early 2000s and mid-2000s, we've seen a lot in terms of the telecommunication space, a lot of evolution, innovation there. Of course, internet connectivity, things like fintech, mobile money platforms, payments. Obviously, now crypto's come into the mix. So I'd say we are we're in a good spot. I'd say maybe 60% readiness.
SPEAKER_02:Okay.
SPEAKER_01:Yeah, yeah, yeah. To 70%.
SPEAKER_02:Those matrices, Richard. I question them.
SPEAKER_01:Yeah.
SPEAKER_02:Because who designs them? Are we as Africans in the room helping design the matrices? Because there was a paper I was reading a few months ago, I can't remember the name, but it was as if there was a bias in those matrices. How do you sit up and dictate, I think you're ready, based on my vantage point, when in reality, is that a reflection of the country? Because a lot of our African countries are still in the development stage. Then it can't be one size fits all. So is it context-specific? Who am I to say Tanzania or Uganda is not AI ready? Yet we're very different people. We're African, but we're different. Do you have any thoughts on that?
SPEAKER_01:Yeah, no, no, it's a fair point. Uh of course, we're coming from the angle of yes, again, we are consumers and not really fast movers. Yeah. But at least now every country is now basically sitting up and going, like, actually, hey, let's actually do our own readiness assessment.
SPEAKER_03:Yeah.
SPEAKER_01:Let's do our own strategy or our own policy. So it's being localized in that manner. Because obviously, all these development partners all have their own different interests, right? In terms of whether it's funding, innovations.
SPEAKER_03:It's a fair point. Eye roll. Yeah. It's an eye roll.
SPEAKER_01:Yeah, no, yeah. It's tricky, it's tricky to say that we have full and fair representation in terms of those uh matrices, but yeah, that's shifting, at least because each country is now reacting. And of course, you also have the AU strategy.
SPEAKER_02:Yeah, we do. Uh yeah, actually, that was written by Africans, for Africans. Okay, fine, fine. I might pull back on that. A little bit. So basically, what you're saying is that we're not starting from zero, but we're defining the rules of engagement. Yeah. Is there a country you've read about and observed and said they have it right? They're getting it right, aside from Kenya, of course. Yeah.
SPEAKER_00:For obvious reasons.
SPEAKER_02:For obvious reasons. But is there a country you've thought they're they're they're they're getting it right?
SPEAKER_01:Yeah, honestly, I'd say 100% Rwanda.
SPEAKER_02:Oh yeah? Okay.
SPEAKER_01:Yeah, yeah, in terms of readiness, in terms of just contextualizing AI adoption, making it easier for people to actually do business in Rwanda. Yeah. Whether it's obviously the industrial parks, things like ACZ, just, you know, basically stimulating that that that side of the economy to attract obviously even innovators from within Africa. You know, for the longest time we've been trying to make ourselves attractive to foreign investment.
SPEAKER_03:Yeah.
SPEAKER_01:Yeah, but why can't we then make ourselves attractive to each other, you know, within this African ecosystem? So I think Rwanda is on a good path. The right path, of course, aside from Kenya and obviously South Africa.
SPEAKER_02:Okay.
SPEAKER_01:Yeah.
SPEAKER_02:Yeah.
SPEAKER_01:I'd say, yeah, those would be my top three at least.
SPEAKER_02:What about Nigeria?
SPEAKER_01:Yeah, Nigeria, Nigeria is also on the right path, but I think where they sort of lost me was with the bill. There was an AI bill, right?
SPEAKER_03:There's an AI bill. Where have I been? Tell me about this bill.
SPEAKER_01:In 2024. It was like an eight-page document. Um, of course, it just created.
SPEAKER_02:Imagine on AI, right? On AI.
SPEAKER_01:It was about eight, nine pages. Um, I remember I read it and I was like, actually, no, this is not how you go about it.
SPEAKER_04:Yeah, yeah. Right.
SPEAKER_01:So I think that's where they sort of lost me, in terms of that rushed regulation. But obviously, now I'm aware they've gone back to the tried and tested route of policy, strategy, or strategy, policy, you know, whichever, but at least have a sort of guideline before then going to full-fledged regulation. So yeah, Nigeria is on the right path, but with that one document, they sort of lost me. But they're good, they're good. Let me not disparage too much.
SPEAKER_02:Yeah, just a bit, just a bit. I think, well, I mean, we live and we learn here. You try, you test, and to echo what you're saying, a tried and tested path will make a lot more sense. Yeah, okay. Let's talk about your area, IP and digital. You're a registered patent agent and IP specialist. What's a patent agent?
SPEAKER_01:It's a big term, but basically what it means is that I can go and register Angela's patent at KIPI as her agent. So I'll take it on your behalf to be registered, as opposed to you going on your own.
SPEAKER_02:Okay.
SPEAKER_01:And a registered patent agent also basically has the right to argue any office actions or argue any oppositions to that application as well.
SPEAKER_02:KIPI is the Kenya Industrial Property Institute. Oh, KIPI.
SPEAKER_01:Yeah, yeah.
SPEAKER_02:Right. So that means if I need to register a patent, I can go through you.
SPEAKER_01:Yeah, 100%. It'll be much easier, yeah. Purely because you're accredited as a patent agent. Obviously, you send your documents to KIPI, you become accredited, and there's an annual fee you pay every year.
SPEAKER_02:Wow. Hey, creative economy, we have a patent agent. Yeah, yeah. We have somebody who can help you patent your amazing, amazing creations. But then with AI now being able to generate art, code, and inventions, how's the concept of intellectual property evolving in the age of AI?
SPEAKER_01:Ah, it's a lovely question. Yeah, I I love that question. And why I love it is because there's a lot of debate right now globally, yeah, in terms of can an AI system or an algorithm be an author of a creative work? Yeah? Can it or should it? Can it? Should is another question. So can it be?
SPEAKER_02:So can it right now as we speak?
SPEAKER_01:And that's now the debate, right? So in a number of countries, and of course, like the US, for example, yeah, that debate is still ongoing, and the whole debate around it is: if there's human input, right, into whatever is being created through an algorithm, can you then say that that algorithm was the true author? If your input is there, aren't you then the author? And others are claiming actually, if there's a mathematical sort of layer or a sort of new added step that the algorithm undertakes from your prompt or from your creation, then it should be the author. Now the other question then is, for trademarks, for example, if it's your trademark and you want to register it, you as Angela can go to KIPI and argue your trademark, right, and say this trademark of mine, this application, is unique because of X, Y, and Z reasons, so I need it to be registered. Can an AI system argue that, or does it need someone to argue on its behalf? And if someone is arguing on its behalf, then isn't that the author?
SPEAKER_02:Yeah. I can feel a small headache. So, okay, what are the Americans saying? Let's just start there.
SPEAKER_01:Yeah, so basically what they're saying is, and that's what I'm saying, the debate is ongoing, especially for quantum computing and quantum algorithms, yeah. As long as there's human input, then the human, you know, is still the author.
SPEAKER_02:As long as there's human input, the human is the author. It's still the author. So what defines human input, exactly?
SPEAKER_01:That's now your prompt, it is your parameters. Because of course, there's certain info that you feed into this algorithm for it to create whatever it's creating, right? That input in itself basically then sort of implies that this algorithm is not really the true author.
SPEAKER_02:But then where AI is headed, correct me if I'm wrong. Yeah, will AI be able to author itself without the human input?
SPEAKER_00:Yeah.
SPEAKER_02:Um I think we're already there.
SPEAKER_01:Yeah, it's a fair point. Yeah, that's where we're headed. And and again, that's when now, of course, quantum computing comes in and quantum algorithms, because that's already happening.
SPEAKER_03:Yeah.
SPEAKER_01:Largely because, of course, quantum is much more powerful than the algorithms we have right now. So we are headed there. And I know at least in the EU, and the US as well, they're now coming up with certain amendments to the law or certain judicial interpretations, in terms of, okay, hey, I think we can actually recognize algorithms as authors. The issue is then who gets that certificate.
SPEAKER_02:Who gets that certificate? Yeah, for example, would it be, if it emanated from Stella's Mac or Richard's laptop, but I did not have anything to do with it, then am I the author?
SPEAKER_04:Exactly.
SPEAKER_02:As long as it came from a computer that was mine. Or if I register a company as Renel Labs and it grows into this exponential AI company, from my lips to God's ears, and there's an algorithm-generated work there. Again, none of my team members or myself prompted it. Yeah. But can I still say that was my work because it used my electricity? Or how can you do that?
SPEAKER_01:So I could then play devil's advocate and actually say, no, that would not be your work. That would be attributed to that AI algorithm or the system.
SPEAKER_02:But the AI algorithm is, yeah, it's using infrastructure and everything. Yeah, using my infrastructure. Yes. Yeah, it's using my infrastructure.
SPEAKER_01:But that's only infrastructure, the creation, the creative work is by that system itself. No, of course, yeah, you own it, of course.
SPEAKER_02:So then it's mine.
SPEAKER_01:No, but it's the same way, it's the same way even with you right now. If you're sitting in an office and you come up with a whole new document or a whole new way of doing things, right? It's still your creation. As much as you're using your office Wi-Fi, the computer, it's still your creation. Yeah. Right? So why can't the same apply to AI algorithms, for example? And then why can't that company of yours called Renel Labs then own that registration and not you as Angela? Because that company basically is the one that owns the algorithm or the system, right? So why can't that company then own it? Gosh, that's very stressful. So that's where we're headed. That's exactly where we're headed. And that's why for IP, it's a very interesting debate globally, not only Kenya, by the way, globally.
SPEAKER_02:Yeah, it would be globally because then so then that means AI, that system is its own.
SPEAKER_01:Basically, but yeah, basically autonomous, yeah. Basically it's autonomous, yeah. Exactly.
SPEAKER_02:Uh IP, people. No, no, no.
SPEAKER_01:No. That's what I'm saying. Now we need to look at our IP laws right now and how they will evolve to cover that, because that's honestly where we're headed. That is honestly where we're headed.
SPEAKER_02:And where's our beautiful country at with regards to reviewing IP laws?
SPEAKER_01:Yes, okay.
SPEAKER_02:Where are we?
SPEAKER_01:So what happened was 2020, which is now, what, coming to five, six years ago now.
SPEAKER_03:Yeah.
SPEAKER_01:They basically came up with what was called an IP amendment or an IP bill. They want to basically merge all the various IP laws, so the Copyright Act, yes, the Trade Marks Act, traditional knowledge and what have you, into one omnibus act, right? That hasn't really gone far, purely because of all these different registries: so KECOBO, the Kenya Copyright Board, for copyright registrations; you have KIPI for patents, trademarks, industrial designs.
SPEAKER_04:Yeah, right.
SPEAKER_01:And then of course you have the various, you know, film, producer, and music bodies, so KAMP, PRISK.
SPEAKER_02:Yeah, then you have the different laws governing them, like that. All those different, what's it called, the MCSK, the music copyright society, some media council. Media Council, yeah, yeah.
SPEAKER_01:Yes, yeah, yes, there you go. And all these different bodies, so there's now like five, six different bodies with four or five different laws, right? Merging them into one law with basically one regulator is going to be very hard. It's going to take a while. So that's why that bill has basically been put on a slow burner, purely because of the work that will have to go into it. It's not a bad idea, because the US has the USPTO, the US Patent and Trademark Office, so it's just one office that handles everything.
SPEAKER_04:Yeah.
SPEAKER_01:So fair enough, it's a good idea, but I think a lot of thought will need to go into now emerging areas of tech, emerging areas of law.
SPEAKER_02:Yeah. And data as well.
SPEAKER_01:Yeah, data as well. How that's handled, how that's protected. Of course, granted that for that one, we have the Data Protection Act and we have the office of the Data Protection Commissioner. So that's fine. But I think, yes, they do need to look at even things like BRS and company names. And can a company name on one side be a trademark automatically, or do you need to go to both registries? So it's those small things.
SPEAKER_02:Wearing my PM hat, that's like a seven-year piece of work.
SPEAKER_01:There you go. Yeah.
SPEAKER_02:Yeah.
SPEAKER_01:So even right now it's really delayed. It's taken them a while to even figure it out. Yeah. Needless to say, people don't want to lose their jobs. It's going to take a lot of integration.
SPEAKER_02:Of course, but then AI might just force it. Is there a likelihood of an event happening in the country that fast-tracks it? And then also, in addition to that, you posted an article around the recently signed virtual assets
SPEAKER_00:Service providers. Yeah, now it's an act.
SPEAKER_02:Now it's an act.
SPEAKER_00:Yeah, yeah, yeah.
SPEAKER_02:Mm-hmm. That's touching on AI.
SPEAKER_01:Um no, so yeah, in a in a way. Yes and no. Yeah, yes and no. Yeah. So more crypto.
SPEAKER_02:More cryptocurrency. Yeah. Ah. But still, is that gonna touch on IP or copyright?
SPEAKER_01:Yes, it does. Yeah. Yes, it does in the sense that there'll be a regulator basically looking at each crypto company's operations. And of course, in that vetting process, of course, then you'll then have to maybe, and they could ask for things like obviously all the company documents, registration, trademarks if you have any, and things like that. So looking at the company's operations, of course, they'll be able to look at the companies that are heavy in terms of data and AI. Yeah. And then sort of see how to then also regulate those further. Because of course, that license will come with conditions.
SPEAKER_02:No, the reason I'm asking that is because I'm happy that as a government, as a people, we're making strides, you know, recognizing cryptocurrency, because it's been there, the conversation's been there for a while. But then the enabler would now be this law that we still haven't been able to unpack. What are the risks there? Or are there no risks, or have they not materialized?
SPEAKER_03:Or am I just trying to link areas that have no business being linked?
SPEAKER_01:Yeah, no, no, no. I think the risks haven't yet crystallized. And in a sense, there's of course the data privacy risks, there's the financial sector risks as well.
SPEAKER_04:Yeah, okay.
SPEAKER_01:And data sovereignty. Because of course all these companies and platforms will collect, process heavy data, obviously transport it um offshore or share it offshore. And there's a lot of questions around, you know, is that data secure?
SPEAKER_02:Yeah.
SPEAKER_01:Obviously, you then have our Kenya cloud policy.
SPEAKER_02:Yes, we do.
SPEAKER_01:Yeah, and things like that. So yeah, that's a valid concern.
SPEAKER_02:Yeah, um what came before the chicken or the egg?
SPEAKER_00:Chicken first, chicken first, oh okay.
SPEAKER_02:So back to IP and who owns what and the author. What does this mean for protecting African creativity, you know, music design software in a global market?
unknown:Mm-hmm.
SPEAKER_01:But um recently a friend of mine actually told me there's an AI R&B female artist.
SPEAKER_02:Yes, is she the one who signed to Timbaland? I have no idea. These are just rumors. Can we do a quick Google search on who she is? Since we have the internet in front of us. AI R&B artist, global. Um, Xania Monet. She's a virtual artist who signed a multimillion-dollar record deal with Hallwood Media and landed a number one spot on the Billboard R&B Digital Song Sales chart.
SPEAKER_03:Who? Okay. Yeah. She was, yeah, yeah, yeah. She was created by a 31-year-old. Exactly. So who owns the copyright there? Who owns the copyright there? Yeah.
SPEAKER_02:Exactly.
SPEAKER_03:Oh, please. Yeah. Legal mind. Who owns it?
SPEAKER_01:Right. And so that's what I'm saying. So for me, I would attribute it to the creator of that algorithm. So this gentleman or lady. Because without that human input or without that person feeding in all maybe whether it's uh melodies, whether it's rhythms, whether it's the lyrics as well.
SPEAKER_03:Yeah.
SPEAKER_01:This artist wouldn't be, you know, alive, quote unquote, and creating, right? So it then becomes tricky, and then, okay, how do you say you own the copyright in an AI algorithm or an AI system, even as a creator, right? Doesn't that platform, that algorithm, have to license it to you? Or that provider, for example, let's say, for argument's sake, whether it's OpenAI and you're creating something off OpenAI.
SPEAKER_02:Yes, yes, yes.
SPEAKER_01:Yeah. There needs to be an agreement between OpenAI and you. There has to be something of some sort, some agreement covering off your rights. Okay, when you commercialize, when you license, who gets what revenue? What part of the revenue goes to us?
SPEAKER_02:Is it revenue share?
SPEAKER_01:Yeah, what part goes to us. There's so much to think about and to unpack there, even as we move into that realm. You know, whereas traditional music is very straightforward.
SPEAKER_04:Yes.
SPEAKER_01:Angela's an artist. Yes. You're singing.
SPEAKER_04:Who's Angela?
SPEAKER_01:So random, but random name. So Angela is an artist.
SPEAKER_03:We need to find Angela. Yes. Miss Angela. Okay, we'll call our R&B artist Miss Angela. Miss Angela is an artist.
SPEAKER_01:Yes, yes, yes. Creating, she's doing her lyrics, doing her songs. Yeah. She's signed to a label. Like it's very straightforward in terms of the agreements. It's very straightforward.
SPEAKER_00:Yes, yes.
SPEAKER_01:But here now you have an a platform, a system.
SPEAKER_02:Yeah.
SPEAKER_01:Which of course, again, has human input. How do you draw those contracts then? And if they're going to like a publishing company, one of the Kenyan publishing companies, who are the parties to it? Is it the system, the algorithm, you and that publishing party? Is it a four-party agreement? So all those issues are obviously now being sort of thought about quite deeply. Since basically we're now alive to all the AI capabilities.
SPEAKER_02:Yes, now that we're alive. You know, I remember when we launched the AI strategy, because we worked on the team together, and I remember we talked about it a lot, and then, you know, there would be a LinkedIn article here and there. But the more I read about it, the more I think back on our conversations, and through previous episodes, I'm thinking, this monster is growing. And we've barely scratched the surface. It's all well and good to talk about it and be a subject matter expert. And I'm very supportive of people knowing more about AI. But conversations like this with you make me think, do we have the right skill set? Like you guys, more lawyers sitting at the intersection of AI, technology, and the law, because what you've described means we need to have a lot more resources and talent, individuals who can unpack such quagmires. Do we have that? Is there a push to have more legal techies? Yeah.
SPEAKER_01:No, there is, 100%. I think when it comes to policy, we're seeing a bit of upskilling, courses, yeah, or webinars, of course, but I think more of courses, like policymaker courses.
SPEAKER_04:Yeah.
SPEAKER_01:Just to bring it home in terms of, even when we're drafting strategies and policies, yeah, what do we need to consider in our local context? Right. And not just having experts in the room for the sake of having experts, but experts who actually have the right mindset towards policy.
SPEAKER_02:Mindset and content and skill set, yeah. Because every day there's something new. Yeah. And so if we're not careful, if we're not very cautious as a people and as a country, and as a continent, we'll be caught, to use your words, flat-footed.
SPEAKER_01:Flat footed, yeah, once again.
SPEAKER_02:Oh my god.
SPEAKER_01:And so now do you see why I sort of highlighted Nigeria, in terms of rushing to regulate with an AI bill, yet this thing is changing every day.
SPEAKER_02:Every single day there's a change.
SPEAKER_01:Because last year this artist, the female RB artist, didn't exist.
SPEAKER_02:No, she didn't. I mean, this article is from September.
SPEAKER_01:There you go.
SPEAKER_02:And by this time next year there will probably be quite a number. Then how do you regulate that?
SPEAKER_00:How do you, yeah?
SPEAKER_02:Hey, we might just see one singing Gengetone.
SPEAKER_00:That's what I'm saying, it wasn't even going to stop there. An Afrobeats one. Yeah, yeah, yeah. Yeah, why not?
SPEAKER_01:Of course. Yeah, yeah. But you know, that person who feeds in all our Sheng, all our slang.
SPEAKER_04:Exactly.
SPEAKER_01:You know, all that. Don't you have to credit that person who's feeding, you know, all that info and data into the system, including the beats and the melodies, all of that.
SPEAKER_02:And and that person, are they being compensated well? There you go.
SPEAKER_01:There you go. Those are now the huge questions we're facing. And I think that's more for the entertainment industry, and of course, entertainment lawyers, which I do. I also do entertainment law.
SPEAKER_02:Oh, you do? Okay.
SPEAKER_01:But some of the things we're thinking about in terms of that contractual element to it. Can an AI algorithm sign a contract? Of course not. No. So, you know, there you go. So the question of ownership then falls away. The owner has to be the human, right? At least for now.
SPEAKER_03:At least for now. No, not for now, for forever. Forever, forever, ever, ever.
SPEAKER_01:Yeah, no, no, so it's really ever-changing. Like it's quite good to think about. And I think also, let me highlight and sort of plug one of my friends called Judge. Uh, right now she works at GIZ. Okay. But she's also assisting on the EAC, East African Community, AI strategy and alliance.
SPEAKER_02:Yes, yes, I saw that. Yeah.
SPEAKER_01:Yeah, and they are thinking about policymaking now, to your point. Sort of calling all the legal experts, technical experts, innovators, founders into a room, doing a short training or a short course on policymaking. Yeah. Then having that roll out across East Africa, across the borders, in the EAC countries, for example.
SPEAKER_04:Wow.
SPEAKER_01:And so that's something that countries should start thinking about and considering. Even as, you know, as you talked about, even as we do our own readiness assessment matrices and localize all these policies to our context, that's something I think we should think about as well.
SPEAKER_02:So we do have people who are being proactive and thinking and asking these questions, because it's all well and good to talk about AI and how it will change how we live, but then there's the nitty-gritty. Yeah. Wow. Now, we know AI is a product of data, but there's a legal aspect to it. Yeah. We've seen a wave of data protection laws. We have our own, spearheaded by the Office of the Data Protection Commissioner. We recently celebrated, was it five years, and won a massive award globally? So we're good. Kenya, we're good. But how effective are the rest of the data protection regulations? From what you've seen, or maybe effective may not be the word. How stringent are they? I mean, I really liked what Nigeria did with Meta a few months ago, where they slapped them with a fine to kind of teach them you can't do whatever you want. Are we seeing the same behavior across the continent?
SPEAKER_01:Um yeah, it's a tricky one. Uh because of the geopolitics. Yeah, yeah, yeah. You don't want to annoy um, you know, the global north as it is, or our big brothers as it is. But to be fair, there's that geopolitical angle to it, right? In terms of regulatory approach. So do you want to quote unquote piss off um these huge multinationals and have them um sort of pull out of our countries? And how do you balance that with enforcing our local laws, right?
SPEAKER_02:Eye roll, eye roll moment. How are we pissing them off? Because Meta, from what I read, and I could be wrong, and I'll investigate that article further, they slapped, no, the Nigerian authorities opted to fine, or pushed to fine, Meta because they did not comply with their data regulation laws. So how are you gonna piss somebody off like that? What is that? You don't comply, I fine you. But because of geopolitics, and because we don't want to piss off big brother, and because, oh, you know, we're the ones who are hiring your creatives, we're the ones who are doing this.
SPEAKER_01:We're consuming with a lot of questions.
SPEAKER_03:We're the consumers. So the the line is a bit blurred there, isn't it?
SPEAKER_01:No, it is. To be fair, it is absolutely.
SPEAKER_02:And is it fair?
SPEAKER_01:That's a good question. It's a good question. In terms of yes, the line is blurred and and of course that approach in terms of regulation has to be very balanced. I feel like the EU can afford to do that because the EU has been very aggressive.
SPEAKER_02:Um can afford to do what exactly?
SPEAKER_01:The very, very aggressive fines, where they're looking at five, ten percent of your, basically, you know, net value or gains or profits, that sort of thing.
SPEAKER_02:So the EU can afford to fine the big tech.
SPEAKER_01:No, Africa can, don't get me wrong, Africa can. But the context is that for the EU, you then find a lot of these multinationals also have operations in the EU. Right. So in a way, it's like, okay, you're at a net zero, because as much as, yes, the fine looks like it's applying to an American company, they have huge operations in the EU. So of course, yes, they do that, and of course, in the African context, and Nigeria as well, that is not wrong at all. I think my approach, and maybe this is just me, maybe this is my temperament, I usually like that collaborative approach whereby, yes, I'm non-compliant, right? I haven't done ABCD things, right? Why can't we then have a regulatory engagement with a view?
SPEAKER_02:Ah, regulatory engagement with a view.
SPEAKER_01:Yeah.
SPEAKER_02:Okay, I've messed up. I've circumvented all your laws. Then you're gonna invite me to a table to have an engagement.
SPEAKER_01:Yeah. Okay. To explain, because most times, more often than not, it depends sometimes.
SPEAKER_02:Right. Interpretation is a problem.
SPEAKER_01:No, no, no, it's not a problem. I'm just saying, at times you'll find that because of the size and scale of the operation, okay. Honestly, and this is the thing, sometimes, whether it's a cross-border transfer, whether it's a sort of breach or something going wrong like that, at times there's maybe contextual background to it.
SPEAKER_04:Oh, okay.
SPEAKER_01:There are maybe, in law we call them mitigating factors, right? You can mitigate when called to a table to explain and say, actually, hey, this is exactly how our model works. So as opposed to what you've seen online, this is exactly what we have in the back end.
SPEAKER_02:Okay.
SPEAKER_01:So if this and this happened, maybe it was human error in one aspect, maybe it was a rogue employee in the other aspect, maybe it was, you know, maybe it was something with the cloud, right? Because we've seen cloud outages, we've seen network outages, we've seen all these sorts of things, right?
SPEAKER_02:It could be a technical issue. So the regulatory exchange is for, so basically, come and explain yourself.
SPEAKER_01:It helps, yeah. And then if that explanation isn't satisfactory, then you can issue the massive fine. But at times what tends to happen is it's a sort of reactionary fine. Yes, it's good to be active and to be proactive. Yes. It's good to keep people on their toes. But I think at times, you know, in our context, maybe we're too quick to penalize, right? And maybe that goes to, you know, the common law approach, our approach, which is very adversarial in nature. Of course, people usually say Kenyans are very litigious. Yes, we are, right? So maybe that then feeds into the regulatory approach, which I think for tech, and this is what I'm saying, for tech, and again, not to disparage regulators, but I've seen a number of scenarios whereby the regulators, yes, as much as they're there to regulate and they have an office, their understanding of the underlying tech, the actual underlying tech behind it and how it works, is a bit limited. So it needs that engagement, that exchange, to then understand and go, oh, okay, this is what you meant here, this is what you guys do here. So that's the angle I was coming from. I'm not saying, look... No, no, I totally understand.
SPEAKER_02:I totally understand. It reminds me of when there was a conversation around shutting down TikTok entirely in the US, and to your approach...
SPEAKER_01:Even Kenya we had issues.
SPEAKER_02:Yes, actually, yeah, I don't know why I'm going so far out.
SPEAKER_01:Yeah, but carry on, carry on with me.
SPEAKER_02:And then it was was it President Trump who now took your approach of let's meet, let's talk about this, yeah, and then come to an agreement because yeah, because I remember just seeing creatives crying. Yeah, that's a whole economy. That's a whole economy. So what's the point? Oh, well, when you put it that way, okay.
SPEAKER_01:Because the US could have easily banned them, and he could have, right? He could have. Yeah. Yeah, but obviously they had their issues, it went to the Senate and all that. But eventually now you can see there's some sort of dialogue, right? Granted, there's a geopolitical spin to it, but at least there's dialogue.
SPEAKER_04:There's a dialogue, Kenya as well.
SPEAKER_01:Kenya, I think you remember when the president obviously had the famous Zoom call with them, you know, and then they promised to be compliant in Kenya, content moderation.
SPEAKER_02:Yeah.
SPEAKER_01:We had the same issue in Kenya.
SPEAKER_02:Yeah, we did, and recently, not recently, I think a month or two ago, content creators were invited by the Principal Secretary of MICDE, that's Ministry of Information, Communication and Digital Economy. Oh, such a mouthful. And I think there were good conversations there. I think there were conversations with Meta. Yes. Okay, so when you're dealing with the law and technology, from what I'm hearing, dialogue is very important, because you have the tech heads, and then you have the legal heads, and then you have us who are in between: policy, a bit of this, a bit of that professionals.
SPEAKER_01:Yeah, because, to be honest, from where I sit, like I've been in a number of those engagements, obviously appeared before both Parliament and Senate for a number of these clients. Yeah. And usually that's the gap I tend to see, right? That's really the gap I tend to see. And I think that's why in Kenya the president took that approach, because he realized the scale and volume of their service and product in Kenya. So leave alone the content creators on the platform, right? Right. You have content moderators seated in different offices in Nairobi.
SPEAKER_03:Yes.
SPEAKER_01:That's how they're working, that's how they're making it. They're moderating Kenyan content, translating local dialect, taking down videos that are offensive, taking down uh harmful content. Yeah. So do you want to kill all those jobs? No. You know, the BPO sector.
SPEAKER_02:No, no, no, no.
SPEAKER_01:You know, so it needs a lot of consultation, and you know, that's why we began from the point of: yes, the law is always lagging behind innovation for that particular reason. We just need that dialogue, because if you're catching up to me, you need to understand why I'm ahead of you.
SPEAKER_02:Yeah. Yeah.
SPEAKER_01:So it's not that you can fine first and then talk later. No. No. It's engage first. Yeah. Then if really the gaps in what I'm doing are so wide, then you can issue a fine.
SPEAKER_02:Then you can issue a fine. So get to understand my operations and then meet in the middle.
SPEAKER_04:Yeah.
SPEAKER_02:And then, of course, there's the hard no, no, no. Yes. Okay, okay, okay. Oh, okay, I've learned something new as well. I've always advocated for what you're saying, but then sometimes when it comes to big tech, I'm like, mm-hmm.
SPEAKER_01:Yeah, yeah, that's the angle. Yeah. So I mean, yes, we're consumers, and yes, of course, but of course we have our rights. Yeah. But I think that balance usually needs to be, um, carefully...
SPEAKER_02:Yeah, it needs to be carefully managed. Right. Now, data sovereignty. Should our data be processed and stored locally? Is it realistic or does it risk isolating African innovation? I think it should stay local. But I don't know what you think.
SPEAKER_01:Actually, I need to know the reasons why you want it to stay local.
SPEAKER_03:Because it's me. I'm in my Mumbi era: for Africans, with Africans, by Africans.
SPEAKER_00:And for the listeners who cannot see Stella's hairstyle. I wish you could see Stella's hairstyle, but anyway.
SPEAKER_03:I'm always rocking cowrie shells, guys. That's me. Yeah.
SPEAKER_01:Yeah.
SPEAKER_03:That those are my reasons, plain and simple.
SPEAKER_01:Fair enough, fair enough. But I think, again, it's a global village, right? Yeah. There's a lot of movement, whether it's travel, whether it's information, whether it's data, whether it's everything we're consuming and doing, it's now a global sphere. We can't sort of isolate as Africa and say we'll keep our production local, keep our data local. Simply because, even for, let's say, if I'm coming up with my own software app, for whatever it is, whether it's again the usual stuff, food delivery, whether it's, you know, transport, I will need data sets from across the globe if I want to scale, for example, right? And that global aspect to tech then means that data should be freely transferable, of course, with the right parameters, with the right safeguards in there. Which is why, I think right now, to be fair, globally over 150 countries, if I'm not wrong, have actually passed a form of or a sort of data protection law, guideline, policy. Yes, yes. Yeah, so every country is now thinking about data protection, right? And the angle is, for us to also scale and thrive, yeah, we do need data from other countries, whether it's training, whether it's analytics.
SPEAKER_02:Whether it's cross-border data flows, and again, back to your point around BPOs, you know, you wanna process something, not something, but you wanna process a claim. You need that person's data, so it needs to come this way.
SPEAKER_01:Yeah, okay. Yeah, so I think the key now is actually, and I'm happy that most countries are doing it, of course, Africa-wide and even just globally, having some shape or some form of data protection law, guideline, something, you know. And of course, it was sparked in 2018 by the EU GDPR, which then led to that sort of cascade effect. Yeah. I think one thing I'll add, of course, relates to our work on the AI strategy, which of course Stella and I were part of.
SPEAKER_03:Yes, we were.
SPEAKER_01:One thing I'll add to that is, of course, this year what happened is that the ODPC, um, the Ministry of ICT and the Digital Economy, are now coming up with a data protection sort of policy.
SPEAKER_02:Yes, yes, yes.
SPEAKER_01:So we have the act. Yes. Of course, in fairness, the policy maybe arguably should have come before the act.
SPEAKER_02:But um listen, those arguments we've been in those rooms together. Why is there a strategy not a policy? Why is there a policy not a strategy?
SPEAKER_01:So but we are where we are.
SPEAKER_02:Yes, we are where we are.
SPEAKER_01:Yeah, and at least I know this policy, of course, has been largely informed by this our you know, obviously, the AI strategy in Kenya.
SPEAKER_02:It was, it was, yeah, it's spearheaded by our favorite professor, Professor Wyema. There you go.
SPEAKER_01:Yeah, so so that's that's in the works. Yeah. So I think also that now, also now, um, and I know for a fact they're looking more at huge data sets, yeah. Siloed data sets. So in terms of AI, whether it's um multinationals using that data, transferring it, how they use it.
SPEAKER_04:Oh yeah.
SPEAKER_01:And even just for the Kenyan government, um, this siloed data, is there a way we can actually have a proper government data set whereby different companies can leverage off it? Yes. Whether it's flood forecasting, weather prediction, analysis of that data, exactly.
SPEAKER_02:Agriculture, health as well.
SPEAKER_01:Yeah, you know, yeah. For example, COVID would have been much, much better off if there was one unified database showing a sort of mapping. Or like a heat map. Yes. Yeah, for example, or even just vulnerabilities in terms of where the vaccine is needed most, how can we roll it out? Yeah, yeah, and even just a heat map in terms of positive cases, things like that, all those things.
SPEAKER_02:Let me ask you something. Yeah, you've raised a very good point, yeah. All this work, when we do get to the point where we have data sets, who's going to be hosting it, and how is that data collected? Because the last thing we need is a USAID kind of moment where funding is pulled and you can no longer have access to your data. Yeah, yeah. So how are we balancing all this? Is it happening?
SPEAKER_01:Yeah, yeah, no, we are. And of course, the work around the data protection policy framework for Kenya has kicked off. At least it will be hosted under the ODPC.
SPEAKER_02:Yeah.
SPEAKER_01:Good. Yeah, which is uh fair enough.
SPEAKER_02:Yeah. And I think
SPEAKER_01:And again, just just like you're talking about earlier, um you said it's a it's a Mumbi moment, Mumbi?
SPEAKER_02:I mean uh yeah, I mean my Mumbi era. My Mumbi era, exactly. So my middle name Mumbi.
SPEAKER_01:Let's have a regulation framework that's made for us, that works for us, right? And I think at least each country should ideally have a sort of framework that actually governs that data sovereignty aspect. But I think for innovation, multinationals, tech companies, there should be flexibility in terms of being able to actually transfer our data offshore.
SPEAKER_04:Yes.
SPEAKER_01:With the promise that they have the appropriate safeguards, you know, technical and organizational measures.
SPEAKER_03:Guardrails, everything, yeah.
SPEAKER_02:Because data is a very, it's a sensitive topic. It is, it is, it is. Um, even before tech became technology, if you read some of our history books, you know, a lot of, I mean, there's a book I'm reading, um, what's it called? Anyway, it's about colonial Kenya. And there's a lot of rich data in that particular one. Histories of the Hanged, that's the title of the book. And I started wondering, I said, where is this data? If I was to look for it, is it in the archives? Who hosts it? And the recollection in the book, is it a true reflection of what really happened in that period? You know, they say history is often written by the victor, never the loser. So, you know, if you think of it in that context and then in today's context, the sensitivity around it. And I'm happy. Are you part of that data policy framework conversation?
SPEAKER_01:Yes, and so we've had the initial engagement. So I think when they're forming the team, hopefully. I will be interested. Okay, good.
SPEAKER_02:Okay, fingers crossed, definitely. Okay, good, yeah. Because that's uh it's driving my Mumbi era. I'm like, no, this data, it's ours, it's ours. But listening to you and taking the logical approach, we're a global village. Okay, fine. I'm I'm with you.
SPEAKER_01:Like even Stella, like when you travel, for example, let's say you're going um overseas abroad, whatever it is, right? And you present your passport at immigration. Yeah, you want to hope that actually that immigration, whatever country it is, yeah, there's some back-end collaboration with Kenya.
SPEAKER_02:There will be.
SPEAKER_01:Whereby once they scan your passport, they're like, Yeah, they can go from here. She's Kenyan, this passport is valid, please go through.
SPEAKER_04:Yeah.
SPEAKER_01:You see, and that's just one aspect to it. So leave alone, even our computers, our phones. Just that. That's data as well, right? Global village. And that's where it all starts, given that airports now have they all have facial recognition, right?
SPEAKER_02:Okay. Things like that. Okay, fine. We're together. No more eye roll moment.
SPEAKER_00:I had to convince you.
SPEAKER_02:No, no, I've always been convinced. Honestly, I've always been convinced. Like, it's just when you bring in the security angle, of course, there has to be data flows. But then at last year's NADPA summit, the National Association of Data Protection, I don't know what, Agencies? Authorities?
SPEAKER_00:Yeah, yeah, yeah.
SPEAKER_02:Somebody said we need to decide what data is secure and needed for the security agencies. And I mean, just classify actually, yes, the word you use was classification of data and its users. So when you bring in the security aspect, of course, there needs to be data flows. Are we classifying what data goes where? Are we there yet?
SPEAKER_01:We're getting there. Now, for Kenya, even under our act and the regulations.
SPEAKER_02:Yeah.
SPEAKER_01:So there's actually data or information that has to be stored in a local server. Yeah. Or a copy at least in a local server in Kenya.
SPEAKER_02:Okay.
SPEAKER_01:So data on elections, health, the financial sector. So, for example, a company like Safaricom has to have a local copy here. Yes. Really, because of their government integration within PSA services, eCitizen, BRS, all those things, right? Yeah, so health, financial services, education. Is education in there? Education as well, at least under the regulations, exactly. Okay. Whereby a local copy has to be here, at least servers in Kenya, right? Now, I think obviously the intention behind that provision was to then allow the normal tech companies, your usual blue-chip multinationals, to be able to transfer that data without each of them having a server here or a local copy here, because that would just be too onerous for them. Imagine if a company like, for example, IBM or Microsoft had to have a server in each country globally, they'd shut down. That wouldn't work.
SPEAKER_02:It's not financially sustainable. And for the environment as well, is it sustainable? Yeah. Okay.
SPEAKER_01:So that's right. Now in terms of that localization aspect, I think it should stick to government services, whereby this data that's needed for government services. Yes. It's critical just to stay in Kenya, no problem. Yes. Yes. But then the rest of the private sector companies, you know, you can have your cross-border flows.
SPEAKER_02:That's the question I'm asking. Yeah, just restricting. Yeah, yes, yeah. So the distinction. For example, like yes, yes.
SPEAKER_01:Yeah. So like the IEBC, for example, with voter registration, again, biometric data collection. Of course, there they should have a copy in Kenya, a local server in Kenya. 100%, agreed with you. Do they have that? It's a good question. I'd like to hear from the CEO and the chairperson.
SPEAKER_03:The subject matter expert.
SPEAKER_01:Given that I'm not the legal counsel on record, I'd like to hear from them, actually. I'm not confused. But you see, that's now the distinction, right? Yeah. Whereas, for example, Microsoft, you shouldn't tie them to having Kenyan data.
SPEAKER_02:Not in Kenya. No, they're not. So I think through this conversation, you've made me realize that's what I'm advocating for: classifying the data and what goes where, who owns what, what needs to stay where for security purposes. Yeah. Yeah.
SPEAKER_01:So actually to your point, I think we're actually there. I think in Kenya we're actually there. There was classification, yeah.
SPEAKER_02:Okay.
SPEAKER_01:Yeah.
SPEAKER_02:But we could do better. No, no, not disparaging the government. It's just, I mean, from the start of this episode, things are changing so fast. Yes, yes, yes. As we talk AI, the data conversation as well needs to have that level of rigor around it. Around where's the data going? Okay. Now on regulation, responsibility, and corporate governance. Gosh, that's a mouthful. So when you advise companies, fintech, gaming, telco, what are the most common compliance or ethical blind spots you see around new tech?
SPEAKER_01:Wow, yeah, that's uh yeah.
SPEAKER_02:And you don't have to name a company, of course. You're not obliged to name a company. In fact, please don't.
SPEAKER_01:Yeah, yeah.
SPEAKER_02:They'll write a cease and desist. Yeah.
SPEAKER_01:No, no, of course. But it's a very good question and very thought-provoking. Depending on the operations, what I've seen tends to shift between, on one hand, compliance with local policy and frameworks, generally in terms of licensing: do you want to sit in that grey, blurry line of "we don't need a licence," or can we piggyback off a licensed local player? The other aspect is, of course, data, where companies are often fairly reluctant to localize their internal policies and procedures to each country's local laws. And each country has different requirements. Some countries require you to register as a matter of fact before you even operate.
SPEAKER_04:Yeah.
SPEAKER_01:In some countries it's more of a compliance point: you can opt to register, more or less fifty-fifty depending on your operations, as long as you have policies in place. The other gap I usually tend to see is, again, that policy gap in terms of government and stakeholder engagement.
SPEAKER_04:Yeah.
SPEAKER_01:Which tends to happen especially when they're very aggressive about going to market, very aggressive to get you to sign up, take your money, and whatnot. At times that becomes a gap, and it's not a legal issue per se, it's more operational. And I understand: as a startup you're aggressive, you want to go to market, you want to launch tomorrow. So you tend to take the approach of, look, we'll figure it out later.
SPEAKER_02:Let's just launch, go to market, we'll figure it out later. The MVP.
SPEAKER_01:Yes.
SPEAKER_02:The MVP stage, yeah. That's just how it is. Yeah, exactly.
SPEAKER_01:Those are some of the pitfalls that I've observed routinely.
SPEAKER_02:Yeah.
SPEAKER_01:Governance, not so much, because most times governance has to be more or less set in stone before they can even attract funding, whether it's a Series A, Series B, or Series C.
SPEAKER_04:Yes, yes.
SPEAKER_01:Most times governance is more or less there. The one aspect of governance I'd highlight is that certain sectors have local directorship or local content requirements. For telecommunications companies that was done away with, but in the financial sector you may need local content, and in insurance you need local content in terms of directorship on the board. I've seen a couple of cases where companies want to overlook that: can we just have a local rep, or someone on the ground, without actually appointing them to the board? All those considerations. So it's mostly operational as well.
SPEAKER_02:So what would you suggest is the best approach to bridging these gaps you've observed? Is it having more corporate lawyers, more regulators, or, from what I'm hearing from you, having the dialogue to assess these gaps and find the best way forward? Is that how you do it?
SPEAKER_01:Yeah, a hundred percent. I think that would go hand in hand with maybe an ESG specialist.
SPEAKER_02:So what's an ESG specialist?
SPEAKER_01:So environmental, social, is it called environmental, social?
SPEAKER_02:I just know the G is governance.
SPEAKER_01:And governance, yes. I think, yes.
SPEAKER_02:Ah, we have specialists.
SPEAKER_01:That's correct: environmental, social, and governance. So those specialists basically advise companies on how to comply and on the relevant frameworks. Now, in my experience it works best when they work with policy leads, the head of policy or government affairs, for example. When you marry those two, you then have that feeding into legal, because legal will look at what the local laws say and advise you on whether you're compliant or not. But the three, for me, have to work together: ESG, policy, and legal. If it's one person doing all three, no problem, if they can actually handle it. But it has to be a consideration at least.
SPEAKER_02:Because those three sound like totally different disciplines. And are there companies where you've seen those three roles working together effectively, and you've thought, yes, I like this? Are we doing that in Kenya and, I guess, on the continent?
SPEAKER_01:Yeah, in Kenya and on the continent as well, of course. The multinationals will hire these three roles in Africa for their African operations, which is a good thing. But I think more and more, at least in the tech space, and since the topic is obviously AI and innovation, those three have to work together, in my view. Because with the AI aspect, we're talking about local training and innovation, whether it's TVETs.
SPEAKER_00:Yes.
SPEAKER_01:The ESG aspect comes in with data centers and energy: cooling facilities, whether we're hosting servers here, and how that fits into the larger operation. Is it in an SEZ? Are our operations tax-free? All those considerations. It's a wide scope of things to consider, at least for AI. Because it's all well and good talking about compute and cloud, but the infrastructure is actually the weak point: the energy and water infrastructure.
SPEAKER_02:You recently published a very well-written article around data centers and co-location, and back to your point, it's all well and good with the buzzwords: cloud this, quantum that, "this is the year of...", but mm-mm.
SPEAKER_01:Yeah. Where's your infrastructure? Where is it? Are the agreements in place? Is it environmentally friendly? Is it compliant? I recently saw an article, I think from China, where they're actually putting data centers undersea, so the cooling is done with the water; again, environmentally friendly. Is that a consideration for us, as opposed to having it in our industrial parks? Can we think about underwater cooling, for example, and things like that? And that's what I'm saying: it's evolving every day. It's moving every day.
SPEAKER_02:But they've had the experience to try and test that out. For us, I guess, it's just about having the willpower and willingness to do it.
SPEAKER_01:To do that, yeah.
SPEAKER_02:Yeah, yeah. Okay, I like that you advocate for everybody to work together.
SPEAKER_01:Collaborative, neutral. There's one of those personality types, I think. Is it ENFJ or whatever? Anyway, I'm a diplomat, basically.
SPEAKER_02:Yes, you're very diplomatic. But I see how you're winning all these awards. I'll be walking into rooms now thinking, what would Richard do about this?
SPEAKER_03:You know, I mean my Mumbi era, but what would the legal mind say about this?
SPEAKER_02:Now, my favorite, and when I say favorite, I'm saying it sarcastically: AI ethics. There's AI ethics regulation being led by the global north. Where do we fit into this? Do you know? Let me tell you, the conspiracist in me genuinely believes somebody somewhere is creating this: let's talk AI ethics so that if we do not comply, we'll not get access to NVIDIA chips.
SPEAKER_03:That's the conspiracist in me. I'm a professional, but I'm allowed to be a conspiracist. Because what are AI ethics? How do you talk AI ethics?
SPEAKER_02:Well, we've not even had time to understand what AI is, and now you want to lump something else on top. Or, we're African, we'll figure it out. Mic, please.
SPEAKER_03:We're using your diplomacy hands.
SPEAKER_00:No, but I've never been a fan of conspiracy theories.
SPEAKER_03:That's what I am.
SPEAKER_01:No, but you know, of course, we usually try to figure out a way to understand something that's really, really important.
SPEAKER_03:What's the driving force? Like, with AI ethics, are we even the ones framing the AI ethics?
SPEAKER_01:Yeah.
SPEAKER_02:That's why I said my favorite topic, quote unquote.
SPEAKER_01:Yeah, basically. I think with AI ethics, the angle again would be to protect consumers, or to protect users, right? You're talking about things like fairness and accountability, about use-case guidance, about things like bias. And again, for example, look at newspaper and media companies. An AI platform's algorithm scrapes articles and information from the internet, trains the models on that, and then gives people output based on what is essentially copyrighted material, because news is copyrighted material.
SPEAKER_02:That's straight-up plagiarism. Yeah, and that's not fair at all. Yeah, right.
SPEAKER_01:And we've seen that back and forth between these media companies and the AI platforms and creators, right? In that back and forth, yes, you do need the data to develop and enhance the systems, the output, and the product, but credit the sources. List the authors where it's due, state what your source is and where you got it from, so that consumers can then see and use that too. So the ethics angle really comes in as consumer protection, and also as protecting the companies from which this data is derived and on whose data the models are trained. So again, accountability, fairness, and things like bias. And, unpopular topic or not, a number of companies have rolled out things like responsible AI principles and responsible AI guidelines, even for entities that simply use AI: these are our AI models, this is how we use them. I've seen them, I've read them.
SPEAKER_02:Man, I was here guns blazing about AI ethics, but then you just come in and make sense of everything.
SPEAKER_01:No, and I agree with you, it sounds like a buzzword, until you remember that those media companies in the US, the New York Times and others, have taken these guys to court, taken them head on for massive amounts, purely because of that fairness aspect of it.
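As a thought experiment on the attribution point above, here is a minimal Python sketch of what "credit the sources" could look like in practice: each retrieved passage carries its outlet, author, and URL, and the generated answer is returned together with that source list so consumers can see where the material came from. The `SourcePassage` type and the `answer_with_attribution` function are hypothetical illustrations, not any platform's actual API.

```python
# Hypothetical sketch of source attribution for AI-generated answers.
# Nothing here reflects a real platform's implementation; it only
# illustrates the "credit the sources" principle discussed above.
from dataclasses import dataclass

@dataclass
class SourcePassage:
    outlet: str   # e.g. a newspaper or media house
    author: str
    url: str
    text: str

def answer_with_attribution(answer_text: str, passages: list[SourcePassage]) -> dict:
    """Bundle the generated answer with the sources it relied on,
    so consumers can see where the material came from."""
    return {
        "answer": answer_text,
        "sources": [
            {"outlet": p.outlet, "author": p.author, "url": p.url}
            for p in passages
        ],
    }

# Example usage with made-up sources.
passages = [
    SourcePassage("Example Daily", "J. Writer", "https://example.com/story", "excerpt"),
]
print(answer_with_attribution("Summary of today's news.", passages))
```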
SPEAKER_02:Yeah, you're right. My query then is: are we creating our own frameworks, or are we relying heavily on jurisdictions that have already experienced issues around fairness and responsible AI? And are our ethics framed around our own use cases? Because not everything that transpires in the global north will necessarily materialize here.
SPEAKER_01:Yeah, yeah.
SPEAKER_02:Where's the balance?
SPEAKER_01:And you're right. Even for large language models, at least in the Kenyan context, if you're training a model on, for example, languages from Kenya's more than forty tribes, how is that credited? How is that done? How do you acknowledge all the authors and contributors? How do you even verify accuracy? And I think where we're getting there is that, at least in the national AI strategy, that's been addressed.
SPEAKER_03:Yes, it has, right?
SPEAKER_01:At least for our context. And taking it Africa-wide, you have the AU's AI strategy, which also addresses fairness and ethics around AI and responsible use. Granted, a lot of the language and a lot of the terms are still quite general.
SPEAKER_04:Yeah.
SPEAKER_01:In the sense that we still view ourselves as consumers of what comes from the global north. But I think as the months and years go by, we'll see a bit more in terms of policy guidelines and documents on how we actually localize these ethics for ourselves.
SPEAKER_02:Yes. Okay, good. Okay. As time goes by.
SPEAKER_01:Yeah, it will happen to us.
SPEAKER_02:But what if it takes too much time? If it takes too much time, I'll look back and say... well, at least we're on the right path.
SPEAKER_01:Oh, how much is too much time? How much is too much time for you?
SPEAKER_02:I don't know, because at the start of this recording you admitted it, and we've seen it: AI is moving at a very fast pace, and I don't think AI is waiting for anybody. So it's for us to recognize that this tech isn't waiting for anybody and to be very proactive. But in our proactivity, as Africans, we should work with what we know and create frameworks around our own use cases. My argument has always been that adopting everything from the global north may not necessarily be the best approach.
SPEAKER_01:No, I agree with you a hundred percent. There's a need to localize.
SPEAKER_02:There is, yeah, there will always be a need to localize. Yeah, yeah. Okay.
SPEAKER_01:Because, for example, there's that chatbot we have for BRS, which has some Sheng capability.
SPEAKER_02:BRS, the Business Registration Service. Okay. Uh-huh.
SPEAKER_01:Yeah. There's a chatbot, it has Sheng capabilities, and of course there are LLMs behind it, handling Sheng and Kiswahili, things like that. And that came before the strategy. How then do we think about it in the years to come, if KRA, the Kenya Revenue Authority, develops one, for example, or the DPC has one as well? If the different regulators start rolling them out, we then need a framework that applies for Kenya, at least in terms of ethics.
SPEAKER_04:Okay. Yeah.
SPEAKER_01:Noting, of course, that there's no point in having an AI regulator. You can't really regulate it, and no, I don't think you should.
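One way to picture the shared framework being described, purely as an assumption-laden sketch, is a common ethics checklist that each regulator's chatbot gets reviewed against. The checklist items below are drawn loosely from this conversation (attribution, language coverage, bias review, transparency, data localization) and are not taken from any actual Kenyan or AU policy document.

```python
# Illustrative sketch only: a hypothetical shared ethics checklist that
# public-sector chatbots (BRS, KRA, DPC, etc.) could be reviewed against.
# The criteria are assumptions drawn from the conversation, not an
# actual Kenyan or AU framework.

ETHICS_CHECKLIST = {
    "source_attribution": "Outputs credit the material the model drew on",
    "language_coverage": "Supports Kiswahili and Sheng alongside English",
    "bias_review": "Responses tested across regions and communities",
    "transparency_notice": "Users are told they are talking to an AI system",
    "data_localization": "Personal data handled per the Data Protection Act",
}

def review(chatbot_name: str, satisfied: set[str]) -> None:
    """Print which checklist items a given chatbot still needs to address."""
    for item, description in ETHICS_CHECKLIST.items():
        status = "OK  " if item in satisfied else "TODO"
        print(f"[{status}] {chatbot_name}: {item}: {description}")

# Example usage with made-up review results.
review("BRS chatbot", {"language_coverage", "transparency_notice"})
```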
SPEAKER_02:I'm going to wrap up now. What's one piece of advice you'd give to young lawyers or innovators entering this space? Because the way I'm seeing your career path, you'll be the legal tech ambassador at this rate. But whilst you're still in that seat, practicing law and answering your various emails, what's one piece of advice you'd give to young lawyers?
SPEAKER_01:Yeah, I think just identify your niche very early on. Which you did. Yeah, which I did.
SPEAKER_02:And you're very passionate about.
SPEAKER_01:Very, a hundred percent. I enjoy it. I can't wait to wake up; even when I'm sleeping, I can't wait to wake up and start doing what I do. That's how much I love what I do, honestly, on a day-to-day basis. And that's why, like you said, I've never set foot in court a single day, because that wasn't my passion, right?
SPEAKER_04:No.
SPEAKER_01:Neither were mergers and acquisitions, nor tax or projects. It was just tech law, right? So I think just have your niche early on, be passionate about it, and invest the time in it.
SPEAKER_04:Yes.
SPEAKER_01:Invest time in yourself, read widely, and educate yourself as well. No one knows it all.
SPEAKER_02:No one knows it all. And AI does not know it all. So do not leave it to AI. It hallucinates as well.
SPEAKER_01:It's so crazy. But I think that's the main thing: invest time in your craft. The key thing is identifying your niche early on, because you don't want to disappear into this pool of lawyers, techies, doctors, and engineers. You need to have a niche. And again, every profession now applies AI to some degree; engineers and quantity surveyors are applying AI to automate their designs, for example.
SPEAKER_02:Content creators, content creators as well.
SPEAKER_01:So I think: identify your niche, be passionate about it, stay humble, and be willing to learn.
SPEAKER_02:Stay humble and willing to learn.
SPEAKER_01:Yeah, you can't go wrong with those. And of course, the hard work; you have to put in the hard work.
SPEAKER_02:AI cannot do everything for you. Oh gosh, thank you. Some real hard truths there, eh? Niche, humility, willingness to learn, and, for me, always pivot. If somebody had told me five years ago I'd be talking tech on a podcast, I'd probably have laughed at them.
SPEAKER_01:But yeah, and that's your willingness to learn, right?
SPEAKER_02:Yeah. I've learned today to be more diplomatic, less of a conspiracist, to take time, and to value the need for dialogue.
SPEAKER_01:Yes, dialogue is always key.
SPEAKER_02:And there's always a saying, right? When you move to Kenya, you always need your main plugs: a medical plug, a legal plug, and if you're a parent, you need to know where you're going to take your kids. But now, through this, you need a legal tech plug.
SPEAKER_03:In that.
SPEAKER_02:It's not just a legal plug. No, no. Legal tech. It's a legal tech plug who can explain things. Okay, I've really loved our conversation. It's been brilliant.
SPEAKER_00:It's been a pleasure. Time has flown by. We could have talked for another hour or so.
SPEAKER_02:I know, I know. That's why we can always have a part two.
SPEAKER_00:Yeah, it has been fun.
SPEAKER_02:Yeah, in fact, when you do launch the policy, and I pray that you're part of that team, you can come and join us again and tell us about the process and what you hope to see as, you know, the AU's ambassador of legal and tech. I can see that. Please take me along with you. Thank you.
SPEAKER_00:I will not forget you guys, I will not forget.
SPEAKER_02:Asante, asante. Thank you. Thanks so much. You're welcome. That's it from us here at Tech Talk Africa. You've heard from the brilliant legal tech mind Richard Odongo. My name is Stella Gichuhi. If you enjoyed this episode, please subscribe and share it with someone curious about how Africa's digital laws are being written in real time. I'm your host, once again, Stella Gichuhi, and this has been Tech Talk Africa. Until next time, stay humble and willing to learn.