Tech Unboxed

From AI to WTF – The wildest stuff happening in tech right now!

BBD Software Season 1 Episode 3

Tech is moving FAST – and we’ve got the inside scoop. In this episode, we sit down with Lucky Nkosi, Head of ATC, and Stefan de Jager, software engineer, to unpack what’s really going on in the world of tech and software engineering right now.
From AI hype and controversial dev tools to bold predictions and hot takes – we’re diving into it all. Tune in now and stay ahead of the curve!

Speaker 2:

Welcome to the next episode of Tech Unboxed, BBD's podcast, where we dive into everything technical within the software engineering and technology space. Today we have Stefan de Jager, a full-stack software engineer, and Lucky Nkosi, BBD's Head of Research and Development, as well as the person responsible for all the training across the business. Thank you so much for joining us. Thank you very much.

Speaker 3:

Thank you for having us.

Speaker 2:

So we really wanted to take this opportunity to dive into the absolute chaos that is the software engineering market at the moment. There's so much innovation happening. It must be a really exciting time to be working in this space.

Speaker 1:

I mean absolutely, yes. Everything we've experienced in the past 10, 20 years has been just incredible, especially from cloud development to all the new stuff in technology. It's actually difficult to stay on top of things and to know what's going on everywhere.

Speaker 2:

Absolutely. I read this really fascinating report the other day that said there was $151 billion worth of venture capital funding into the AI market in 2024. That's an insane amount of money, and that's despite a 10% decrease in overall startup funding. What do you think is driving this rate of innovation at the moment?

Speaker 1:

So it's difficult to look at how we, as businesses, can survive without adopting AI. We're stuck between sort of two groups of businesses: those who are embracing AI and using it to enhance everything that they do, and those left playing catch-up, trying to get to the point where they should have been 20 years ago. AI being able to develop at such a quick rate means that it's almost silly not to throw all of our resources into it, or as many resources as we can, and so it's not surprising that these businesses are getting such a large amount of funding for something that's really going to change our world. It's probably the best thing since sliced bread.

Speaker 3:

I would go as far as saying FOMO is probably one of the big drivers. Historically, when you look at a lot of the industrial revolutions (and now we've had enough of them that we can actually study what happened before, during and after), I think a lot of investors are very cognizant of the fact that we are at an inflection point, and if they're not part of this new revolution then they will have missed out on probably the next 20, 50 years of real corporate dominance in our industry. You look at what tech has become. I mean, tech is even more valuable than oil if you think about some of the biggest organizations in the world. So I think everyone wants to be the next big thing.

Speaker 2:

Absolutely. And how is this changing your day-to-day working in the industry?

Speaker 1:

I remember, just before I started at BBD, I read about this new incredible thing called ChatGPT and how amazing it was, and I thought, you know what, I'm going to tell people at work about this. This is going to be really cool. And here we are now, two years later, and it's almost old because of how quickly everything else has developed. From a day-to-day perspective, it's like having a very senior developer, someone who knows a lot about almost anything, available all the time to ask questions, and who actually doesn't get upset if you ask the same question 43 times. It's meant that we're able to code a lot better and a lot faster, being able to have all of these resources available. It feels like how my parents would describe the internet when it was first created: you have all this information available instantly.

Speaker 3:

It's quite an interesting example that Stefan gives. He says it's like having a senior. For me it actually feels like having multiple juniors that can do things that I don't really feel like doing. But, to be quite frank, it's giving me sleepless nights. I am constantly engaged in what this means for our industry. As software developers, we've for the longest time been the ones disrupting every other industry around us, and for the first time we're at the coalface of technology actually disrupting us and how we do things. It's massively improved our productivity, but we're also constantly worried about a lot of the risks that come with it, and about making sure that we understand the technology enough, and can guide it enough, for it to actually be useful and not harmful in what we deliver and how we deliver it.

Speaker 2:

That's a really interesting point. What are some of those risks?

Speaker 3:

One of the biggest issues with LLMs, the technology that powers AI at the moment, is that they are very data hungry, and I think there's a stat that over 90% of the world's code and content will, in about a year or two, be generated by AI. That is a terrifying stat, because even from just basic desktop research, if you use a chat-based AI such as ChatGPT or Grok, and you keep feeding it its own response, you can see it deteriorate, because it just struggles with AI-generated content. So one of the biggest dangers in us using it in enterprise-grade software is that we deal with a lot of very complex and sensitive information in our code bases, and the need for these LLM providers to move our data to their cloud for training makes us much more vulnerable to leaking very sensitive information, such as access keys, proprietary and sensitive algorithms, and organizational secrets. So one of the biggest risks that we're contending with is how we engage these providers to allow us to run these models in a secure way, so that they don't need to get access to our data. And even though some of them will tell you, "we don't really take your data, we just take it for analytics", we've seen a lot of very, very dangerous leaks from very large organizations across the world. So that's probably the biggest risk.

Speaker 3:

And the second risk, which is, I think, quite contested, is that the more AI-generated data or code we produce over a long stretch of time, the more people, and juniors in particular, will lose those core, fundamental principles that you need to actually deal with very complex and nuanced problems. And in five years, when those juniors are now the intermediates or the seniors, who's going to be the one digging through that code to actually try and figure out what the problem is? So these are the things that we need to contend with. It's challenging how we think about training, how we think about the future of software development, and what this industry is actually going to look like in five to 10 years.

Speaker 2:

It's a really scary thought when you think about juniors becoming seniors without the fundamental principles. How do you mitigate that risk from a learning and training perspective?

Speaker 3:

I think universities have answered this question for us already.

Speaker 3:

One of the things that we've been engaging universities across South Africa, and the world actually, on over the past decade is how they keep up with technologies.

Speaker 3:

I can tell you that what we've done from an academic perspective is, instead of trying to keep up with the trends, to actually go right back to the basics, where we teach those fundamentals. And we take the same principle here at BBD as well: all of our juniors or grads, when they come in, we try to make sure we give them a very strong and deep base. For instance, we don't necessarily teach them web frameworks; we'll teach them how the web works at its core. We'll stick to the very basics of web development, because everything else that's built on top of it is really just meta-information that you can learn quite quickly, and there are enough resources on the internet to help you grapple with those. So I think what it means for learning and development is, in terms of AI, everyone needs to learn about the risks; but in terms of the work that we actually do and the code we produce, it becomes even more important to teach juniors the core fundamentals on which everything is built.

Speaker 2:

So you've spoken about your approach to juniors, but what are you doing with intermediates and seniors that join the business?

Speaker 3:

That's an even more interesting challenge.

Speaker 3:

I think what we're doing is, with every new technology that has come into the space, it's very important to get your very experienced and senior people firstly to test it out, because they'll be able to tell you the value that that technology brings into a workspace.

Speaker 3:

If a framework is doing something that we've already been doing quite easily and we don't need it, then we'll make a decision not to bring it into our solutions. So I think there are two things that are quite important for intermediate and senior people. Number one is to make them leaders in the space: to make sure that they've got a good enough grasp and command of the technology that they can advise our clients, the organization and our juniors on the dangers or the risks there are, but also the opportunities that these technologies can bring.

Speaker 3:

The second thing that I think is important is for them to increasingly become the mentors and guides that the juniors need, because when we, as the educators, talk about the importance of fundamentals, students and juniors often think we're crazy; when it comes from people who are delivering software with them, the tone just lands much, much differently. So the two things are for them to absolutely lead in the space and command the technology, and for them to pass on the knowledge of those core fundamentals to the juniors that come after them.

Speaker 2:

Those are fantastic insights, thank you. Stefan, you haven't been in the industry that long. Apart from the obvious AI, what have been some of the perhaps fundamental shifts that you've seen in the industry in the time that you've been working?

Speaker 1:

I think one of the big things that I've noticed is that, when you're studying and just getting started in the industry, there's a lot of talk about what's the best language, or what's the best framework, or the best this and the best that. One of the things that I've realized is that it's not about the best this or the best that; there is no one-size-fits-all solution, and it's more about what fits the problem in front of you. I've seen a big development in terms of frameworks. I mean, today I got an email about the newest, hottest JavaScript animation framework, and all of these new things that are coming out mean that everyone's sort of stuck on what's the best and what's the biggest, and not on what's the right solution to the problem.

Speaker 2:

Sure feels like you've been coached a little bit. No, I'm teasing. Speaking of those solutions, what do you think our clients need to be wary of, or what do you think they should be worrying about at the moment? If not worrying, planning for, at least.

Speaker 1:

I mean, like I said earlier, either you can choose to adopt all the changes and all the things that are happening in the market, or you can choose to, in 10, 20 years, play catch-up with everything that you missed. One of the big things that I would be concerned about, or would like to point caution towards, is the idea of adopting technology like AI or cloud without considering the implications that it could have on things like security or personal information. I know it's very easy to say let's go all digital, let's go cloud, and let's do away with how we have been doing things in the past. There's a big push towards innovation and all of those types of things, but the thing is, the business has worked a certain way for 20, 30, 40 years, and all of that isn't wrong; it's done for a specific reason. So security and all of those things are quite important, but it's difficult to balance that against the rate of change.

Speaker 3:

I think there's a lot of wisdom in patience.

Speaker 3:

One of the things that we've seen with the agile movement, and with the DevOps and DevSecOps movements, is that quite often the big proponents of a technology are the ones that benefit the most from it, and we tend to fall into the trap of this fear of missing out. When there's noise around a technology like the cloud, we start thinking that we need to go in, or else that's the end of us.

Speaker 3:

And the way that large enterprises are structured means there's quite often a lot of wisdom in battle-testing a technology before adopting it. So one of the things that organizations need to be very cognizant of is that you don't just adopt technology; you need a clear strategy. You shouldn't take on things like AI for the sake of taking on AI and saying you do AI. It should have a very clear value proposition for you and your use case, and you need to be able to measure that return on investment, because it's not cheap. Quite often, what we've seen is that, until very recently, with very advanced LLMs like Claude 3.7 Sonnet, the work it took to correct the AI was often even more cumbersome than actually doing it from scratch.

Speaker 3:

So this is all to say: make sure that you battle-test the tools that you want to adopt or are considering adopting, make sure that you understand the value that they bring to the organization and the products that you're building, and make sure that you have a clear plan that's informed by a clear business strategy. Don't just take on AI for the sake of taking on AI.

Speaker 2:

I was in a conversation with our CEO, Kevin Staples, the other day around the use of AI, and he was saying how important the use case is for the technology, and that just because you want to use it, or you feel like you should be, doesn't mean that you must. It's so important to make sure that not only do you have a correct use case, but that you have the volume that is required for those types of services. How do you feel about what he was saying?

Speaker 3:

I think he's absolutely correct, and the technology that rings in my head is blockchain. We saw Bitcoin disrupt the markets, and the technology that supports it was then meant to disrupt the entire tech scene, and yet for such a long time, Bitcoin, or blockchain, has still felt like a technology that's looking for use cases. It's very useful in some contexts, but there are a lot of people that put all of their eggs in one basket and went all in on blockchain technology, trying to put everything else on this publicly available general ledger, and it just doesn't fit most enterprise use cases. So I think Kevin was absolutely right about this idea of missing out and this anxiety of being left behind, given what we saw in terms of how much more difficult it is to catch up with technology. You must not fall into the trap of this hyperinflation of the ideas around a technology. I think what's even more risky in today's time, I'd say 2024, is that social media makes the noise around technology so much bigger.

Speaker 3:

Right. As soon as ChatGPT came out, people were already saying this is the last year for software developers, and, having looked at a lot of enterprise code over the years, I think we still have a lot to do in maintaining these solutions. So I think he's absolutely right: don't adopt technologies for the sake of adopting them. Make sure that they provide clear value and that your plan factors in the cost and risks that are associated with them. Otherwise, you'll spend millions and millions of rands and you won't have anything to show for it.

Speaker 1:

Not to sound like a broken record, but it's very easy to forget, and exceptionally important to remember, that AI, much like most of the things we have access to these days, is a tool. If you use it for what it's not intended for, and if you try to put it forward as a solution to any and every problem, you're stuck with these half-baked solutions, just because the tool you're using isn't the right one for the job.

Speaker 3:

I must add here, because I've been very cautionary this does not mean that you should ignore AI. It has incredible potential and you should constantly be evaluating it to see how it can add value to you and your organization.

Speaker 2:

We've spoken a lot about AI, unintentionally perhaps, although it is top of mind for everybody. What are some of the sleeper changes that are happening within the industry? Maybe new tech, new platforms, new ideas that are coming out, maybe because of AI, maybe instead of AI, that you're really excited about or interested to explore?

Speaker 1:

It's interesting to see the difference between what the media shows as the new innovation in tech and what's actually happening. I remember a year ago it was Rust. Rust was everything. Rust was fast and it was great and it was going to solve all of our problems, and, not to sound negative, but it sort of disappeared off the platforms. It's not in the headlines anymore, it's not making any of the biggest articles, and so it's difficult to see and to measure what is actually causing big innovations in the tech space, other than the things that are shown online.

Speaker 3:

For me, it's three areas of interest. The first is cloud. A lot of organizations, because of the fear of missing out, ran to the cloud, and now they're realizing that their bill, or the total cost of ownership, is way higher than if they had kept it on-prem, because of their specific use cases, or than if they had re-platformed it differently. And so some people are considering getting off the cloud and looking at hybrid options. Where we saw this giant spike in everyone rushing to the cloud, we're now seeing this equilibrium where people are actually understanding the value proposition of cloud platforms. We are seeing a much bigger growth in the platform engineering space, and we're seeing very calculated moves in terms of understanding how to actually platform things appropriately, balancing between sovereignty, performance, reliability and all of these different factors. So I think the platforming space in the industry is really reaching that equilibrium, and there's less noise about cloud. It might be because all the noise-making people are focusing on AI, but I think, from an enterprise perspective, we're reaching that good spot where we've got enough experience with the cloud: we know what workloads should be in the cloud, which ones should not, and how to find that balance. That's one thing.

Speaker 3:

The second space is regulation. One of the things that I'm passionate about is accessibility, and making sure that what we build is usable by as many people as possible. We're seeing regulation, mainly coming out of the EU right now, about making public web applications and websites accessible by law, and we've done quite a lot of work in the accessibility testing automation space, so we're seeing an uptick in interest in our expertise there. The fact that we are once again thinking about how we make technology accessible really resonates with me, and we're seeing some innovation coming out of that space as well. And then, lastly, web. Web is always my favorite because it's so chaotic, and web developers are my favorite. Ungovernable, ungoverned by anyone. Absolutely love them.

Speaker 3:

There are two things coming out of that space that are of interest to me. Firstly, they've been grappling with having JavaScript and TypeScript, and the flame wars between them, and the type annotations proposal, which proposes how JavaScript should understand or incorporate types, is gaining some traction through TC39. That's quite exciting for me. And then, personally, what I'm also seeing is this growing idea: about a year or two ago there was this massive push for server-side rendering once again, and I think everyone has realized that the web community has gone into this loop, and we've now again reached equilibrium. A lot of talk is happening about going back to the fundamentals of the web, so we're seeing a resurgence of static site generators taking their place again. It feels like the web, even though it always feels like it's crazy, is also reaching that balanced equilibrium space. Again, it might be because all the noise-creating people are focused on AI, but I think the web is advancing really, really well and it's stabilizing.
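[Editor's note: a rough sketch of the kind of erasable type syntax the TC39 Type Annotations proposal describes, written here in TypeScript since the proposal's syntax closely mirrors it. Under the proposal, a JavaScript engine would parse annotations like these but treat them as comments at runtime; that behavior is the proposal's goal, not something shipping in engines today.]

```typescript
// Type annotations on parameters and return values: TypeScript
// checks these at compile time and erases them from the emitted
// JavaScript. The TC39 proposal would let plain .js files carry
// this same syntax, ignored by the engine at runtime.
function greet(name: string): string {
  return `Hello, ${name}`;
}

console.log(greet("web")); // prints "Hello, web"
```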

Speaker 2:

Are there any blind spots currently that you think other industries, outside of software development, need to be aware of at the moment, or that they need to make sure they are accounting for in their digital strategies?

Speaker 3:

Sure. That's a big question, because the nature of blind spots is that they're difficult to spot. There's been the saying that if you're not paying for a product, chances are you're the product, and we're seeing a lot of these newer tools being put in the hands of non-technical people because they are the product. So I think the biggest blind spot, which people outside of technology aren't very aware of, is the risk of their data and their IP being lost. We currently see a number of lawsuits still pending in the US from organizations like the New York Times against OpenAI and Microsoft, and these haven't been concluded yet, and so regulation around these new technologies, such as AI, doesn't exist yet. There is no precedent, which means that it could swing any way really. A lot of the value of organizations is in their IP, and they need to be able to protect their IP. That's the biggest blind spot. So get a good technology partner to help you understand these things.

Speaker 2:

Subtle. Final question: is there anything that you think, in the next say five years, will become more human because of what's happening within the technology space?

Speaker 3:

I think everything. What being able to process information this quickly does is allow us to build much more bespoke experiences for people. Where we've often had to cater for, or develop for, the general majority of people, being able to process this much data, and being able to do it as fast as we can now, allows us to really get to know who you are. Which is terrifying for me as a technologist, but as someone who builds products, it means that we can really craft beautiful, customized experiences that work for you. So you think the color yellow sucks on a webpage? Guess what, maybe we can allow you to change that and customize it yourself.

Speaker 3:

I think what's going to become more human is our interaction with pretty much everything. Historically, you needed to engage through text. We can now do it through audio with a lot more confidence than we did three years ago. Even where we needed you to put in specific English words, we can now do it in other languages much, much better. We can do it now with gestures, because we can process so much of this data. So I think everything, but what's most important is the public services that we deliver to people through technology. Historically, in the early 2000s I think, you could opt in to use technology, but we're seeing technology becoming less and less of a choice as we digitize all of our public services. So that's the space where I think humanizing technology will become most valuable: the public space.

Speaker 2:

That's such an optimistic way to look at the future. There's so much doom and gloom about AI taking over the world, but thinking about it from that perspective, I think, is really empowering for a lot of us. Gentlemen, thank you so much for joining us today and having this conversation. Please feel free to engage with them directly on any of their social platforms if you would like to carry on this conversation.

Speaker 3:

Thank you for having us and for making me wear a shirt.

Speaker 1:

Thank you very much.

Speaker 2:

Please also take a moment to check out our tech report that is linked in this post. Have a beautiful day, thank you.

Speaker 1:

Cheers.

Speaker 2:

Bye.