What's Up with Tech?

5G To Autonomous: Making Networks Think

Evan Kirstel

Interested in being a guest? Email us at admin@evankirstel.com

What if your network could understand intent, adapt in real time, and deliver the exact experience you need—whether you’re at a packed concert, inside an ambulance, or running a factory floor? We dive into the shift from manual engineering to autonomous networks and how open standards are making AI practical, measurable, and global.

Claudia Muñiz, Global Head of Sales for Cognitive Network Solutions at Ericsson, explains why 5G’s real promise is differentiated connectivity and how the Service Management and Orchestration (SMO) layer unlocks it. We trace the journey from early self-organizing networks to an open RAN ecosystem where developers ship RAN apps once and deploy everywhere. Expect clear examples of digital twins and multi-objective reinforcement learning tuning antennas across massive configuration spaces, with live results: higher throughput, faster speeds, fewer dropped calls, and better spectral efficiency—proof that AI at the radio layer moves the needle when paired with the right data and processes.

We also separate hype from reality on large language models in telco. Lightweight, energy-aware models handle millisecond decisions at the edge, while LLMs shine in cloud workflows, documentation, coding, and natural language operations. With partnerships that blend domain expertise and elastic compute, engineers can ask plain-English questions and get guided actions, all built on a unified, secure data layer. The conversation lands on what matters most: time-to-first-value; measurable gains in churn, revenue, energy, and effort; and the near future of intent-based networking that coordinates specialized agents to meet business goals across open interfaces.

If you care about 5G, open RAN, AI-driven operations, developer ecosystems, and the real-world path to autonomous networks, this is your roadmap. Subscribe, share with a colleague who lives in dashboards, and leave a review with the one question you want us to tackle next.


Support the show

More at https://linktr.ee/EvanKirstel

SPEAKER_00:

Hey everyone, fascinating discussion today as we talk about how open mobile networks are revolutionizing the way we unleash AI with Ericsson, the real innovator and leader in this space. Claudia, how are you?

SPEAKER_01:

Thank you, Evan. I'm great.

SPEAKER_00:

Great to have you here. Of course, everyone sort of knows Ericsson, but maybe introduce yourself and your team within the company.

SPEAKER_01:

So I'm Claudia Muñiz, and I'm Global Head of Sales for Cognitive Network Solutions. In this unit, we take care of the company's optimization services, to make the best networks even better, and we also develop the AI models for optimization and healing that we commercialize as rApps.

SPEAKER_00:

Fantastic. So we have lots to talk about. But first, for those who maybe aren't in telecom or network automation, this is often viewed as a kind of back-office space. Why should they care? What's the everyday impact for both users and enterprises?

SPEAKER_01:

Well, I think the first thing to ask is: does anybody here have a phone? Have you heard of that? And do you remember the day that your phone logo went from 4G to 5G? Did you feel anything different? Maybe not much, right? The reason for that is that the true power of 5G is to unlock differentiated connectivity, and these services are being launched right now. So think about, for instance, you travel to Singapore for Formula One or for one of the Taylor Swift concerts, or you go to the US for FIFA. Wouldn't you want to make sure that amid the crowds your phone is going to work? So you can watch replays, you can find restaurants, or you can find family members who got lost along the way. That's for us, the subscribers. But now think about connecting the police, or connecting an ambulance that is carrying somebody sick to the hospital and is sending telemetry about their health condition. All of this means a lot of different requirements, different slices that need to serve different users with a specific speed and latency. 5G standalone enables this, and these are new services that can be offered to subscribers, but there is a lot of complexity that comes along with that. That's where autonomous networks come in. Differentiated connectivity is the set of services the subscribers see, and the autonomous network is all the automation underpinning it, managing that complexity without the operational cost skyrocketing. It's the same type of revolution that we've seen with Uber or Airbnb: new services launched to users, powered by AI insight that nobody sees but that is there, and is very critical and very cool.
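To make the idea of slices with different speed and latency requirements concrete, here is a minimal sketch in Python. The slice names, targets, and measurements are invented for illustration; they are not Ericsson products or real SLAs.

```python
from dataclasses import dataclass

@dataclass
class SliceTarget:
    """Illustrative per-slice service targets (values are made up)."""
    name: str
    min_downlink_mbps: float   # minimum sustained downlink speed
    max_latency_ms: float      # one-way latency budget

# Hypothetical slices based on the examples in the conversation
SLICES = [
    SliceTarget("stadium-broadband", min_downlink_mbps=25.0, max_latency_ms=50.0),
    SliceTarget("ambulance-telemetry", min_downlink_mbps=5.0, max_latency_ms=20.0),
    SliceTarget("public-safety", min_downlink_mbps=10.0, max_latency_ms=30.0),
]

def violations(measured: dict[str, tuple[float, float]]) -> list[str]:
    """Return the slices whose measured (mbps, latency_ms) miss their targets."""
    out = []
    for s in SLICES:
        mbps, latency = measured[s.name]
        if mbps < s.min_downlink_mbps or latency > s.max_latency_ms:
            out.append(s.name)
    return out

print(violations({
    "stadium-broadband": (18.0, 42.0),   # too slow in the crowd
    "ambulance-telemetry": (6.5, 15.0),  # OK
    "public-safety": (12.0, 28.0),       # OK
}))
```

In practice the interesting part is not the check itself but keeping every slice within its targets simultaneously, which is the complexity the autonomous network has to absorb.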

SPEAKER_00:

Yeah, it's an amazing time, this idea of autonomous networks coming to the forefront. Are we talking about a network driven purely by AI, without humans?

SPEAKER_01:

You mean like a dark factory? I think maybe one day, but not today. In telco we have a standard from TM Forum called the Autonomous Networks standard, and I really like it as a framework to measure automation maturity. First of all, we take all the processes we have in a telecom network, and there are many: provisioning the network, optimizing the network, managing faults, and so on. We break it down like that, and then for each of these processes we establish a maturity score on five levels. Level five autonomy is exactly what you were describing: no human, lights out, a "dark NOC" as we call it. Level three is the point where you are getting insights and recommendations automated by the system, but the ultimate decision on what to do is still taken by a human. So, Evan, I have a question for you. Where do you think the telecom network is today, on average, in terms of maturity, where one is everything manual and five is everything fully autonomous?

SPEAKER_00:

Maybe four, I'm guessing.

SPEAKER_01:

Actually, we are around two, two and a half today. Wow. So yeah, because I mean it's very difficult to reach that level of autonomy across all the processes, right? There are some processes that are very automated, but there are others that are still largely manual. So that's why I was saying before, maybe one day, but not today and not tomorrow.
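As a rough sketch of the maturity framework described above (a level per process, then an average across processes), the snippet below follows the conversation's one-to-five framing. The descriptions for levels 2 and 4 and the per-process scores are assumptions added for illustration, not quotes from the TM Forum standard.

```python
from statistics import mean

# Levels as framed in the conversation: 1 = fully manual ... 5 = fully autonomous.
# Levels 2 and 4 are interpolated here for illustration only.
LEVEL_DESCRIPTIONS = {
    1: "everything manual",
    2: "partially automated, tool-assisted",
    3: "system generates insights and recommendations; a human decides",
    4: "system decides in most situations; humans handle exceptions",
    5: "fully autonomous ('dark NOC'), no human in the loop",
}

# Hypothetical per-process maturity scores for one operator
process_maturity = {
    "provisioning": 3,
    "optimization": 3,
    "fault_management": 2,
    "field_operations": 1,
}

average = mean(process_maturity.values())
print(f"average maturity: {average:.1f}")  # lands around 2-2.5, as mentioned above
for proc, level in process_maturity.items():
    print(f"{proc}: L{level} - {LEVEL_DESCRIPTIONS[level]}")
```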

SPEAKER_00:

Fascinating. And we've had automation in networks and networking for years, but what's changed this time around? What's new? Why is this moving the needle so much more?

SPEAKER_01:

Yeah, so let me do a little bit of history, because I think understanding the trajectory always helps us understand better why and when things happen. If we take it back to 2009, we had 3GPP, which is the telco standards body for the radio technologies; when you see 2G, 3G, 4G, they are standardized by 3GPP. In Release 8, the first version of the standard where we had 4G, we also had SON. SON stands for self-organizing networks. It was the first time we standardized centralized automation use cases to automate the configuration, the optimization (taking a network that is already operating and making it better) and the healing (if there is a problem, fixing it automatically). Then the industry started adopting it. What I think is really interesting to observe is that even though it was launched almost at the same time as the iPhone, it didn't come with the concept of a global app store where we could share automation use cases from everybody. So all the CSPs started adopting some of the use cases, and many of these SON platforms came with an SDK, so they developed a lot on their own. But it became like little islands, where everybody had a little thing going on, and there was no global scale of innovation. Then around 2019 the first 5G networks started to be introduced, and you could see there was a wave in the market about AI: in order to get a CTO meeting, you had to bring some kind of software or something that had AI. That was also around the time we started to have some solid solutions, and many CSPs launched AI centers of excellence, explorations and so on. They had mixed success, with piecemeal AI use cases that fixed some problems here and there but failed to deliver value at scale. Meanwhile 5G kept growing in traffic and in footprint, and now you had all these layers of complexity: 2G, 3G, 4G, 5G. The industry was reaching a breaking point, because the typical operator landscape was a lot of tools collecting the same data for different purposes: one to look at coverage, another to optimize interference, and so on. There was a lot of duplication and silo work, and a very ineffective environment for the engineers. About that time, the Open RAN standard came in, and it was very interesting. It had a management layer, the SMO, which stands for Service Management and Orchestration, and it had this idea of a unified data layer: we collect all the data once, the use cases are apps, and there is a standard interface between the platform and those apps, meaning we can have global innovation and a global ecosystem at that kind of scale, which is really, really interesting. If I reflect a little on the number of RFIs and RFPs we have received over the last two years, the announcements we have to date, and how much the companies in the ecosystem creating rApps are growing, I would say that as an industry we are going there. The SMO is becoming the standard for network automation, both in the legacy networks that we have and in the Open RAN networks. Some customers actually like it so much that they decide to develop one of their own, and then they struggle, because ingesting telco data is not really trivial, right?
But the reality is that we are going there, and this is really the starting point to bring AI into the network. Now, what needs to happen, as with every investment in a new product portfolio, is that we need to go and identify the main pain point, the icebreaker use case for this particular operator that we can fix, and show value together in the field so that they can then commit further capital. Because, as you know, the telco industry right now is not seeing great growth, so we need to be very mindful and very crisp about the value we provide with each and every investment.

SPEAKER_00:

Fantastic. And it's opening up the network to businesses, enterprises, developers, and beyond. What are some of the tangible benefits they'll be seeing? And do you have any examples?

SPEAKER_01:

So I think for anyone who wants to enter this environment, it's very interesting in the sense that, because it is powered by a global open standard, you can develop an app once and sell it everywhere. That breaks the traditional model where you had maybe smaller companies working with one operator or two; now they can develop once and sell everywhere. The other thing that I think is very, very cool is the developer portal. At Ericsson, we have our own version of the non-real-time RIC where the rApps run, which is the Ericsson Intelligent Controller, and it comes with a developer portal. Basically, it offers anything and everything a developer will need: rApp templates, trainings, examples, sample data, a forum, all of it connected by a GenAI coding assistant, so you can ask questions directly and get answers based on that. This is really helping rApp development to be about 50% faster. It's free, so any of the partners can access it and start playing with it. Actually, as of last week, we had 84 members of our ecosystem published in the rApp directory that you can find online; that number grows week by week, so it's probably already out of date. And the last thing I want to make clear is that our own team, which develops AI rApps, uses exactly the same assets for development. There is no hidden SDK that only Ericsson can access, because we are also selling our rApps in an open ecosystem, so we need to make sure we do it in a standard way. I think that's something very interesting, and if you are considering founding a company, it could be a good place to get started.

SPEAKER_00:

Super exciting. It also lets CSPs do things that they couldn't do before. Give us a peek behind the curtain as to how that works.

SPEAKER_01:

Yeah, so just to level set for those maybe not so familiar with telecom: if you look at a telecom network from the sky, it looks like a honeycomb. You have all these hexagons, each hexagon is one cell, and each cell is radiated by one antenna. That's why it's a mobile network: you're moving from one hexagon to the other, so from one antenna to the other, and sometimes your call drops if that movement fails. With 5G, these antennas get many, many more transmitters each. With that, we can do much more accurate shaping of how the signals are transmitted in the air, both on the horizontal and on the vertical angles. We can control much more how we are transmitting, and therefore deliver overall better performance in the cluster. If you take a cluster of 500 sites in total, then to select the optimal cell shape for each of these antennas, considering the multiple frequency layers we might have in one advanced network, for example in the US, you would have to be able to compute the equivalent of 96 to the power of 6,000 possibilities. That is unmanageable for the human brain and unmanageable for rule-based algorithms. That's why in our AI cell shaping rApp we have a combination of digital twin technology and reinforcement learning. With the digital twin, we model very accurately the traffic pattern, both on the horizontal and on the vertical axis. Then with multi-objective reinforcement learning, we are able to set different priorities in terms of the intent of the optimization: I want to achieve this minimum level of coverage, while balancing the traffic between layers, while making sure that the speed peaks in this layer, and so on. We can provide different priorities, and the reinforcement learning algorithm will converge towards the optimum setting for that cluster. Each site will have a different setting, and overall it will perform better. When we used this algorithm in the US, we managed to increase the traffic volume carried by that area by 10%, increase the speed by 10%, lower the drop call rate for voice over 5G by 8%, and overall improve the spectral efficiency, which is a measure of how good that network is: how much data traffic we get per hertz of spectrum. So I think this is something that was definitely not possible before having this unified data layer to materialize AI at scale.
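The cell shaping rApp itself is not public code, so the following is only a toy sketch of the general idea of multi-objective tuning over a huge configuration space: a made-up scoring function stands in for the digital twin, and a simple random local search stands in for multi-objective reinforcement learning. Every number and option list here is invented.

```python
import random

N_SITES = 20
TILT_OPTIONS = list(range(0, 13))      # electrical tilt, degrees (illustrative)
WIDTH_OPTIONS = [45, 65, 90, 120]      # horizontal beamwidth, degrees (illustrative)

# "Intent": weights expressing operator priorities across objectives
WEIGHTS = {"coverage": 0.4, "throughput": 0.4, "interference": -0.2}

def score(config):
    """Fake multi-objective score; a real system would query a calibrated digital twin."""
    coverage = sum(1.0 / (1 + abs(t - 6)) for t, _ in config)           # prefers moderate tilt
    throughput = sum(w / 120 for _, w in config)                         # wider beams carry more
    interference = sum((w / 120) * (1.0 / (1 + t)) for t, w in config)   # wide + low tilt interferes
    return (WEIGHTS["coverage"] * coverage
            + WEIGHTS["throughput"] * throughput
            + WEIGHTS["interference"] * interference)

def local_search(steps=5000, seed=0):
    """Greedy random search over per-site (tilt, beamwidth) settings for the cluster."""
    rng = random.Random(seed)
    config = [(rng.choice(TILT_OPTIONS), rng.choice(WIDTH_OPTIONS)) for _ in range(N_SITES)]
    best = score(config)
    for _ in range(steps):
        i = rng.randrange(N_SITES)
        candidate = list(config)
        candidate[i] = (rng.choice(TILT_OPTIONS), rng.choice(WIDTH_OPTIONS))
        s = score(candidate)
        if s > best:                      # keep per-site changes that help the whole cluster
            config, best = candidate, s
    return config, best

if __name__ == "__main__":
    cfg, s = local_search()
    print(f"best cluster score: {s:.3f}")
    print("first 3 site settings (tilt deg, beamwidth deg):", cfg[:3])
```

Even in this toy, the search space grows exponentially with the number of sites, which is why exhaustive or purely rule-based approaches stop scaling long before real network sizes.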

SPEAKER_00:

Amazing. Wow, the data is really fascinating. Can you share any other real-world examples where AI and RAN automation is making a difference, maybe in a surprising way?

SPEAKER_01:

Yeah. So, in the same way that we have these advanced AAS antennas that are very futuristic, we also have the traditional passive antennas that you can only tilt. You can move them, so to say, up and down, so you transmit farther or you transmit closer in. This is a very old type of automation that has been solved with SON or with propagation prediction, and we also have a model to do that. When we ran the trial in the US, we improved the spectral efficiency and we improved the speed, so we outperformed the traditional method. But what was really interesting came when we sat down with the operator and they opened up how they work today and how much time they spend on each task associated with this tilt tuning. We came to the conclusion together that we can save 75% of the time they are spending today, because a lot of tasks are simply not needed anymore. And I think that's something very important to remember when we are talking about AI. There are the algorithms, which we always talk about, and they're very cool and we love them, but to fully unlock the value of AI we also need to transform the processes, we need to stop doing things, and we need to get the people to trust the AI. That's why we infuse all of our AI apps with explainability, so that the engineers can look into the recommendations from the algorithm and say, okay, this recommendation is provided because of this, this, and that. In a way, we are opening the AI black box. Then they go and implement the recommendations in the network, they see that it actually improved, and little by little, after a while, they can trust it and we can fully let go. So there is as much work to do on transforming the process and stopping doing things, so we can create efficiencies and start working on the more interesting areas that are coming up, as there is on researching and implementing these algorithms.
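As an illustration of what attaching explainability to an automated recommendation can look like, here is a hypothetical sketch. The record fields, evidence strings, and numbers are all invented; this is not Ericsson's schema.

```python
from dataclasses import dataclass, field

@dataclass
class TiltRecommendation:
    """Hypothetical explainable recommendation record (schema invented for illustration)."""
    cell_id: str
    current_tilt_deg: float
    proposed_tilt_deg: float
    expected_gain: dict                           # predicted KPI deltas
    reasons: list = field(default_factory=list)   # human-readable evidence behind the proposal

rec = TiltRecommendation(
    cell_id="cell-1234",
    current_tilt_deg=2.0,
    proposed_tilt_deg=5.0,
    expected_gain={"spectral_efficiency_pct": 3.0, "neighbor_interference_pct": -7.0},
    reasons=[
        "cell overshoots: 18% of connections are served beyond the planned cell edge",
        "neighbor cell-5678 reports elevated uplink interference in the overlap zone",
        "simulation predicts no coverage hole after down-tilting by 3 degrees",
    ],
)

# The engineer reviews the evidence before approving (or auto-applying) the change.
print(f"{rec.cell_id}: tilt {rec.current_tilt_deg} -> {rec.proposed_tilt_deg} degrees")
for r in rec.reasons:
    print(" -", r)
```

The point is the workflow: a recommendation that arrives with its evidence can be audited, accepted, and eventually trusted enough to automate fully.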

SPEAKER_00:

Fantastic. And speaking of interesting, AI is just a fascinating arena these days, so exciting, and yet there are so many platforms, so many providers, different foundation models, different technologies. From a telecom perspective, how do you make sense of all these choices and different approaches?

SPEAKER_01:

So I think those of us who have been in telco for a while have probably witnessed hyperscalers launching telco data products and then discontinuing them the year after, right? And some might wonder why that is. The reality is that managing telco data at scale in a secure way is non-trivial, especially across the legacy networks that we have. The northbound interfaces that are generating data by the second are largely proprietary, the number of agreements needed to share that data is somewhat uneven, and the lifecycle management becomes very complicated, because there are new features, new technologies, new counters, and so on coming every month. We expect things to get better once the Open RAN standards are adopted at scale, but it will take a while. The other element that I think some people might miss, looking at the almighty hyperscaler that knows it all, is that when you go down the network autonomy layers, you have things like billing or complaints that are maybe more generic across industries, you might argue. But when it comes to resource operations, and especially the radio interface, we're talking about electromagnetic waves, and the propagation is ruled by physics. So, in order to identify which information elements to select to solve a particular problem in a scalable fashion, you need to really know the domain, you really need to know radio. I think that's something very unique that we take great pride in, in the unit where I work, Ericsson Cognitive Network Solutions: at heart we are really optimization engineers. We know what it takes to solve certain problems, to solve certain trade-offs in the network, and we have infused that knowledge into the AI models we have developed over the years. We've been at this for a long time; you can still find the press releases. We started with SoftBank in Japan back in 2018, and then we had one of our first large-scale public contracts with NTT Docomo, also in Japan, in 2020. So some of the algorithms we were discussing today are the product of years of work across tens of networks, where we have worked in cooperation with customers, used the data to retrain, and so on. It's not really trivial to do something truly transformative, something that moves the needle, at least in telecommunications at the resource operations layer.

SPEAKER_00:

Amazing progress. Wow. So let's talk large language models. We're all using them every day, very practical, very exciting: ChatGPT, Gemini, on and on. But what is the role of an LLM in the telco domain, and why is telecom different?

SPEAKER_01:

Yeah, this is such a hot topic, and there are so many misconceptions, that actually last week, believe it or not, I made a slide saying AI does not equal LLM, and AI compute does not equal GPU. Because, you see, last year we had an anecdote: we were discussing with some executives who had visited Silicon Valley, and they were thinking, okay, we will use this kind of productivity tool powered by an LLM, I'm sure all of you know what I'm talking about, I will not name names, and it's just going to fix all of our problems. We're going to do a pilot, it will fix all our problems, we will ask it where we need to put the sites to get maximum revenue, it will tell us, and life will be very easy. And I said, okay, hopefully one day, but definitely not today. The LLMs, as you were saying, Evan, can be our psychologist or they can be a very powerful tool. Even though we develop our own ML models in a proprietary way and do a lot of AI research, it is true that in these rApps, if you look at it like a car, we have the engine, which is the ML model, and we have the chassis: the UI, the data connectors, and so on. Those are things that we are automating. One of our leaders is very strong on that; she always says, if you're going to do something twice, you'd better automate it. That's why we are already using GenAI coding assistants, with very good results, in those elements that are more replicable. Now, when it comes to LLMs in telco, I think we need to establish the trade-off between model size, speed of reaction, and also energy consumption. Like we established at the beginning, the telecom network today is a very distributed system. We had all those hexagons, remember the honeycomb, and we have one site in each of them. Each of these sites needs to make decisions that are fully autonomous at a millisecond level. So we have features like AI-native link adaptation, where we have a proprietary model that is a neural network of about 15,000 parameters; it consumes about 100 watts and is still able to improve the spectral efficiency by 20%. Now compare that with a ChatGPT query that takes seconds and spends kilowatts: it's not really fit for purpose. But at the same time, we are not closing our eyes, because all of this is moving, and it's moving so fast, so things might change in the future. We are actively engaged in AI-in-RAN and 6G explorations, and we do see potential once the energy consumption and the performance come on par with today's network. Now, if we go from the distributed, millisecond level that we were discussing to more centralized automation that runs at seconds and above, the picture is a little different. When it comes to telco AI in the public cloud, this is an area we are actively exploring. We have an announced partnership with AWS that I'm personally very excited about, because it's about bringing the best of both worlds together and really exploring the next frontier. We bring our domain expertise and our AI models; they bring the scalability and the flexibility, not only in terms of compute capacity but also in terms of flexible billing. This unlocks very, very interesting possibilities for us, and also the possibility to change the way the engineer interacts with the technology.
So today, even though we have advanced AI models and so on, there is a dashboard or a UI that the engineer interacts with. When we bring it into AWS, we can infuse it with their LLMs, so that the person, instead of interacting with the UI and clicking here and there, can use natural language. They can go and ask, okay, what are the worst issues in my network today, and what do you recommend I do? So it's really about exploring very interesting possibilities, and you will be hearing more, hopefully soon, about what we are cooking. But after discussing all this, I just want to bring it back to the SMO. Even though AI is advancing in many shapes and forms, what we see is that the SMO architecture is very, very good as a unified data layer for telco, because it provides a scalable and secure environment to ingest data once and actuate towards the network, and it does so in a way that is enabled by open standards. So we can have this global scale of innovation, this app store for telco networks that we were discussing at the beginning.
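To put the "about 15,000 parameters" figure in perspective, here is a small NumPy sketch of a network of roughly that size mapping channel measurements to a link-adaptation decision. The layer sizes, inputs, and outputs are assumptions for illustration only; this is not Ericsson's AI-native link adaptation feature.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny MLP: ~15,000 parameters with these (invented) layer sizes.
IN, H1, H2, OUT = 32, 110, 85, 28        # 28 hypothetical modulation-and-coding indices
W1, b1 = rng.standard_normal((IN, H1)) * 0.1, np.zeros(H1)
W2, b2 = rng.standard_normal((H1, H2)) * 0.1, np.zeros(H2)
W3, b3 = rng.standard_normal((H2, OUT)) * 0.1, np.zeros(OUT)

n_params = sum(w.size for w in (W1, b1, W2, b2, W3, b3))
print("parameters:", n_params)            # about 15,000

def choose_mcs(channel_features: np.ndarray) -> int:
    """One forward pass; cheap enough to run every scheduling interval (~1 ms)."""
    h = np.maximum(W1.T @ channel_features + b1, 0.0)   # ReLU
    h = np.maximum(W2.T @ h + b2, 0.0)
    logits = W3.T @ h + b3
    return int(np.argmax(logits))

print("chosen MCS index:", choose_mcs(rng.standard_normal(IN)))
```

A forward pass through a model this small is a few tens of thousands of multiply-adds, which is why it fits a millisecond, watts-level budget at the cell site, whereas a general-purpose LLM call does not.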

SPEAKER_00:

Wow, so exciting. Looking down the road, what excites you and the team the most about autonomous networks, and what should we keep an eye on in the industry as a whole to make this actually happen?

SPEAKER_01:

Yeah, so those of you who know me well know that I'm a big numbers girl and I'm all about facts. To me, the biggest excitement is to really go in there, deploy our SMO, the intelligent controller, deploy our AI rApps, materialize value in the field, and measure it. So work with the customers, transform their processes, and really go and measure together: how much churn did we prevent by having better performance and delivering the experience needed at the time it was needed? How much more revenue did we create by offering these new services? How much cost have we saved in terms of energy savings, in terms of effort savings? How much better return are they getting on their investments, in these active antennas, in their spectrum, because they can carry more with the same network? What can they do today that they couldn't do before? I think it's very important that we pass this time-to-first-value milestone, where we prove in the field that this has equal or more value than what we expected initially. And then together we can go and explore the next frontier, which is intent. At TM Forum, which we discussed before, there are yearly awards, the Moonshot Catalysts, and we won the best award for end-to-end service realization using intent-based networks. That was a really cool cooperation with leading customers, but also external companies, where we materialized the full stack: breaking a business intent down into a service intent, breaking that down into resource actions, with different providers supplying different elements and everything communicating across open interfaces, as proof that this vision can be translated into reality. And it will only be brought to life by different players contributing to different layers of this horizontally integrated stack. So, yes, I know it sounds very futuristic, but actually we are now verifying our first version of the intent management function. This rApp acts as a supervisor between the different rApps that are agents. Imagine you want to materialize a certain intent: you want to save maximum energy in this network without going below a certain coverage level, while always making sure that if there is a slice serving the police, it always gets a certain speed and a certain latency, and so on. We are verifying that, and I hope we can come back soon with more results and more products to the mic.
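Finally, a toy sketch of the supervisor idea described above: an intent with one optimization goal plus hard constraints, and a coordinator that picks among proposals from specialized agents. The agent names, intent format, and arbitration rule are invented; this is not the intent management function being verified at Ericsson.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Intent:
    maximize: str        # e.g. "energy_savings_pct"
    constraints: dict    # hard "at least" constraints that must never be violated

@dataclass
class Proposal:
    agent: str
    action: str
    predicted_kpis: dict # what the agent's model predicts after the action

def satisfies(intent: Intent, kpis: dict) -> bool:
    """A proposal is admissible only if every hard constraint still holds."""
    return all(kpis.get(k, float("-inf")) >= v for k, v in intent.constraints.items())

def supervise(intent: Intent, proposals: list) -> Optional[Proposal]:
    """Pick the admissible proposal that best serves the optimization goal."""
    admissible = [p for p in proposals if satisfies(intent, p.predicted_kpis)]
    if not admissible:
        return None
    return max(admissible, key=lambda p: p.predicted_kpis.get(intent.maximize, 0.0))

intent = Intent(
    maximize="energy_savings_pct",
    constraints={"coverage_pct": 98.0, "police_slice_mbps": 10.0},
)

proposals = [
    Proposal("energy-agent", "sleep 40% of capacity cells at night",
             {"energy_savings_pct": 18.0, "coverage_pct": 97.2, "police_slice_mbps": 12.0}),
    Proposal("energy-agent", "sleep 25% of capacity cells at night",
             {"energy_savings_pct": 11.0, "coverage_pct": 98.4, "police_slice_mbps": 12.0}),
    Proposal("coverage-agent", "no change",
             {"energy_savings_pct": 0.0, "coverage_pct": 99.0, "police_slice_mbps": 14.0}),
]

chosen = supervise(intent, proposals)
print(chosen.agent, "->", chosen.action)   # picks the 25% option: saves energy without breaking coverage
```

The design point is separation of concerns: specialized agents propose within their own domain, and the intent layer enforces the business-level goal and its non-negotiable constraints.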

SPEAKER_00:

Well, I can't wait to see it. I had no idea of the speed and pace of innovation happening on the AI front at Ericsson these days. Congratulations on all the progress, and more to come, Claudia. Thanks so much. Thank you.

SPEAKER_01:

Yeah, let's make it happen.

SPEAKER_00:

All right, let's make it happen. Thanks everyone for listening, watching, and sharing this episode. And we'll talk soon. Take care, everyone.

SPEAKER_01:

Thank you. Bye bye.