EDGE AI POD
Discover the cutting-edge world of energy-efficient machine learning, edge AI, hardware accelerators, software algorithms, and real-world use cases with this podcast feed from the world's largest EDGE AI community, covering all things edge AI.
These are shows like EDGE AI Talks and EDGE AI Blueprints, as well as EDGE AI FOUNDATION event talks on a range of research, product, and business topics.
Join us to stay informed and inspired!
Survey Data Shows How AI Will Reshape Cars And Why It Belongs On The Edge
We share new data showing why drivers see generative AI as a defining force in mobility and how edge inference makes cars faster, safer, and more personal. We map the use cases, hardware shifts, and the move to software-first procurement with clear guidance for builders.
• survey highlights on generative AI as a mobility megatrend
• definitions and examples of circular economy in vehicles
• priority edge use cases in ADAS, safety, and infotainment
• hidden value in predictive maintenance and intrusion detection
• why inference runs on the edge for latency and reliability
• constraints around cost, memory, and over-the-air updates
• NPU rise over GPU and evolving CPU roles
• software-first buying and model portability trade-offs
• smarter sensors, radar AI, and neuromorphic paths
• hybrid architectures for sensor fusion and efficiency
Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org
Setting The Stage And Survey Scope
SPEAKER_01: Martin, welcome. Good morning or good afternoon, depending on where you are.
SPEAKER_00: Yes, or good night if you're in India or Japan. I think there's someone from Malaysia on the call, so we'll see. But yeah, great to talk to you again. I know we spoke a week or two ago. I think I was in Korea at the time, speaking of time zones.
SPEAKER_01: Yeah, I think you were in Korea, because the time didn't match the US time zone for me.
Gen AI Emerges As Mobility Megatrend
SPEAKER_00: Yeah, it's always challenging, but now I'm back and over my jet lag, so that's good. Good to see you. So we're going to use the automotive market as a kind of proxy, and what I thought was really interesting when we talked, and we'll bring up some data, is that you took a voice-of-the-customer perspective on generative AI and generative edge AI scenarios for automotive. You said you had sampled around 25,000 end users, was that it?
SPEAKER_01: Yeah, we run a global consumer survey in Europe, the US, Japan, and China every year, with about 20,000 participants, and we ask about everything around mobility. We have also asked about Gen AI, and if you pull up the slide you can see that 46% of the people we asked see it as a key megatrend in mobility. The text in small print is also important: with the potential to create valuable new use cases and change the mobility sector. So nearly half of the people we asked, already in 2024, see extreme potential in Gen AI, similar to the circular economy and even more important than carbon-neutral mobility.
What Circular Economy Means For Cars
SPEAKER_00: Interesting, yeah, I see that. What is circular economy, by the way? Can you explain what that label means?
SPEAKER_01: Yeah, it's really about reusing, extending the life of the vehicle, but mainly reusing the parts of the vehicle in a second life. The best example is a battery, which you then use not just in an electric vehicle, but afterward as stationary energy storage.
SPEAKER_00: Oh, I see. Okay, right, after the first life.
Top Edge AI Use Cases In Vehicles
SPEAKER_00: So this is about 20,000 end users across many different regions, right? It wasn't just Europe; you had the US in there as well.
SPEAKER_01: Yeah, China, Japan, Europe, the US.
SPEAKER_00: Wow, 20,000 is a pretty good survey. So it's interesting: you positioned a bunch of different things toward them, and they picked Gen AI as the key thing for creating new use cases. I've never seen that before. Can you give us a little more insight into what that means? Gen AI is a term that has made it into the public sphere, and people have been using chatbots and things like that. But when they think about it for cars, I'm curious: what are we doing?
SPEAKER_01: Yeah. As mentioned, I work at the intersection of automotive, AI, and software, and we did a small survey, separate from the consumer survey, where we asked people in the industry: with edge AI, so inference running in the car, what does it change in the vehicle and in mobility? 50% of the participants said the main use case is in advanced driver assistance systems and autonomous driving, where you can now shift from rule-based algorithms to end-to-end models, which include perception, motion planning, and partially also control, really taking ADAS to a new level of customer experience. The second use case in that area is in-cabin safety. 29% mentioned infotainment and comfort features; I think that's the most familiar to all of us, that you can have voice assistants and personalization in cars. But there are also a couple of other use cases, maybe less known but very important, especially from the OEM perspective: network health analysis and intrusion detection to keep the cars safe from intrusion, but also predictive maintenance, detecting potential errors much earlier and saving cost in the aftermarket and on warranty and repairs. So there are many use cases, not just the obvious ones like adding voice control, but also ADAS, safety, security, and error prediction, which is why everyone in the automotive industry is really excited about this.
SPEAKER_00: Yeah, for sure. I can see the top two are more human-machine-interface scenarios, and the items below are things a lot of folks working in edge AI are familiar with: anomaly detection, mechanical health analysis, ambient noise cancellation. We're seeing that happen now with AirPods and hearables using audio AI. In fact, we've had talks on previous live streams about motor optimization in drones, how you extend battery life by using edge AI and tinyML in those motors. The items below are maybe not as visible to the end user, but they'd also be pretty valuable for getting more out of the equipment in general. So it's good to see that people have that top of mind. And the top items, like you said, the voice assistants and the personal recommendations, seem pretty straightforward; it will be interesting to see how those actually get deployed. What's the other insight you got out of this?
Real-World Deployments On The Road
SPEAKER_01: Yeah, and if you say deployed, the important thing is that end-to-end models for ADAS are already in the field. If you drive a Tesla, after version 12 you're basically using end-to-end models. And if you look at different OEMs, they already have voice assistants using Gen AI. So a lot of this is already on the road, not just future stuff.
SPEAKER_00: Mm-hmm. Yeah, for sure. Every time I'm in San Francisco I take the Waymos, so I make sure I get my self-driving-car fix. So that's good to know: there's demand out there, and like you said, there have been some deployments. I guess the survey's guidance is really around some of these top scenarios and making sure automotive makers nail those. It's also a proxy for customers in general getting more educated about what generative AI, including generative edge AI, can do for them in terms of their equipment and experiences. So that's good to know as well. That's cool. What else did you learn from the survey?
Why Inference Belongs On The Edge
SPEAKER_01: I think what we also learned, and we're talking about automotive here, but some of this is also true for other embedded devices, where we're not talking about phones or computers: what is the main reason to deploy inference on the edge? Why deploy large language models on the edge and not in the cloud? Based on the survey, there are four main reasons. The first is offline availability: your ADAS still needs to run if there's a loss of connectivity, say when you go through a tunnel. A lot of the use cases I mentioned before need to work while you're not connected to the internet, so that's one of the main reasons to deploy on the edge. The second is latency: for some use cases, like noise cancellation or ADAS, you cannot wait to transfer the data to the cloud and wait for the feedback; that's just not safe. Then there's improved data security and privacy, a somewhat less important reason, since the data is only processed locally and not uploaded to a cloud. And for some use cases, for example if you would need to transfer full images, it's also a matter of decreased network traffic by running the inference on the edge.
SPEAKER_00: Yeah. And this maps to the general value proposition of edge AI. There's an acronym I use called BLERP: bandwidth, latency, economics, reliability, and privacy. It sounds like this hits all of them, although in this case it's stack-ranked, with reliability, the offline availability, first. That tracks, which makes sense. So what else have you got on this?
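The four survey reasons line up with the BLERP mnemonic from the conversation; a small sketch makes the mapping explicit (the reason strings are illustrative labels, not the survey's exact wording):

```python
# BLERP mnemonic mapped to the survey's four edge-inference reasons;
# comments paraphrase the discussion above.
BLERP = {
    "B": "bandwidth",    # avoid shipping full images or raw sensor data to the cloud
    "L": "latency",      # noise cancellation and ADAS cannot wait on a cloud round trip
    "E": "economics",    # not among this survey's top four reasons
    "R": "reliability",  # offline availability, e.g. driving through a tunnel
    "P": "privacy",      # data processed locally, never uploaded
}

# The survey's reasons, stack-ranked as discussed (reliability first):
survey_reasons = ["reliability", "latency", "privacy", "bandwidth"]
assert all(reason in BLERP.values() for reason in survey_reasons)
```

Note that economics is the one BLERP letter the survey's top four did not surface directly, though the cost discussion later in the conversation covers it.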
Hardware Limits And Cost Pressures
SPEAKER_01: Yeah, maybe so far I've only covered the good things, but there are also some challenges. Some of them are specific to vehicles, others less so. In the same survey we asked about the main challenges, and there are two hardware and two software challenges. On the hardware side, the first is SoC resource constraints. On the edge we don't have unlimited computational power, and there's especially the question of where to store the model; on the execution side, data throughput and high-bandwidth memory were mentioned several times as current constraints. The constraint is twofold: the availability of these high-performance chips in automotive-grade quality, but also the cost, because if we're talking about an ADAS, it should not significantly increase the purchase price of the car. Cost always matters for these end-customer vehicles. The second hardware challenge is energy consumption, especially the high energy consumption of the SoC during inference. Then there are two things on the software side: the updatability of models is difficult, given their large size and the way over-the-air updates are done. And the last one, it's only 4%, but I still want to mention it, is that good frameworks are still missing for the ops part of machine learning, for example for updating models in an automotive context.
SPEAKER_00: Yeah, I can see that. Again, this tracks with a lot of general edge AI challenges, although I would say automakers are notorious for not understanding how to update their software; I think Tesla and some of the newer folks have figured that out. The good news is that from year to year we've seen a tremendous increase in the ops-per-watt performance of some of the SoCs and platforms, especially with the outboard accelerators coming to market being used alongside CPUs. So that's a good thing. The last one, software frameworks for deployment, just makes things difficult for device builders and solution providers, especially if you want to move models from one platform to another. There are solutions for it; it's just a little tricky sometimes. But yeah, this tracks pretty well.
SPEAKER_01: And when there are solutions, it sometimes just takes a little longer until they end up in the automotive field, given long development timelines, especially for safety-critical devices and functions.
SPEAKER_00: Yeah, I can imagine the automotive market is a bit at the tail end in terms of deployments, because obviously things need to be certified safe and pretty bulletproof before they go out to vehicles. So that makes sense. I saw in your footnote 28 GB for a 7-billion-parameter model, assuming four bytes per parameter. That's an interesting data point. I had seen data that was more around a gigabyte per billion parameters, but that would be at one byte per parameter. It's an interesting point for the audience to think about: typically, generative edge AI deployments are somewhat memory-bound rather than TOPS-bound, so the cost of memory in these edge devices is one of the primary issues right now. As the models get more efficient, smaller, and more quantized, that will bring the memory costs down a little, which will help with point one.
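The memory arithmetic here is easy to check. A minimal sketch, counting weights only (activation memory and KV cache ignored) and taking 1 GB as 10^9 bytes:

```python
# Raw weight storage for a model at different numeric precisions.
# Billions of parameters times bytes per parameter gives GB directly,
# since the factors of 1e9 cancel.
BYTES_PER_PARAM = {"fp32": 4.0, "fp16": 2.0, "int8": 1.0, "int4": 0.5}

def model_size_gb(params_billion: float, precision: str) -> float:
    """Weight footprint in GB (1 GB = 1e9 bytes), weights only."""
    return params_billion * BYTES_PER_PARAM[precision]

print(model_size_gb(7, "fp32"))  # 28.0 -> the 28 GB footnote figure
print(model_size_gb(7, "fp16"))  # 14.0
print(model_size_gb(7, "int8"))  # 7.0  -> the ~1 GB per billion rule of thumb
```

At int4 the same 7B model would fit in about 3.5 GB of weights, which is why quantization directly attacks the memory-cost constraint mentioned above.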
CPU, GPU, NPU: Roles Are Shifting
SPEAKER_01: Maybe one more thing, and this was not in the survey but based on interviews: how the chips are changing. We see that for different use cases CPUs remain important, especially in the automotive ADAS context, where they are safety-critical, act as the orchestration layer for the full SoC, and manage deterministic workloads. General-purpose GPUs are increasingly outpaced by NPUs, which are more specialized for these tasks. NPUs are getting more and more important, especially in the ADAS domain. DSPs are also losing some importance going forward.
SPEAKER_00: Yeah, that makes sense. The first-generation outboard NPUs tended to be computer-vision oriented, which is a different architecture from the transformer-based models used in generative AI. Now we're seeing more NPUs designed from the ground up to accelerate transformer-based models. So for the folks out there with outboard NPUs, one thing to keep in mind is that these parts are now transitioning to be tuned for generative AI models as opposed to vision AI models. It's interesting to see. And yeah, I agree: the general-purpose GPU and especially the DSP are valuable for some things, but for accelerating generative AI models, these new outboard NPUs are where it's at. Eventually those will migrate into more integrated SoC solutions, right? That seems to be the hot space. So I think what you're saying is that for folks designing these systems, the NPU is probably one of the most critical components to have in there, right?
SPEAKER_01: Yeah, and the NPU is taking an increasing share of the full SoC, if you just look at the space.
SPEAKER_00: Makes sense, for sure. Good. Let me see if there are any questions so far. Not yet; I think everyone's still getting their coffee and waking up a little. What else? Anything else on this one?
Software-First Procurement Strategy
SPEAKER_01: I think what's important, the implication of NPUs getting more important, is that NPUs are usually very much tied to a specific model, so OEMs are no longer selecting the hardware first and letting the software follow. They're now turning it around: first selecting the software and then choosing from compatible hardware, because the effort to port a model to a new NPU is just too high for them. To avoid it, they select the ADAS system and software supplier first, and then pick one of the compatible hardware chips.
SPEAKER_00: Right, I see. So, going back almost to the first slide: figure out the critical scenarios you want to execute, and then get the hardware that matches. Maybe we're shifting to more of a software-first model here. I'm looking at a question from Jim: we haven't talked about power consumption here; it's less important for automotive, but other spaces tend to care. Yes, of course. In the automotive scenario they are running on battery, but not that kind of battery. Overall, though, I'd say everything that goes into an EV these days is measured on weight and power consumption, to extend battery life. So inasmuch as we're getting more power-efficient SoCs and NPUs, that's going to be a competitive advantage.
Power, Cooling, And Efficiency
SPEAKER_01: Yeah, power consumption was basically selected as the number-two pain point of edge AI in automotive. Car batteries are big, and there are other domains where this is more of a challenge, but high energy consumption also correlates with high cooling needs, potentially leading to water cooling, and those are all costs you want to avoid in a consumer vehicle.
SPEAKER_00: You want to water-cool the NPU in your car? That would maybe be a McLaren or something.
Smarter Sensors And Neuromorphic Paths
SPEAKER_00: The other thing we haven't talked about: actually, I'll bring Marcelino's question up here. A couple of weeks ago, Mercedes announced the Iconic; I think it wasn't yet brought to production, more of a show car, but it looked beautiful, and people can look it up. They talked about using neuromorphic computing in the sensors themselves, which traditionally have been dumb; the sensors themselves are now going to be running AI workloads. So you're not running every AI workload on this one magical NPU; there are multiple edges in the car moving forward, right? And you can use that to be much more power efficient, because neuromorphic computing tends to be super power efficient relative to traditional architectures. Did the survey pull that out at all?
SPEAKER_01: I think we did the survey a year ago, so this Mercedes was not there yet, but it's been going back and forth a little. We started with smart sensors: early ADAS sensors were a camera with compute in it doing everything. Then, with more advanced driver systems where you need to do fusion of different sensors, the intelligence moved to a more central zonal computer. And now we're getting into a hybrid world where some sensors might get AI again. I've also seen radar getting AI to improve detection. It's also a matter of bandwidth: raw radar data is pretty big, so having some AI on the radar and only transferring objects might make sense. So in the future we will see both: fusion on a central compute, but also even smarter sensors.
Closing Thanks And Next Segment
SPEAKER_00: Yeah, exactly. So even within the vehicle itself, there will be several levels of edge AI running in concert, for efficiency. Very cool. Well, this has been super helpful. If there aren't any other questions, I think we're going to shift back to Danilo; we can bring him back on board here.
SPEAKER_01: Yeah, thanks for having me.
SPEAKER_00: Thank you so much, Martin. This was really interesting. It's always good to bring the voice of the customer into the conversation. And thank you to McKinsey for giving us your time.