EDGE AI POD
Discover the cutting-edge world of energy-efficient machine learning, edge AI, hardware accelerators, software algorithms, and real-world use cases with this podcast feed covering all things from the world's largest EDGE AI community.
Episodes include shows like EDGE AI Talks and EDGE AI Blueprints, as well as EDGE AI FOUNDATION event talks on a range of research, product, and business topics.
Join us to stay informed and inspired!
Ambient Scientific's Journey: From Personal Tragedy to Ultra-Low Power AI Innovation
When personal tragedy strikes, some find a way to transform pain into purpose. Such is the remarkable story behind Ambient Scientific, where founder GP Singh's mission to prevent falls after losing a family member evolved into groundbreaking semiconductor technology enabling AI at the ultra-low power edge.
The journey wasn't simple. Creating chips that could run sophisticated deep learning algorithms on tiny batteries proved more challenging than building data center processors. This demanded innovation at every level – from custom instruction sets and compilers to complete software stacks. What emerged wasn't just a single-purpose chip but a programmable platform with the versatility to support diverse applications while consuming a fraction of the power of conventional solutions.
Most fascinating is what GP calls the "gravitational pull" toward edge computing. Applications initially deployed in the cloud inevitably migrate closer to where data originates – from data centers to on-premises, to desktops, to mobile devices, and ultimately to tiny wearables. This migration stems from fundamental business concerns: operating costs, data sovereignty, vendor lock-in, and the inherent distrust organizations have for cloud dependencies. The evidence? In hundreds of customer conversations, GP has yet to meet a single organization content with keeping their AI exclusively in the cloud.
Ready to explore ultra-low power AI? Ambient Scientific offers development kits accessible to anyone familiar with embedded systems programming and Python-based deep learning. Join the revolution bringing intelligence to where data is created, not where it's processed. Your next innovation might be powered by a chip that sips power while delivering remarkable AI capabilities.
Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org
Introduction to Ambient Scientific
Speaker 1: Sounds good. Okay, well, here we are, GP, good to meet you. I was just saying before I hit the button that we kind of missed each other, because you were at Sensors Converge and I was in Italy getting ready for a Milan event. But I'm sure we'll meet in person at some point. Welcome to our Edge AI Partner podcast.
Speaker 2: Pete, it's very nice to meet you. Thank you very much for having me here, and I'm very excited. It's a great organization. A lot of great companies are coming together in this, so I'm very excited to be here.
Speaker 1: Yeah, great, great. And one of the things I like about the foundation community is we have such a variety of companies. I was talking earlier about how some people call it the stack; I call it the sandwich, the edge AI sandwich: silicon IP and silicon, firmware and OS, models and data sets, solutions, and so on. So there's this whole stack, and we try to represent the whole stack, and also everything from big tech to more startup-y companies, because this is such a nascent space. There's so much innovation happening, and something you guys represent for sure is innovating in the semiconductor space, in accelerated ultra-low power AI. Right? I'm sure you have a well-practiced pitch, but what's the origin story behind Ambient Scientific?
Speaker 2: Okay, I have discussed this in a couple of podcasts before. Ambient's starting point, Pete, is actually more personal than anything else. For me and part of my team, AI hardware is not new. We were working with a company that was building a large chip for servers, and later on the entire system. This was to compete with Nvidia and companies like Nvidia. At that time there were many aspirants, including Intel, AMD, and a lot of startups as well. I'm talking about 2012 to 2014.
Speaker 2: Around that time, there was an accident in my family. Somebody fell and hit his head and could not survive, and that is still painful for me. For me, that was a wake-up call: go do something about it so that at least my other family members don't get into the same trouble in the future.
Building Programmable Chips with Roadmap
Speaker 2: So the idea was to design or find a very small device that I could give to people as a wearable, one intelligent enough to detect if somebody falls, and to differentiate that from a simple jump or normal day-to-day activity. That took us down the AI route, because only deep learning algorithms can be advanced enough to make that differentiation. But that led to needing to build our own chip, because there was no chip available that could last for a very long time. One of the requirements I had, Pete, was that it needs to work on a simple, small battery, and there was no chip available, so we started building one. Little did I know at the time that building a large chip to compete with Nvidia is easier; building a small chip that can run a deep learning algorithm and still consume very little power.
Speaker 2: That's actually the more difficult task. So anyway, we had to invent things. The problem with building a chip, Pete, is that the moment you have to build a chip from scratch, it has to be a big, well-funded company. Sure.
Speaker 1: Not a trivial investment in time and money and people.
Speaker 2: So this is where you will see we have differentiated quite a bit from most of the members of this foundation, and it's a good way to get into how we became different. I have been doing chips for a very long time, and most of my experience and my team's experience has been in microprocessors, very large microprocessors for servers or even small computers, but programmable chips: chips where the application can be built or differentiated just in software, so the hardware becomes a platform rather than a single-application provider. When I started looking at it, we as a team analyzed the past 40, 50 years of history. Any company, Pete, that has built a one-off solution, a one-trick pony where the chip is only useful for one application, or maybe two or three applications, except in networking, such companies don't survive. Look at any ASIC company in the past 40, 50 years.
Speaker 2: It has never crossed a decade boundary. The only companies around us that survive for more than 10 years are all companies that build programmable chips. That's one very important thing. The other thing is that for a chip company, the investment required is so much more, Pete, that one needs to have a complete roadmap story. If you don't have a roadmap story, you might get one-off funding, but it's not a company that can sustain itself for a very long time, right?
Speaker 2: So while trying to solve my personal problem, we ended up with the vision to construct a programmable chip that can support an entire roadmap. We ended up building our own instruction set, our own compiler, our own whole software stack, lots of layers in the sandwich, as you said. We ended up building everything. But it took a very long time. The price you pay for building such a technology is that the time it takes before you can get in front of customers is very, very long.
Speaker 1: Right, right, yeah, we had talked about that. You guys are pretty far down in the sandwich. So getting the chip designed and taped out and fabbed, then designed in, and then that has to get shipped and deployed. It takes a while, and you need a lot of perseverance and, frankly, a lot of funding to get there.
Speaker 2: We have been able to control the funding requirements by doing a lot of innovation, even in implementation and execution, Pete. Our burn has been in the range of 10x less than other companies that have done the same thing. In addition, we have been able to build a core, Pete, that can build many, many chips. We have such a roadmap, and we have customers already signing up to that roadmap, so that's a good thing.
Edge AI Gravitational Pull Phenomenon
Speaker 1: So you're not only ultra-low power, you're ultra-low burn. That's cool. Yeah, I think the magic combination in this whole area is edge AI plus battery power plus a little wireless connectivity. That turns this into a very interesting, very deployable technology: wearable, implantable, hearable, whatever you want to call it, peel-and-stick type stuff. We see it everywhere, from safety and security, like cities deploying pedestrian safety cameras that are solar powered; you just strap them to a light post.
Speaker 1: And now even in the personal space. A lot of us have the watches and the wearables, but anyone who has older folks, especially in aging-in-place type scenarios, knows there's a lot of potential for combining AI in a very focused way with battery power and connectivity to keep people safe and alert as to what's going on. So I think that's going to be a huge space.
Speaker 2: Actually, Pete, just to give you what we are seeing right now: this low power story keeps growing very big. What we have experienced in the past year or so is that people who were building models deployed in the cloud want them on their own floor. The ones deploying on their floor want it on their desktop. The ones deploying on the desktop want it in the laptop.
Speaker 2: The ones deploying on the desktop want it in their smartphone, going down to almost a ring. What it means is, if I can provide low power, I have the ability to multiply the number of cores, and whatever function was available in the cloud, I can bring it down; I can basically serve these customers. So low power spans the extreme edge to the mid edge to whatever you want to call it, and what we see is that the applications, Pete, are actually shrinking, and whatever was a non-edge application is also becoming an edge application.
Speaker 1: Right, right, yeah, there's this gravitational pull, I call it, where everyone starts their project as an experiment in the cloud, and then as they commercialize it, it has to get closer and closer to where the data is created. There's just an inevitable pull in that direction, for cost reasons, power reasons, latency reasons, privacy reasons, whatever. It's just inevitable. And it's funny, there was a big discussion on LinkedIn.
Speaker 1: Someone had posted about how there aren't enough data centers, it's hard to build data centers, and we don't have enough power or water for them. And it was like, yeah, that is not sustainable. Those are going to become training centers, and the inferencing is all going to be happening in the real world, where the data is created. That actually pulls back some of the requirements on the data centers: if the data centers become more training-oriented and less inferencing-oriented and you move those workloads out, I think that's just inevitable. Whether it's months or years, it's already happening with a lot of the wearables; it's pretty much happening right now.
Moving from Cloud to Edge
Speaker 2: In our case, Pete, our first chip goes into small wearables: a pendant, a watch, a ring, something on the running track, things like that. The second chip will go into very high-speed, high-performance cameras, very high-performance vision that can do inference even if you're going 100 or 200 miles an hour and still detect a very small object. That second chip multiplies the number of cores many times and goes into small edge devices: your access point, your laptop, M.2 cards in your drones. People are developing and testing in the cloud, but by the time of deployment, see, inherently, Pete, nobody, especially organizations, especially businesses, trusts the cloud. That's fundamental.
Speaker 1: No, I agree. I wouldn't say necessary evil, that's too strong a term, but it is tempting to lift and shift into the cloud to reduce your capex and so forth. It's a siren song, I would say. But ultimately people would rather have more control over their data and have it on-prem or within their own systems, if they can, if the business works.
Speaker 2: I agree, and the operating cost has actually been one of the biggest concerns for customers, especially in the USA and even in India; somehow, in China, data centers seem to be much cheaper. So even to save cost, people want to get out of the cloud. The other thing is that there is such a small number of cloud providers that consumers are just scared: oh my god, what if my vendor decides to raise prices? I have nowhere to go, right?
Speaker 1: True.
Speaker 2: You may have heard that data centers seem to want people to pay a penalty if you want to take your data out of there.
Speaker 1: Yeah, there is a bit of a toll; egress and ingress is always a cost. And, like you said, for a running concern, especially in the lower-margin consumer space, these opex costs can really add up. That being said, there are lots of great uses for the cloud too: time-series insights, data lakes, lots of cool stuff, for sure. But especially when it comes to AI, like you were saying, it's about how you think about running the AI workloads where the data is, and yes, use the cloud if you have to. Personally, and of course I'm biased, the default is you run the AI where the data is. You don't ship the data to where the AI is.
Speaker 2: That is correct. By now we have met customers in the three digits who have their applications running in the cloud, Pete, and all of them, without exception. Not one said, no, go away, we have our data and our AI running in the cloud, and we are very happy; we don't want to get to the edge. Not one said that.
Speaker 1: Well, it sounds like you guys are in a good position. So you're based in Santa Clara, you mentioned.
Speaker 2: Yes.
Speaker 1: And that's your primary operations. Where else are you doing business? Are you doing business across Europe and Asia-Pacific? Where's your market these days?
Speaker 2: Semiconductor companies, Pete, are by default global companies. Just yesterday I was talking with one of my colleagues, unfortunately about geopolitics, but my point is that as a semiconductor or AI data scientist, you are a global citizen. You cannot possibly think of confining yourself or your application to only one particular geography.
Speaker 1: No, that makes sense. I mean, it is a global supply chain, it is a global ecosystem. You can't get out of that.
Speaker 2: The investment required is so much, Pete, that there is no way a semiconductor company can survive if it has to serve only one geography. Maybe China, but that's pretty much it.
Speaker 1: Yeah, that makes sense.
Speaker 2: Right now we have customers in four continents, actually.
Speaker 1: Wow. North America, Europe, Asia, Australia?
Speaker 2: Australia, yes, yes.
Speaker 1: So I was going to ask you: for people that want to get involved with your platform, want to get hands-on with it, how do they engage with you, if they're a developer or a device builder? You have dev kits?
Development Tools and Future Plans
Speaker 2: Yeah, so we have a full-fledged dev kit available. They can just go on our website and order a development kit. We have a full software stack available, Pete. If people have done programming in embedded systems using any embedded platform based on ARM or RISC-V, and they have done deep learning programming using Python, then they are good to use ours. They don't need to learn anything special to use our chip, but they do have to know data science.
Speaker 1: Yeah, sure, I mean, you know the basic stuff. But you have a pretty robust toolchain. You're leveraging, I assume, PyTorch and some of these frameworks for portability. Yes. Did you also mention that you have an M.2 module? Is that something that people can get?
Speaker 2: Not yet; we'll have it in the future. It's still a little bit away. In our case, Pete, we are building them with customers only. Unlike a lot of other companies who went ahead and built the M.2 card or that type of device and then went hunting for customers, we are actually engaging with the customer first before we build them.
Speaker 1: Yeah, you know, I was talking to someone about this: there's what customers say they need, and then there's what customers actually need, and those are different. But understanding customer needs is probably step one before you do anything. Absolutely. We've seen that before: people invent cool stuff, they fall in love with their product, they go out and try to find customers for it, and, surprise, surprise, maybe it's not exactly what's needed.
Speaker 2: That was actually part of our deployment strategy also, Pete. Because, see, if I ask most customers what type of camera they need in the future, most of them don't really know what they want one year from now. They know what they want right now, but two months later that can change.
Speaker 2: Sure, sure. Now, for anyone developing hardware, somebody has to be able to tell you what they will need one year later, but most customers cannot. So hardware companies have to take a bet on proving their technology and showing people what it is capable of doing. We had to do that on devices that can be built with less cost, with less money, because if you dump $500 million on building a device that doesn't sell, you have a company that can only go one way.
Speaker 1: Yeah, for sure. And, like you said, I think you guys are betting on the right trends: low power, low cost, AI where the data is created. That's the right bet, and we'll see how it manifests itself ultimately, but it sounds like you have a really good vision for it. I've looked through your website, and I think you guys are really on top of things, so it's exciting to see where this heads.
Speaker 2: And just for the community: we are in the middle of updating our website, which will also show the complete roadmap of chip building, not just our first chip. We think we are at the cusp of announcing to the audience that we have a bigger vision, a bigger roadmap.
Speaker 1: Great. Well, hopefully folks out there will come visit the website and get involved, and if you come to one of our events, hopefully you'll meet someone, maybe GP himself, and see the tech in action. It's great to meet you here virtually; I look forward to meeting you in person. It's been fantastic. Thanks for your time.
Speaker 2: Likewise. Likewise, Pete, and hopefully see you soon somewhere.
Speaker 1: Sounds good. All right, talk soon. Thanks.
Speaker 2: Take care, take care, Pete. Okay, bye-bye.