EDGE AI POD

"The future belongs to greasy machines that think for themselves." – Anoop Balachandran of TinkerBlox

EDGE AI FOUNDATION

From cloud dependency to edge autonomy – we explore the frontier of intelligent edge computing with TinkerBlox co-founder Anoop.

Imagine a world where your machines don't need to phone home to make decisions. That's the vision driving TinkerBlox, a startup founded by former Bosch Digital executives who saw firsthand how IoT implementations were hamstrung by excessive reliance on cloud processing. Their mission: bring intelligence to where the action happens – directly on devices at the edge.

The realization that sparked TinkerBlox came from observing industrial OEMs struggling with cloud-dependent architectures for their "greasy machines." While cloud excels for cloud-native applications, forcing these technologies onto edge devices creates bottlenecks, latency issues, and unnecessary costs. As Anoop explains, "Edge is tomorrow" – not just because of technological evolution, but because practical applications demand localized intelligence that cloud architectures can't efficiently provide.

What makes TinkerBlox unique is their approach to standardization and orchestration for the heterogeneous edge. Drawing inspiration from how cloud providers standardized server operations, they're creating reference architectures that respect edge constraints while enabling interoperability. We hear fascinating examples from automotive systems transitioning to vehicle-to-everything (V2X) communication and defense applications where drone swarms need resilient distributed leadership capabilities when individual units fail. Rather than selling directly to end customers, TinkerBlox positions itself as the "secret ingredient" enabling system integrators and solution providers to deliver superior edge performance.

Curious about how edge computing might transform your industry? This conversation illuminates how purpose-built edge intelligence can overcome the scalability challenges that have limited IoT adoption. Subscribe to our podcast for more insights on emerging technologies that are reshaping how we interact with the physical world.


Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org

Speaker 1:

All right. Well, Anoop, thanks for joining us. Should I refer to you as ABC or Anoop? What's your preference?

Speaker 2:

Both work, both work. My nephews prefer ABC.

Speaker 1:

ABC. Okay, cool. Well, you're dialing in from India, and I always apologize to folks about time zones; we tend to be California-centric here, or Bellevue, Washington-centric. But thanks for joining us. We've had a really cool influx of startups into the foundation recently, including TinkerBlox, and I think that's indicative of the opportunities happening out there. There are so many opportunities to, I don't want to say disrupt, but to really start to fill in the blanks on the stack from metal to cloud, right? And I know you guys have a long history of innovation at different companies, and now you've come together on TinkerBlox. Why don't you give us the TinkerBlox origin story? What's the spiel here?

Speaker 2:

Oh yeah, absolutely. So my co-founder and I are both from Bosch; we were both handling IoT at one of the Bosch units called Bosch Digital. We were basically progressing the whole IoT effort for third-party customers of Bosch, and the big thing, the aha moment so to speak, was that we were consistently looking at OEMs as our customers. Bosch being Bosch, a German engineering behemoth, it always went to OEMs, and that's where the resonance was: okay, can I also get products out like Bosch?

Speaker 2:

The big challenge that we found, both inside and outside of Bosch, was that even as we were looking very device-native, at making products smart, devices smart, we were relying completely on the cloud to provide the intelligence. I wouldn't even say AI; even something as small as a simple inference model, something very minute, everything goes all the way to the cloud, or you deploy cloud-based technologies back onto the edge. Now, we worked with a few customers: sensor companies, some larger OEMs, industrial machinery OEMs, et cetera. Everywhere, the problems were coming down to the same issue: it was all based on cloud technology, the cost of cloud ingress, the cost of managing that whole ecosystem. So when Bosch decided to take a pivot.

Speaker 2:

When Bosch had a new CEO, they wanted to kind of go back to the roots. We took this aha moment that we had and said, okay, let's also go back to the root of where devices are expected to be intelligent, and that's basically the site of the action, which is the edge. So we said, hey, we're going to be this very unique company that is going to focus on bringing intelligence to devices, letting there be an experience off of devices, but we will not bank on cloud-based technologies for this. I mean no disrespect to cloud here. Cloud technology for cloud is awesome, and I love the way AWS, Azure and Google are taking that further.

Speaker 2:

But when it comes to those machines, those hard, you know, greasy machines, you need the intelligence right there inside of them or very close to them. And we said, okay, we're going to build out some new innovations, doesn't matter how long it takes, but we're going to build out a few things that are very native to the edge and its typical challenges. People say the edge is constrained. Well, that's debatable. With Moore's law and the kind of technology we have today, I wouldn't say it's really constrained.

Speaker 2:

Less constrained, anyway. But then there is a huge install base that is already on microcontrollers and the like, which is constrained. And yes, there is the very real concept that the more you put on the edge as processing power, as memory, there is an incremental cost associated with it. So we were basically thinking: there was a time when the cloud hyperscalers came in. They brought in standards, they brought in alternatives to servers, they brought in a whole new experience of how to handle servers and storage. We could do the same with current, existing technology for edge-native compute. That's kind of where we come from.

Speaker 2:

Our tagline says edge is tomorrow. We very strongly believe that edge is the future, and there is so much white space that is still unexplored at the edge. Today our position, whether it is constrained devices, whether it is OEMs, whether it is HPC-related companies like automotive, or data centers, oil rigs, even space stations and defense: in all these cases, we're going to be that edge-native infrastructure provider who comes in and says there is a better way, a faster way, an economical way to do things very native to the edge.

Speaker 1:

Right, right. Yeah, I like the term greasy machines, because that is a fact. Especially when you talk about brownfield versus greenfield deployments, there's a lot of equipment out there that people want to add capabilities to, and the idea of rip-and-replace, where we'll just build everything from scratch, is kind of a pipe dream for most real companies. So what you're doing is basically saying: one of the things that really proliferated the cloud and made it something the Fortune 500 depend on is more standardized ways of developing and orchestrating workloads and storage. These services have become somewhat standardized in the cloud, right, and so companies can write to them with confidence and speed and things like that.

Speaker 1:

And what you're saying is, on the edge, as we know... I always tell people the heterogeneity of the edge is a feature, not a bug. It's one of the strengths of the edge: all of the, you know, bespoke equipment that is used in the real world. So it's like, well, how do we bring more uniformity to that orchestration for the edge? Is that kind of the gist?

Speaker 2:

Yeah, absolutely. And what is great about this: when we started doing this, we were all like, okay, how much can we do? How much can we push the limit? But what we have found over time is the number of people who are actually coming around us to help us do this. So we are working with the semiconductor industry, some huge names, to bring in reference architectures that are very edge-native. There are cloud companies who are helping us, who tell us, hey, you know, you're going to enable data and intelligence in devices that we would otherwise not even access, and we're there for you. Let us know where you need us to back you up; we're there.

Speaker 2:

We are working with telecom companies who are like, okay, tell us what you need. Do you need network edge? What connectivity are you talking about? And we have foundations like yourself, the EDGE AI FOUNDATION, or SOAFEE, for example, who are coming in and saying: we realize there is a set of foundational guidelines, architectures, et cetera that we have already set up, and you're trying to do something new, something new coming up from the edge. And without even considering the fact that we are just a small startup coming up with new technology compared to all these huge names that we're talking about, everybody realizes that this is a win-win situation. It's not a zero-sum game where we take the share off somebody. Everybody's pitching in and saying, okay, let's create the standards, let's make it easy and, as you said, reliable for someone to adopt, just like they adopted cloud technologies back then. It's great, I mean, it's great to be in this ecosystem.

Speaker 1:

Yeah. I had this discussion with an analyst the other day, and we talked about this at our event in Texas recently too. It was like, how do we as a community better abstract the platform for the application developer? Because, as you know, in the traditional IoT space, and you probably remember this from Bosch, there's a lot of bare-metal deployment. You write this one-off thing and you ship it and forget about it, basically, because it's not really accessible.

Speaker 1:

But it's like, well, taking lessons from cloud: how do we better abstract things so the app developers don't have to worry about the innards of the device they're coding to? There are obviously constraints, but that is a key to unlocking the market and enabling much faster deployment and development of things. It's almost required. Otherwise we're going to end up in what I call the IoT cul-de-sac, or dead end, where we have a bunch of one-off bare-metal things that we struggle to communicate with.

Speaker 2:

So, yeah.

Speaker 2:

And just to add, right, I think we really had to learn from the cloud guys. If you get down to the bare bones of it, what is cloud? Cloud is storage, a little bit of compute, and that's about it. But if you really look at how GCP and Azure and AWS work today, they're practically giving you storage for free. They've built this whole ecosystem of experience around that core thing called storage, and they're enabling a lifetime interaction of any customer with multiple features that they pick and choose as they go through it.

Speaker 2:

Machine OEMs have the exact same opportunity to do that using edge and cloud, so that they don't see the device, as you said, as a capital good that is just thrown over the fence, but as a way of actually engaging with the customer consistently, providing an experience where the device or the asset is just a medium for me to interact and give that experience to my customer. I've seen some OEMs who've really embraced this and really want to go the whole hog on it. I think that's going to really change the way we see machines and how we experience them.

Speaker 1:

Do you see... you know, I think one of the, call it standardization or best practices, of using CNCF-based orchestration in the cloud is a big breakthrough, right? Do we need, like, a CNCF for the light edge and edge as well, or how does that translate? How does some of the Kubernetes-style CNCF orchestration translate to the lighter edge, in your opinion?

Speaker 2:

Oh, fantastic question. So we see that definitely taking form. There is a trade-off between how much we use an existing system like Kubernetes versus using something a little more edge-native, a little more free. And if you look at the impact that Kubernetes has today in orchestrating containers, in orchestrating resources, it's huge, right? So it is important for any edge-native provider such as ourselves to be able to latch on to that system, so that our customers are not seeing a whole system change. That's where I'd say yes to your question.

Speaker 2:

Where I would also probably say I would not be too strict about it, because we still see more and more innovations happening purely in the edge ecosystem.

Speaker 2:

So, for example, one of the projects we are working on is a pure distributed cluster management system, where random nodes take the role of, for lack of a better word, what the Kubernetes control plane does. A node essentially takes the master role based on its particular task and its particular resources at that point in time, and orchestrates an edge mesh around it. And is there a standard technology for it? I would say there's a lot of research. We ourselves are doing some research on it, and there is research by multiple people on swarm intelligence who are going down this path. So I would say we have to keep enough room for innovation so that we could potentially get something really nice for the edge ecosystem. But, very true to your point, we have to get to that support for scale. Otherwise we're talking about this IoT chasm that we always find: you do the pilots, and once it gets to scale, it just dumps. So we have to have standards, no two ways about it.
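The dynamic control-plane idea described here, where whichever node is best placed at the moment assumes the orchestration role, can be illustrated with a toy election routine. This is purely an illustrative sketch, not TinkerBlox's implementation; the `Node` class and the resource score are invented for the example:

```python
import random

class Node:
    """Toy edge node with a self-reported resource score (invented for illustration)."""
    def __init__(self, node_id, resources):
        self.node_id = node_id
        self.resources = resources  # e.g. free CPU/memory at this instant
        self.alive = True

def elect_leader(nodes):
    """Give the control-plane role to the healthiest live node."""
    live = [n for n in nodes if n.alive]
    return max(live, key=lambda n: n.resources) if live else None

# A small mesh of five nodes with random resource scores.
nodes = [Node(i, random.random()) for i in range(5)]
leader = elect_leader(nodes)

# If the current leader drops out, the remaining nodes simply re-elect.
leader.alive = False
new_leader = elect_leader(nodes)
```

In a real mesh the scores would be exchanged over the network and the election would need a consensus protocol (Raft-style terms, for example) to avoid split-brain; the point here is only that leadership follows current resources rather than a fixed node.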

Speaker 1:

Yeah, I think it's sort of about the outcomes: you want all of the edge devices to look like resources in the cloud, or at least from one place, right? They need to be first-class citizens in a solution that's cloud-architected. But at the same time, for anything below, let's say, eight gigabytes on the edge, you can't really stuff a container onto an MCU; it just doesn't work.

Speaker 1:

I think at eight and above, you know, hypervisor-based, you can probably get away with some.

Speaker 1:

There's Azure Kubernetes Service and other things, and there's the Margo project with LF Edge. That's cool, you know; that's sort of just hypervisoring a container, and it's a CNCF container, K8s kind of thing. But then below that, things get weird, and you need to think about, I don't know, K3s or MicroEJ, and there are all these other ways of sort of catching the workload. So that's kind of the frontier, I think. Like you were saying, how do we innovate to enable those things, at the end of the day, to all look like resources, right?

Speaker 2:

Oh, absolutely. I'll give you a couple of examples that really show how this works. So we are working with automotive majors who want to host so much more on their HPC. Now, the traditional approach, which has been going on for so long, has been very much the zonal architecture, so on and so forth. But we are seeing an explosion, or a potential explosion on the horizon, as in-car or in-vehicle use cases expand towards V2V or, better yet, V2I, vehicle-to-infrastructure, or V2X. The same use cases just explode: every node, irrespective of whether it is a traffic signal, another car right beside you, or your internal compute, has to be homogenized, and in a trusted manner, that's where the standards come in, of course, to be able to execute tasks as a system, or a system of systems. I think that's really coming through in a pretty big way, and the investment into this space is exceptional.

Speaker 2:

Then again, we are also working with, let's say, defense firms, where they are looking at: okay, I'm not going to talk to anybody else, I've got a swarm of drones, and I need a particular drone at a particular time to take the lead. Like I was saying, the master node concept from a Kubernetes control plane scenario. But which one takes the lead? We never know which one gets shot and which one bites the dust at any point in time. So having a redundant system take over as the master node, with all the other nodes, the pods, realigning around the new master to be able to give that intelligence, I think that kind of stuff is exactly where we are.

Speaker 2:

I mean, these are real, right?
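The failover pattern described for the drone swarm, where the next healthy unit is promoted when the current lead goes silent, can be sketched in a few lines. Everything here (the heartbeat timeout, the promote-by-priority-order rule, the `Drone` class) is an invented simplification for illustration, not the actual swarm research:

```python
import time

class Drone:
    """Toy swarm member; liveness is tracked via a last-heartbeat timestamp."""
    def __init__(self, drone_id):
        self.drone_id = drone_id
        self.last_heartbeat = time.monotonic()

    def is_healthy(self, timeout=1.0):
        return time.monotonic() - self.last_heartbeat < timeout

def current_lead(swarm):
    """Promote the first healthy drone in priority order to the lead role."""
    for drone in swarm:
        if drone.is_healthy():
            return drone
    return None  # whole swarm lost

swarm = [Drone(i) for i in range(4)]
lead = current_lead(swarm)          # drone 0 leads while it is healthy

swarm[0].last_heartbeat -= 10.0     # simulate drone 0 going silent
new_lead = current_lead(swarm)      # the next healthy drone takes over
```

In a real swarm the heartbeats travel over a lossy radio link and every drone must reach the same conclusion about who leads, so this one-process loop stands in for what would be a distributed agreement problem.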

Speaker 2:

In multiple countries, this is absolutely real research and real pilot projects that are going on. So when we bridge these two scenarios, an isolated system with its own internal turmoil versus a potentially scalable system of systems, with even unknown entities coming in, doing their part and going off, I think that is where a lot of the Internet of Everything, which was touted at some point in time, that's kind of where the real power is going to get to.

Speaker 1:

Yeah. So you're a startup, and, like any good startup, strapped for resources and time and people. What is your go-to-market like? How are you commercializing your stuff? Who are you selling it to? I can't imagine you're going to some Fortune 100 company and ringing their doorbell and selling to them directly. How does that work?

Speaker 2:

Oh no, we don't do that, although, yes, there are a few pilots with very innovative OEMs with whom we are working directly. However, there are some very good, I wouldn't say just SIs, just system integrators, but system integrators, ISVs, other product companies, so many of them, who already understand this game and play it well. We come in as an edge-native technology provider and say: if you are looking at edge compute, whether it is distributed compute, AI on a single node, or distributed cluster management, whatever it is, if you're looking at performance off the edge, we can give you better alternatives, so that what needs to be at the edge can stay at the edge, and what belongs on the cloud can move to the cloud. So better performance, more economical solutions and, of course, everything with respect to data security, with certifications and all of that. And we move, via all these providers, towards the Fortune 100, or rather Fortune 500, companies.

Speaker 1:

I see. So you're the secret sauce for all the solution providers. That's the goal.

Speaker 2:

We would like to say, yes, we have a secret ingredient that makes them run better. There you go. But then the other thing, and that's where organizations like the EDGE AI FOUNDATION, but also, like I mentioned, the hyperscalers and the semiconductor companies come in: there are good reference architectures, usually solution-based, but the good ones are predominantly built on the cloud system. We really need reference architectures with better edge capabilities.

Speaker 2:

Today, we are adopting the cloud systems and just putting them on the edge, which is, in our opinion, suboptimal. It's like me saying that I would be the best person to solve a Fortune 100 company's problem, to provide the solution, when I'm not; I am a technology company, not a great services company. In the same way, if someone came and said, hey, do you want to do a great cloud project, we'd happily pass it on, saying: guys, this works on the cloud, and I don't want to do retail e-commerce Gen AI; that's not my game. You bring it down to the retail store and tell me you want to do enterprise Gen AI within the limited resources of the store? I've got that, but not the other. So we are building these reference architectures, to get back to your point, which can go out there so that anybody with the aspiration to get higher performance off the edge can just adopt them and go forward from there. So I would say these two are our primary tracks.

Speaker 1:

Fantastic. Wow, yeah, it sounds like you're in a great space, right on the, no pun intended, cutting edge of what's happening out there with edge AI. You're touching on an area that's really required for edge AI to proliferate. A lot of us lived through the IoT thing and, like I said, we kind of hit that cul-de-sac, or found ourselves in a dead end, on deployments and commercialization. And these days, for a lot of folks, this is the big focus: how do I deploy and commercialize a solution to solve problems in the real world?

Speaker 1:

So I really appreciate the energy that TinkerBlox is putting into this. It's been great to connect today and get the TinkerBlox story. Hopefully I'll see you in person at some point. I know we're quite far apart as the crow flies, but I'm sure we'll be able to meet eventually. I really appreciate your time this morning.

Speaker 2:

Oh, I appreciate it greatly, Pete, and thanks for all your support. We need organizations like yours to proliferate the concept of edge and the importance of edge. And yeah, I will be traveling to the US in August, so hopefully, fingers crossed.

Speaker 1:

Hopefully we'll get a coffee. Sounds good. All right, thank you, take care.

Speaker 2:

Thank you, take care. Bye-bye.