Conversations on Applied AI

Joe Schneider - AI & Embedded Systems

July 07, 2020 Justin Grammens

Thank you to Joe Schneider for taking the time to chat with me about his history of applying AI using embedded systems. Joe is a fellow entrepreneur, an educator to the community, an Arduino enthusiast, and someone looking at how to apply AI at the edge, so this was an awesome conversation.

From his early days at John Deere to his current role as the founder of Dojo5, Joe brings passion, inspiration, and in some ways a much-needed levelheadedness (is that a word?) to the industry.

He gives back with his presentations, and his talks and learnings are well worth attending. I hope that through this podcast you are able to learn about some of the new areas in which AI is being pushed to the edge, with smarter devices able to interact and then react using data from sensors in real time.

Finally, if you are interested in learning about applying AI, be sure to join us at a future Applied AI Monthly meetup and help support us so we can put on future Emerging Technologies North non-profit events!

Here are just a few of the many fun and interesting topics discussed during this podcast:

Enjoy!
Your host,
Justin Grammens

Transcript


Joe Schneider :

started to realize how much value embedded systems are going to add. And really, I mean, it really was a lead-in to AI, I think, and all of this other kind of, like, replacing humans, you know, really difficult problems that humans are, you know, somewhat well suited for. But the problem is humans don't run 16-, 18-, 20-hour days very well.

AI Announcer :

Welcome to the Conversations on Applied AI podcast, where Justin Grammens and the team at Emerging Technologies North talk with experts in the fields of artificial intelligence and deep learning. In each episode, we cut through the hype and dive into how these technologies are being applied to real-world problems today. We hope that you find this episode educational and applicable to your industry, and connect with us to learn more about our organization at appliedai.mn. Enjoy.

Justin Grammens :

Welcome, everyone, to the Conversations on Applied AI podcast. Today we have Joe Schneider, founder of Dojo5, a consulting services firm of embedded software experts helping companies modernize the outdated firmware lifecycle. Joe is a father, friend, mentor, and personal and corporate coach, and is passionate about improving the state of the art of embedded electronic device development. Thanks, Joe, for joining us today. You're welcome, thanks for having me. Yeah, as a fellow entrepreneur and technologist, you know, I have a lot of respect for the work that you do. Maybe you want to start from the beginning, give us a little bit more background on yourself and the trajectory of your career. Yeah, you bet. It goes all the way back to when I was really young. I got into those Radio Shack kits, we did electronics, and I knew from a young age that electronics were gonna be fun. And then I got into programming. I still have the box from Turbo C++ 1.0 that I bought, which probably dates me a little bit. I saved up my allowance for it, that's the kind of nerd I was back in the day, and I just found a love of programming too. And so I wanted to find a way to marry these two worlds together pretty quickly, the cool stuff that I could make happen with programming but

Joe Schneider :

Then also bringing that to the physical world. And that's where I really started to find a niche in embedded. And so in high school I started playing with that a lot, and then in college that really took off, and I had all kinds of fun through college. I did electrical engineering undergrad and computer engineering grad. And then I also really found this love of teaching in grad school, where I helped redesign the microcontroller class at Iowa State at the time, and we did all kinds of stuff with FPGAs and fun hardware/software co-design stuff, which was kind of hot in the day. I don't know if it is so much anymore, but that was a cool piece of research back then. And once I got done there, I headed to John Deere, they were close by Iowa State. Are you from Iowa originally? No, no, originally from Fargo. Okay. And Phoenix International, which is now a division of John Deere, was up there, they were kind of the John Deere post up there, but I didn't really think too much of it. I was like, what does farming have to do with anything technology? I grew up a city boy, so I didn't really know too much about it. And then the John Deere guys came through our lab and they said, hey, we're doing some pretty cool stuff, maybe you should come check it out. And so I went and checked it out, and I was blown away by how agriculture was getting just revolutionized by precision farming. John Deere, back in like 1994, 1995, had started applying GPS just as it was really coming on the market, you know, to be able to do some of this precision farming. Really the idea is like, can we use data from the inputs that I'm putting into the field, the outputs that I'm pulling out of the field, and the actions that I'm taking while things are growing, to just get better yields? That's really all it comes down to, I reduce my costs and improve my profit, sure.
And so they had an old monochrome screen, you know, LCD kind of thing that they had put together, and it was kind of running the show for almost a decade there. And then in about 2002, 2003, they had decided, hey, we need something that gives us a little better user interface. And I think we were starting to see the benefit. That's, you know, just after the dot-com boom, you're starting to realize that, hey, this technology and connecting, you know, all of these devices and all this data is really going to be valuable. And so I got in pretty early at John Deere on that project. And they were starting to do GPS driving, autonomous driving of the tractors. And this was, you know, a decade before it really started hitting on cars, but it's an easier problem. Sure, you know, there are no bicyclists and grandmas. It's pretty much a giant open field. So if you get off by a little bit, you'll lose the crop, which isn't great, but at least you're not hurting someone. Yeah, because you're up the

Justin Grammens :

road, turn around and come back. Right.

Joe Schneider :

Exactly, right. And yeah, I mean, largely the problem was driving a straight line, which actually is way harder than you imagine if you've never been on a tractor in a field. Like in my mind, before I worked at John Deere, fields were very flat. They're basically parking lots with soil. And if you've ever sat in a tractor while the guy's driving through, like, zero fields are like that. It is a bumpy ride. And that's why, you know, the fancy John Deere tractors have these nice hydraulic loaded seats. So it's bumpy for the tractor, but not necessarily for the driver. And it's actually very amazing what humans do to drive in a straight line. And if you think about it, corn typically is planted like 30 inches apart. So you can't be off by more than about 10 inches either way when you're harvesting the corn, otherwise you're driving over the crop, right? It's just kind of amazing. Imagine, like, driving through a skate park in your car, trying to drive a straight line within 10 inches. It's just amazing. Yeah. And so I think it was really interesting, I think, where I kind of first started to realize how much value embedded systems are going to add. And really, I mean, it really was a lead-in to AI, I think, and all of this other replacing humans, you know, really difficult problems that humans are somewhat well suited for. But the problem is humans don't run 16-, 18-, 20-hour days very well. And in agriculture, that's typically what happens. You get a harvest window, and you gotta wait till the moisture is just right, the weather's just right, and then all of a sudden you've got 5,000 acres you need to harvest. And those guys run really hard, really ragged, and they're good, but not to that level of endurance. And so it was really kind of eye-opening to see what we could do with GPS at that time, which was sub-inch accuracy. They used two bands of GPS; the second band at the time was usually restricted for military use.
And so there's an exception for agricultural use. And with those two bands and the corrections that were available, they could drive these things with sub-inch accuracy. It's just amazing how fast that controller needed to run. But it was fairly dumb, a PD controller, right, that was just running on this thing and activating the hydraulics, and there's a lot of work that went into making that work well. So this was early 2000s, right? Exactly, mid-2000s in there. Okay. And they

Justin Grammens :

had that good accuracy at the time. And were they actually releasing this thing to the market, or when you were there, was it sort of in a test phase?

Joe Schneider :

Well, when I was there, it was running late, running late as most important projects do. They wanted it on the market, but there was a lot of work to do. So that was a really exciting time to be there as we built up the team. The team when I got there was about 20 people; they already had the initial implementations of the GPS software and some of the other settings. This really became like the infotainment center for the whole tractor. In your car, you know, you've got the HVAC controls and you've got the radio controls. Well, in a tractor, like, you multiply that times 10, because the transmission has like 50 different settings, and you've got hydraulic controls for the thing that you're dragging behind you in the back, and you've got 20 lights instead of two, you know. Yeah. So there's all kinds of different controls that you need to be able to mess with. And so they were trying to build this all into a system that would really modernize the tractor cab.

Justin Grammens :

Do you think at that time they were thinking about the data and what was possible? They were just

Joe Schneider :

starting to? Yeah, I mean, by 2009, I would guess, somewhere in there, 2008, 2009, they had a cellular gateway that was shipping data back to the mothership, and they were starting to build out a web environment so that farmers could start to understand what was going on in the field. From a very early stage, even on the monochrome display, they had the idea of prescriptions, where you would remember where you went, and then remember what the yields were. And for certain parts of the field that were low yielding, you would remember that for the next spring and then spray more fertilizer, or maybe get the soil tested, or maybe just decide not to plant there next year. And so this is called a prescription, you prescribe fertilizer based on that plan. And then you would drive around and the device would actually control the fertilizer spray. So it would spray extra for you when it was in an area that needed more, and a little less in an area that didn't need as much. I think they were starting to realize that, but it's a whole nother level now, you know. Even in ag, again, there are some really interesting problems, with, like, a lot of these people being very remote. And like I mentioned, they often have a very small window that they need to get working in. So if the tractor breaks down on day one of the three that you have to harvest your crop, you really need it fixed now, and that could be a two-hour drive or a three-hour drive from the nearest technician, and some of those problems require a technician to come out. And so they were also starting to build out their remote onboard diagnostic stuff. So you could actually call up the guy on your cell phone, he could see what you were seeing, he could maybe help you through a few screens and figure out what to do. Or you can see trouble codes on your tractor just like you can on your car now. The OnStar stuff was starting to hit at that time too, I think.
So sure, you know, that was like the level that we were working at at that time for supplementing humans and reducing the distance between, you know, the people with the knowledge and the people actually doing the work. I think we've just seen that accelerate. Exciting. Yeah.

Justin Grammens :

Awesome. Awesome. So where do you go from there?

Joe Schneider :

Yeah, so we did a lot of fun stuff at John Deere, including a big agile transformation. That was a lot of fun too. My love of building teams and helping teams learn, helping teams become more efficient, and kind of the process of it all kind of got born there as well. We eventually did an agile transformation of like 400 folks, because the team had grown quite significantly at that point. Yeah, after all of that, I got moved around by my wife a little bit. I'd been working remotely for a while and I was like, I like people too much, I'm getting sick of being in the basement, you know. Funny enough, as we're recording this here, you know, we are working through the COVID stuff, right? Stuck in the basement again, right? Yeah, you know, so I said I need to find something else. My wife and I, we as a family, had already landed in the Cities, and I became aware of this company called Logic PD that did contract services engineering, and it sounded like a lot of fun, so I jumped in. It was a lot of fun for the couple years that I was there. I worked on a whole variety of different projects, basically got an intro into how contract service work happens, and made a lot of great connections that have now since kind of dispersed throughout the Twin Cities. So my network in the Cities was kind of born that way. I feel really lucky to know all those folks. There was just a huge group of really great engineers there at the time. And so then I moved on to InVenture after Logic PD, and InVenture was another really, really cool company, still doing a lot of stuff. They're a mix between a venture capital group and a biotech kind of SaaS services group. So I kind of took a step away from embedded for a sec and did a little bit of web development, both front end and back end and cloud type of stuff. And it really was good.
As you and I are seeing now, those whole pieces are becoming more and more tied together with devices and this whole IoT, and how that all kind of is playing out is you have to have all of these pieces together to really have a fully usable kind of solution to somebody's problem. And so that really gave me good visibility into that. Those guys are still doing some really amazing work in AI and deep learning, and I guess I'd say I got some exposure to that as well there. They have a company called MBio that's doing staging of COPD and some other types of lung diseases. And they have basically some IP that can do this in a very specific way, and it helps people. I mean, there are a number of different applications for that technology. And even more so with all the COVID stuff happening now, there's just a lot of interest in how do we scale. Like, if we're getting people's lung imagery out of CT scans daily, thousands of these, how are we going to have enough radiologists to figure this out? And not only that, but radiology in particular is, I feel like, a really good application of AI, because you'll talk to two different radiologists and they'll actually come up with two different answers at times. It is a little bit of an art to see what they find. And then radiologists are human enough that at times they'll miss something, you know, if they're looking for nodules or whatever in lung imagery, trying to find cancer, sometimes you miss something little that actually has a big impact. And so I know that's been slowly getting applied for a number of years here, but it's just recently, in the last year, where the FDA has finally figured out how they can start to feel comfortable with this. You know, like the description of the AI, like the neural nets and stuff. We don't know how it works, you know? It's a bunch of weights and, you know, neurons, and out the other end, we've tested the crap out of it, and it looks like it correctly classifies these.
But yeah, how do you prove? Because that's really what the FDA wants, is how do you prove that this thing is going to work over and over again? How do we quantify the risk of this thing either missing something or falsely characterizing something? And that's always been the challenge. And I think the medical community is really nervous about getting out of the loop as well, not feeling like all of a sudden the computers are gonna start to take over some of these decisions where they've had this human-to-human interaction. What have you seen in this space? Because I want to hear from you just a tiny bit on that.

Justin Grammens :

Oh, well, yeah, yeah. So I mean, I've interviewed a couple different people on the podcast here that worked for healthcare companies. And it's been interesting, and the question I was going to ask you, which I think we'll get to, is like, you know, how do you think this actually changes the future of work for people, right? Are people going to be put out of work when it comes to radiology, per se? I shared with a previous guest that, you know, my dad was a cancer doctor for many, many years, for his entire career. He's retired now. And so I've been finding, you know, these various stories about AI taking over his job. And his response back to me, which I like to tell people, is this. He's like, but they don't have the human touch, right? They can't deliver the news in such a way, or be sympathetic with the people that are there. So his feeling is, use the humans for what they're good at, and use the machines for pattern recognition, what they're good at, and they can work complementarily. And I think that works in certain cases. I think in other cases, like self-driving vehicles, maybe an entire industry like trucking could be completely upset and taken away. People don't have to drive trucks anymore. And some people might argue, well, maybe they don't want to drive the truck, right? Maybe it's too mundane. But there are millions and millions of people whose livelihood is literally driving a truck. What are they going to do now, right? Mm-hmm. And so from my perspective, from when I'm talking to people, it's, yeah, I think it depends on the industry. And what scares people is, if you look back at the Industrial Revolution, the machinery and mechanics that came in were literally just doing manual labor, right? So you go from a plow that a human has to push to now a tractor that's doing it. Well, this human now can get a job somewhere else, maybe they're going to the factory, they're making the tractor, right?
So there's a way for them to change careers into a different field where they're actually going to be still relevant in some ways, right? The problem is, now you basically have machines not only doing the physical work, but now you have machines that are starting to do the mental work. And so what scares people is, now all of a sudden, what is my value anymore, right? I can't change jobs from being a truck driver to all of a sudden being an AI professional, whatever that means, you know, or basically, you know, writing algorithms so these things can get smarter, per se. That's too big of a leap. And so I think it really depends on the industry. You know, that's kind of my take, and I would love to get yours on that.

Joe Schneider :

Yeah, I mean, and I'd say to the truck drivers, I mean, yes, they know how to drive a truck in a straight line and avoid obstacles, but honestly, I think they're underselling themselves if they think that's all they do. There's a lot of logistics that go into truck driving, planning things out, what order I'm going to drop things off in, what orders am I going to take, you know. And I feel like that's what we saw happen in agriculture. A lot of the farmers, like even at IoT Fuse earlier this year, there was a farmer saying, like, hey, I just want to sit back on my couch at my house and click a few buttons, and I want, you know, the robots to start harvesting my fields. You know, he doesn't want to be driving. Exactly.

Justin Grammens :

I think in that case it works, you know? You're basically taking away the boring stuff that people don't want to do.

Joe Schneider :

Mm-hmm. Well, I mean, yeah, and if a truck driver loves driving on the highway, then, I mean, he's gonna find the value of that to society having less and less value, and he's probably going to get paid less and less for it. But what is still really valuable is, like, we still need to get stuff from point A to point B. And that's not going away anytime soon, until we have transporters or something, right? So we're gonna need people who understand the business side of moving things from point A to point B, and let the AI take over getting it there safely and the boring, like, running it 24/7. I think there's a big opportunity for the folks who go all in on AI. Their trucks could drive 24/7, you know, they don't have the 8- or 12-hour, you know, limits, and now they can provide a fleet that can move a lot of stuff at a lower cost. And suddenly they have a big advantage, I think. And really, then their value, the human value, becomes that human touch and the things that the AI doesn't do well, which is organizing all this stuff. I don't know, I think that all will just drive the humans towards the valuable pieces, like we're saying, and we'll automate the boring pieces. Yeah, so you're optimistic on the future of where this is all headed? I am, definitely. You know, we also hear, too, like, is AI going to take over, is AI gonna kill us all, Terminator style or whatever? And, I mean, I don't think so. I definitely think it's going to be used for war. It already is being used for war, I'm sure. I don't know of any specific examples of that, but, you know, I'm sure it already is. We seem to have a pretty good sense that it would be an idiot move to stick a gun on a robot today and just, like, send it out into the world, and I think it'll still be an idiot move to do that, you know, without some kind of check and balance.
I don't think AI is necessarily anything new to that. It's not some kind of enabling technology that would, you know, force us into some kind of, like, mass disaster. All we need is COVID for that.

Justin Grammens :

You were talking about using AI for destructive purposes. I mean, you know, they do have self-flying drones, you know, they can basically end up taking off, doing missions, and landing. I read a story about how these drone pilots were worried that they were going to be put out of a job, right? And what happened, this came out of a book that I read, is that for every drone pilot that lost his or her job, all that data that came back, where they needed to analyze it, make the algorithm smarter, you know, track all the video, all that type of stuff, I think they said there were 40 engineers that got new jobs because of that on the data science side. So it actually was a net win. And what was ironic is what they said was basically this self-flying drone actually caused the military to have to hire more engineers. It was a different skill set, obviously, but it actually was a net win. So some people think, well, geez, we're all gonna be out of jobs. Well, I just think it's gonna shift, to your point, I guess, in some ways. Yeah,

Joe Schneider :

exactly. And it's not like AI is at the point, anytime soon, where we just, like, pour some AI on some problem and instantly it's solved and the computer just understands it. Like, computers still only do what we tell them to do. You know, there's definitely magic to what happens with AI stuff, but you still gotta train it, and getting that data set, you're still training it for a specific mission. And I don't see that changing anytime soon, either. That's an interesting question, is, like, when, or if ever, do you think we'll achieve sentient computers? Yeah.

Justin Grammens :

$10,000 question, I guess, right? Yeah, I'm kind of on the fence on that. You know, it's like, who would have thought 100 years ago, you know, that we'd be on the moon and we'd have all this amazing stuff, right? It was just completely out of the realm of possibility of what we thought. And so will we ever get there? I definitely think it's possible. But I just wouldn't venture to guess, you know, when. What I am always thinking about is, like, how do humans continue to become relevant in this society, I guess, as everything becomes automated? And so how is our world going to change? And honestly, like, what's the world going to be like for our kids and our kids' kids? Because it's not going to stop, right? I mean, the technology, people are going to continue to push the envelope, they're going to continue to try and use computers for good and bad, but everything is going to change. And like what we thought the internet was, there's something else that's going to come around the corner in the next 10, 15, 20 years that we are just going to be completely blindsided by, and just, like, oh, wow, I didn't know that this was even possible. And so is it basically, you know, brain-computer interfaces or whatever, where they're communicating over the internet, all sorts of weird, crazy stuff, biological warfare that happens and there is a virus that we can't control, you know, all that type of stuff. But all of that puts us into this technology realm of how can we use technology and sort of stay ahead of it, and apply it to today's lives. And if I'm right or wrong on the sentient thing, or, you know, we reach the singularity, you know, whatever it is, how can we make the best of it? Because it's not like it's the end of the world. It's just gonna be a different world, I guess, in some way.

Joe Schneider :

Yeah. What do you think, are we gonna have one giant uber-computer that owns the whole world? One of the latest X-Men movies is about that, right? Where it's like this giant computer that could tell the future, and nobody has true choice, you know, nobody has the ability to choose, it's all been predetermined. I don't know if you saw that one on Hulu. It's kind of interesting.

Justin Grammens :

What I'm thinking of is, do you watch Star Trek: The Next Generation at all?

Joe Schneider :

Oh, yeah. I was a big TNG person. Yeah,

Justin Grammens :

there's the Borg, right? A centralized brain that basically controlled all the Borg, right? And so will we get to that type of world? I don't know. You can always sort of ponder, like, what if. I think it's certainly possible, but I think it's such a long ways away, how can we derail it from getting that far, I guess, in some ways. Sure, sure. So you were working on embedded stuff at Deere. You said you worked with this other company in the medical space as well.

Unknown Speaker :

InVenture? Yeah.

Justin Grammens :

Where did it go after that? When did Dojo5 start, and what was the impetus behind that?

Joe Schneider :

Yeah, yeah. So Dojo5 was really me starting to get a few little embedded contracts again that I was kind of freelancing on, and then just realizing that, you know what, this is an awesome opportunity. And InVenture, those guys, I wish them all the best, they're doing amazing work there, but my love was still in that embedded space. And so I slowly kind of moved on and focused on the embedded stuff. And I started getting a lot more work and just kind of realized my passion, and said, like, hey, this is gonna be an opportunity for me to start building a company around this, basically a tribe of people who believe in the things that Dojo5 is now kind of embodying, which is really, like, modernizing embedded. Mobile really didn't exist more than 10 years ago, you know. Basically, before the iPhone, it was flip phones, and there was firmware going on those, but not much. There wasn't much of app development even happening. So it's kind of amazing to think about that. And, well, yeah, we had web back in the 90s, but it was almost something completely different than what we have today. But both of those have really just innovated so hard. And actually, embedded predates both of those, and it has moved slower in terms of tools, maybe because it predates them, I don't know, because you've got kind of this establishment, you've got these, you know, these folks that kind of got dug in, and it's kind of the COBOL of its day, I guess. It is, yeah, a little bit. But we see this huge wave of IoT coming, and even, like I hope we get to here, you know, AI on all of these IoT devices, and capturing all this data, and what do we do with it, and how does that provide value to society and to humans, you know. And I think what I see is just this huge gap between the innovations that have happened in mobile and web and the innovation that needs to happen in the embedded space.
And so there's just all kinds of, like, we almost have the playbook written for us. Like, well, this is how web and mobile have done it, and they've found these, like, great solutions, and here's a giant gap on the embedded side, so let's fill it up. So that's where we found a lot of value with our clients, is helping them. They're realizing more and more that they're differentiated by software, the same journey that Deere went through, where, you know, it used to be hardware and tractors and how strong is your steel and how big is your bolt and how much horsepower you got. And that, you know, has largely become just table stakes in having a tractor, and now the differentiator is really all of these smarts, and how can you give people superpowers, basically. Give these farmers superpowers. They can't have a 100-acre farm anymore and make a living, they need 1,000 or 2,000 or 5,000 acres, and now how do you scale your farm operation and still stay profitable? That's what all these superpowers give. And so that's largely what we are doing, is we're trying to be the guide for all these companies, to say, like, you've got this increasingly hard problem on your plate, which is the complexity of all the software that's going into your devices, some devices which didn't even have software before. And now you want to add software, firmware, connectivity to them, and now it's only getting more and more complex. We try to act as a guide to help them build systems that are going to be maintainable, going to provide a good solution, and also bring their teams the tools and the knowledge so that they can kind of own it over time, so that they don't get stuck with us as an appendage to this device all the time. So there's this whole learning and growing piece to Dojo5. That really represents the dojo piece, and then the five piece: my family has five people in it.
I've got three awesome daughters plus my wife and me, so there's really this family feel around Dojo5 of trying to treat our clients and our employees and our community as family, with this intention of bringing modern firmware to the world.

Justin Grammens :

Perfect. Love it. That's cool, man. Very, very cool. And as you bring this modernized firmware to the world, you're making products smarter for people. Do you have a recent project, I guess, where you guys have worked? It doesn't even need to be AI per se, but anything you wanted to share that you've done that makes these things a little smarter for people?

Joe Schneider :

Yeah, yeah, I mean, lots of different things we've been working on. I guess what we're seeing a lot is projects that are connecting people to things that they love, or things that they want to maintain. I think there's a lot of data in business that's untapped, and we're finding a lot of opportunities. Particle in the last couple of weeks announced their asset tracker, right? That is still so valuable: can I open my phone, or open my web-based dashboard as a business owner, and find where my stuff is? At the recent IoT Fuse discussion on smart construction, Mortenson, the big, huge construction company, was saying, what was it, "Dude, where's my nail gun?" That was one of their internal things, where they were saying that finding the equipment on the jobsite is still a huge problem and represented a huge opportunity for them to improve efficiency. So I feel like we're still in v1 of that, you know? It's just, where are things? Sure, sure. We're not even getting to the point of asking, are we using those things efficiently? If we always had a nail gun instantaneously at our fingertips, are we doing things in the right order, do we have the materials coming at the right times, and so on for some of the other kinds of construction setups. That's what we continue to see, too: there are literally ground-level needs that are pretty basic, and we're just at the tip of the iceberg.
And as far as AI in the embedded space, what we're seeing is that it's getting applied a lot to speech synthesis and speech recognition, and it's also getting applied to computer vision: all tough problems that make these devices easier for humans to relate with and interact with. It probably goes back to a lot of our early conversation, where it's like, these things aren't taking over; we're making them smarter so that they can interact with us more easily. We're tailoring them to us. And so, while those applications have found a lot of success with AI, I do think we're on the tip of something in the next couple of years, where we're going to start to see AI applied to a lot of other situations that are just really ripe for it. Now with TensorFlow Lite Micro, which I'm going to be talking about next week at the Applied AI meetup, we're starting to see the opportunity to have battery-powered devices that are running these things. They can't do vision, but they can do some amount of audio streaming. And if you start to get into things like predictive failures of equipment, that's another big one: let's just measure these things and figure out when they're going to fall over, and can we predict and know that ahead of time. I think we're going to see a lot more of that. There was a guy on the Embedded FM podcast, which is another great one for anyone who's interested in embedded, and they interviewed the TinyML guys. He was talking about this really hard problem of poachers in the jungle, where they're trying to do things like gunshot detection, right? Or even finding lion populations, or other things.
How do you do that in a way that's going to meet the budget of a nonprofit just trying to scrape by, but also have a big impact for the world? AI is going to start to unlock some of those things, because instead of having to ship data up to the cloud to number-crunch, we're starting to get to the point where we understand the models well enough that the device itself can detect a lion roar. Or imagine a baby monitor that isn't on all the time, but only when your baby's crying. That's what I want. I don't want to listen to that background noise. As I'm saying this, I have an 18-month-old, or 20-month-old I guess, and we have that baby monitor on all the time, and I'm just realizing all I really care about is: is she crying or not? If she's fine, then she's fine. So I think we're on the precipice of that. You also see a lot of stuff with COVID; somebody was talking about a cough detector, right? Could you build some cough detection into some kind of monitoring and figure out some kind of spreading metrics for viral infections? There are a lot of little things like that where, as I've said, it's not object detection, it's not driving a car, it's not some of these grandiose applications of AI. It's these little things where all of a sudden you get a swarm of these sensors, and the data collected in aggregate is really adding a lot of value.
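To make the baby-monitor idea concrete: the on-device trick is to compute a cheap feature over each window of audio samples and only act when it crosses a threshold, so the radio (and your attention) stays off the rest of the time. Here's a minimal sketch in Python of that pattern; the window size and threshold are illustrative values, not from any real product.

```python
import math

def rms(window):
    """Root-mean-square energy of one window of audio samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def detect_events(samples, window_size=160, threshold=0.3):
    """Return indices of windows whose energy crosses the threshold.

    On a real device this loop runs continuously, and the radio only
    powers up when an event is produced; everything else stays local,
    which is what keeps the battery budget small.
    """
    events = []
    for start in range(0, len(samples) - window_size + 1, window_size):
        if rms(samples[start:start + window_size]) >= threshold:
            events.append(start // window_size)
    return events

# Quiet background with one loud burst in the middle.
quiet = [0.01] * 160
loud = [0.8] * 160
stream = quiet * 3 + loud + quiet * 2
print(detect_events(stream))  # → [3]: only the loud window is flagged
```

A real lion-roar or cry detector would replace the RMS threshold with a small neural network over spectral features, but the control flow (compute locally, report rarely) is the same.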

Justin Grammens :

No, that's great. There's a lot to unpack in the things you've been talking about; it kind of spans the gamut. You've got companies like Deere that started experimenting with this stuff 20 years ago now, and it just takes a long time to move things forward. As technologists, it just feels like, why aren't we here yet? It should be pretty "simple," quote unquote, to solve. Why do we not have trackers? Why is a tracker device a new thing? It's 2020, for God's sake; we've been able to track things for 20 years. Why are we just now rolling these devices out? It just takes forever, in some ways, for the right business applications to suss themselves out. And then there's also probably the number of concerns like price points, and how do you manufacture it, and all that type of stuff. It just feels like it takes a long time for us to finally get something done.

Joe Schneider :

Yeah. I think the cellular connections for these devices are really coming down in cost, and now suddenly you can imagine devices being sub-dollar, or even sub-50-cents, to be on a network and shipping up just a little information. IoT 1.0 was kind of like, well, we'll have a microphone, and then we'll just send the entire audio stream up to some giant server in the cloud, and we'll number-crunch up there. But I think people quickly realized that there's a whole class of problems that will never get solved that way, because the data cost is just too much, and the number-crunching cost in the cloud is too much. So you get all this edge computing, and then it's just, tell me if you hear a lion. Otherwise you just stay silent, and you can have a small battery, you might be able to be solar-powered, all that.
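The "data cost is just too much" point is easy to see with back-of-envelope arithmetic. A quick Python sketch comparing monthly payloads for raw streaming versus edge-filtered event reports; all the numbers here are illustrative assumptions (16 kHz mono audio, a handful of tiny event messages per day), not real device specs or pricing.

```python
SECONDS_PER_MONTH = 30 * 24 * 3600

def monthly_bytes_streaming(sample_rate_hz=16_000, bytes_per_sample=2):
    """Raw 16 kHz / 16-bit mono audio shipped to the cloud, IoT-1.0 style."""
    return SECONDS_PER_MONTH * sample_rate_hz * bytes_per_sample

def monthly_bytes_events(events_per_day=5, bytes_per_event=32):
    """Edge inference: only a tiny 'heard a lion' message leaves the device."""
    return 30 * events_per_day * bytes_per_event

raw = monthly_bytes_streaming()   # ~83 GB per month over cellular
edge = monthly_bytes_events()     # a few kilobytes per month
print(raw // edge)                # → 17280000: streaming ships ~17 million times more data
```

At cellular data prices, that ratio is the difference between a product that can exist on a nonprofit's budget and one that can't.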

Justin Grammens :

Do you remember the guy, his name is Topher White, and he keynoted one of our IoT Fuse conferences, probably three years ago now? He has a company called Rainforest Connection. Oh yeah, I'll put it in the liner notes. They were basically using old cell phones to detect logging in the rainforest. The same idea applied, right? You don't want the cell phone running all the time; it really only listens for certain pitches. So they had built their own AI and all that type of stuff and pushed it out to these devices, and they'd be listening only for the certain things they wanted, which was awesome. And again, that's years ago. I saw his name pop back up recently; he's still doing it, they're still in business, and they're doing some really, really cool stuff. It just takes some time.

Joe Schneider :

Yeah, because it was like, detect a chainsaw in the cacophony of the jungle, right? Not an easy problem. Yeah, right, exactly. And not something we want humans doing, right? Again, I think those are the great applications of the AI stuff. One thing that we've recently been working on that's pretty interesting, too, is a robotics application where we needed to be able to detect the difference between hitting a soft object, like an arm or a leg, and hitting a hard object, like a wall or a table. Again, it's a really tough problem to build some kind of heuristic algorithm around, because the sensing we have available to us is really just the current from the motor. For those that don't have an electronics background: when the motor starts to hit an obstacle, the current spikes; it starts to use more and more current. So you can monitor that current, and a first-level, first-stage approach might be to say, well, we're not going to be able to detect what the obstacle is, but if there's an obstacle there, we'll shut down the motor, or back up, or whatever you need to do for your robot. But being able to detect from that current signature what we've run into, and then react differently, without any additional hardware: there you're starting to see the power of AI, because it's just a software feature that could now be duplicated on all of these robots. You think about some people in the Cities, even, who have worked on things like monitoring water usage or power usage and then characterizing it: is that a toilet flush, or did I just turn on the microwave? Some of those folks have been playing around in the AI world for a little while, and again, I think those are starting to come even more of age; those models are only getting better,
where we can start to really gain some real insights into energy usage and resource usage. I think it's pretty exciting.
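Joe doesn't spell out Dojo5's actual model, but the shape of the obstacle-detection problem he describes can be sketched with a toy stand-in: pull a couple of features out of the motor-current trace (peak value and steepest sample-to-sample rise) and separate "hard", "soft", and "free" with hand-set limits. The point of the trained model is that it learns these boundaries from data instead of having them hand-tuned; every number below is a made-up illustration, not a real robot's calibration.

```python
def features(current_trace):
    """Peak current and steepest sample-to-sample rise of one trace."""
    peak = max(current_trace)
    rise = max(b - a for a, b in zip(current_trace, current_trace[1:]))
    return peak, rise

def classify(current_trace, peak_limit=2.0, rise_limit=1.0):
    """Toy stand-in for a learned classifier: hard obstacles tend to
    produce a sharp, high current spike; soft obstacles a slower ramp;
    anything below the limits is normal motion (joint friction, etc.)."""
    peak, rise = features(current_trace)
    if peak < peak_limit:
        return "free"
    return "hard" if rise >= rise_limit else "soft"

wall = [0.5, 0.6, 1.8, 3.2, 3.5]    # sudden spike: hit something rigid
arm  = [0.5, 0.9, 1.4, 1.9, 2.4]    # gentle ramp: pressing into something soft
idle = [0.5, 0.55, 0.6, 0.58, 0.6]  # normal friction variation
print(classify(wall), classify(arm), classify(idle))  # → hard soft free
```

Joe's point about false positives follows directly: friction, joint orientation, and load all shift these traces, which is why the hand-set limits break down and a model trained on lots of labeled traces wins.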

Justin Grammens :

Yeah, I mean, in that particular application, where you said you're basically running into something and there's a spike, I guess the hardest thing is maybe getting enough data to make sure you don't have false positives going on, right? There could be other reasons why these spikes occur. Is that correct to say?

Joe Schneider :

Oh, yeah, sure. I mean, even the robot itself, right? The friction between the different joints and such can cause some spikes as it's moving. And then, of course, there are actual obstructions, and even things like the orientation of the joint on the robot can affect the way that current is drawn. So that training data has been huge for us in that application: making sure we give it enough training data and build the model right, so that we can correctly detect these events in this application.

Justin Grammens :

I was just thinking about a guy, and I don't want to change the subject too much, but he's working a lot in the industrial IoT space, and they do a lot with motor monitoring. The holy grail, he said, was trying to do these predictive algorithms: before the motor dies, it's going to have this sort of signature. They've been working on this problem for a long time, and it's by no means solved, mainly because there's just so much noise, right? It's so hard for them to pinpoint the true signal and trace it all the way back, because oftentimes these things need to run for years, maybe even a decade, without dying. So they don't even have enough data to prove when the thing is going to fail.

Joe Schneider :

Yes, there's a ton of data that they're pulling in, but it's still a really hard problem to solve. There are folks having success in that, though. There's a company actually in town called Boon Logic, I don't know if you're aware of them. They have unsupervised machine learning on devices at the edge, and it's pretty interesting. I don't have any particular tie to them, but they are doing some interesting stuff there, and they're not the only one. I think there's a lot of really interesting technology coming of age where we could do that. Even in their case, they've got technology that allows a device to learn on the fly in the field, where you're not doing a single training set. That's really cool stuff, too.

Justin Grammens :

Yeah, for sure. I'd be curious to know: is it very specialized to a particular type of motor or a particular type of solution? Because that's been the problem a lot with AI. It's been so specifically focused: it's very, very smart at that one thing, it's great at that, but the moment you feed it something else completely, it's wrong.

Joe Schneider :

Yeah, I think we're finding ways to get around that. But I also think that's kind of the crux of what we were talking about earlier, about why I don't think AI is going to take over and kill us all and replace us. Because that's what we're finding about these things: it's really easy to train one to do one thing, but to train it to be human, a whole lifetime of experiences? I agree with you. I'm kind of on the fence, too, about the whole sentience thing, but that's orders of magnitude a harder problem than where we're at now.

Justin Grammens :

It's definitely going to be interesting to see how things play out in this space. I did see a video recently where a computer actually coded: they gave it some high-level outputs that they wanted to have, and the computer wrote the whole program. I'll include that in the liner notes, but I thought it was interesting. It had learned how to stitch together a series of statements to get the output it needed. It was kind of cool.

Joe Schneider :

Well, yeah, what I've seen is that whole fuzzing kind of thing, where it just throws a bunch of random data at the problem, but it knows the goal, right? It basically can iterate so fast that it's almost like evolution playing out in code. Most of the time it gets solutions that aren't anywhere close, but if it slowly anneals to something that suddenly works, then okay, now we've got it. But imagine building humans by random trial and error. Holy moly, you're basically reproducing millions of years of evolution or something. I don't know, maybe it's possible, but it seems like we're far away from that.

Justin Grammens :

Lots of compute cycles, for sure, to do that. So, before we wind down the conversation here, I wanted to give you time to maybe talk about: do you have any classes or books or other resources you might recommend for people getting into this area, whether it's embedded or AI? You're in education, obviously. Where do you suggest people

Joe Schneider :

go? I think the Arduino stuff has been huge for embedded and education. It has never been more accessible than it is now. It's lowered the barrier so much that the big silicon companies, which in the past have usually been pretty opaque and tough to understand and get into, now put Arduino shield footprints on their dev kits, so you can plug and play all of this Arduino stuff onto the dev kits from these big silicon manufacturers. So it's kind of come full circle. Arduino was kind of this underdog, kind of just for education at first, and now you see professionals prototyping with Arduino, and in some cases even getting to friends-and-family types of rollouts with Arduino, or even the Raspberry Pi stuff. So I think there's so much around that Raspberry Pi and Arduino world for getting into embedded that it's almost a no-brainer answer. They found a good way to pull together a lot of different technologies into a very easy-to-understand base-layer setup. So you can get into it, you can get a light blinking really quickly, which is like the "hello world" of embedded, blink a light, and then move on very quickly from there. At Dojo5 we're doing some learning activities on the Arduino Nano 33 BLE Sense, which is a board they've recently released, and some of the TinyML guys, the TensorFlow Lite guys, have ported the TensorFlow stuff to that board. So very quickly I can have a 64-megahertz chip, which, okay, back in the day was nice and now is just okay, but this thing can run on batteries, and it can understand whether I'm saying yes or no. That comes out of the box with the Arduino stuff, and they walk you through how to train it for your own voice versus the generic model they've created.
There's another example where you can wave it around like a wand and it can do gesture detection. Again, a really hard problem if you're just writing a heuristic, right? But with training, you just train it a couple of times, and it's really easy to go through. So I think it's just so easy to get into, whether it's AI or just embedded in general. Those are the two places I'd head. And Arduino has honestly exploded now. It used to be that one Uno board; now they have, I don't know, a hundred or something, and you're getting into FPGAs and cellular and everything.
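The wand example shows why "just train it a couple times" beats writing a heuristic: you record a few examples of each gesture and let the classifier generalize. The Arduino demo Joe mentions uses a small TensorFlow Lite Micro network; the sketch below is not that, just a tiny nearest-centroid classifier over made-up 2-D accelerometer features, to show the train-then-predict shape of it.

```python
def centroid(vectors):
    """Average feature vector of one gesture's training examples."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples_by_label):
    """'Train it a couple times': one centroid per recorded gesture."""
    return {label: centroid(vs) for label, vs in examples_by_label.items()}

def predict(model, sample):
    """Classify a new motion by the nearest gesture centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], sample))

# Illustrative 2-D features (e.g. mean |accel| on two axes) per gesture.
model = train({
    "flick": [[0.9, 0.1], [1.0, 0.2], [0.8, 0.15]],
    "circle": [[0.3, 0.8], [0.2, 0.9], [0.25, 0.85]],
})
print(predict(model, [0.95, 0.12]))  # → flick
```

Swapping in a new gesture is just another entry in the training dictionary, which is the whole appeal over hand-written motion heuristics.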