Autism Labs
Practical tips and evidence-based guidance to make life easier for you and your severely autistic loved ones.
Part 1 AI for Autism Care: Daily Living Labs for Parents
In this episode of Autism Labs, Mike Carr speaks with futurist and researcher Mike Courtney about how AI and emerging technologies can be applied in practical, meaningful ways to support individuals with complex special needs. The discussion centers on redefining technology as the application of knowledge to solve everyday human challenges, not just advanced digital tools, and introduces the concept of “Daily Living Labs,” a community-driven approach focused on improving activities of daily living such as communication, hygiene, and independence.
The conversation highlights how caregivers, professionals, and technologists can collaborate to both identify existing solutions and create new ones using simple tools like smartphones, AI-generated learning videos, and non-wearable sensors that track sleep and physiological signals. These technologies can help build personalized learning systems, improve skill development, and provide deeper insights into behavior by identifying patterns tied to sleep, environment, and emotional triggers. Ultimately, the episode emphasizes a future where continuous, passive data collection and AI-driven insights support proactive care, enabling smarter daily planning, better outcomes, and a connected ecosystem where families don’t have to solve challenges alone.
If you're interested in joining our private Facebook community for parents and caregivers seeking residential options, guidance, and peer support for profoundly autistic adults or adults with complex needs, click here!
Mike Carr (00:05):
Well, welcome everyone to another episode of Autism Labs. We have a really special guest today, someone that I've known for many years. He's a futurist, he's a university professor, an angel investor, a successful entrepreneur, and the list goes on. But I want Mike Courtney to tell you a little bit about himself first, and then we're going to talk about how to apply technology to really help those with complex special needs. There's a lot of tech out there today. There's a lot of AI. There's a lot of this and that. And the challenge is, well, how can I really use it? Will someone please tell me, what are some real-world applications that are going to be applicable to me and my kiddos today? So with that, Mike, please introduce yourself.
Mike Courtney (00:43):
Sure. So we have known each other for a minute, haven't we?
Mike Carr (00:47):
We have.
Mike Courtney (00:48):
And it's been great. So my background, basically, I'm a marketing researcher and a futurist, which simply is to say that the marketing research is surveys, focus groups, ethnography, things focused on now, and the futurist part is simply looking at what the possibilities of the future might be. We don't predict anything. I'm not Nostradamus, I'm not Carnac, but what the possibilities are. And I love what you're talking about in terms of how do we actually use technology? But I think the first thing we need to do is reacquaint ourselves with the definition of technology: it's the application of knowledge and science to solve human issues, whatever it is we wish or need. And I think in today's world, everybody's talking about AI, robotics, and those kinds of things, but we can't forget the fact that things like AI will help us invent physical items that help our daily life.
(01:36):
And it doesn't always have to be an electronic thing, but we can use this tool called AI to help us think through the implementation, the strategy and possibilities that we can in the physical world bring to reality.
Mike Carr (01:50):
So one of the things that we talked about, I don't know if it was a year or two ago or months ago was a concept that you brought to the table called Daily Living Labs. And if you could just tell the audience, the viewers, the listeners, what that is and maybe your vision for that, just anything to bring that alive so people understand what we're up to.
Mike Courtney (02:09):
Sure. So Daily Living Labs is the name that we came up with to basically form a community for caregivers and individuals with disabilities to focus on the activities of daily living, what we call ADLs. Because while there are lots of organizations out there that are specific to a condition or a constraint, whether it's autism or cerebral palsy or MS or Alzheimer's, and they do great work, they're not innovation groups most of the time. They don't really have an innovation arm. And Daily Living Labs' purpose is to help people find the things that exist that maybe they're not aware of but would help, build or invent the stuff that doesn't exist but would help, and then integrate all of it so it doesn't get in each other's way and it actually is something manageable. So that's the idea behind Daily Living Labs: for the community to surface the issues that they have individually or in groups and clusters, and have the community pay it forward and say, "I don't have that issue, but I have an idea.
(03:09):
I don't have that issue and I didn't have the idea, but I can help bring that to life and make it real, because maybe I have some skill. Maybe I'm a woodworker or a welder or a 3D printing person or whatever it is that can play a part in helping improve things for others."
Mike Carr (03:25):
I know you've worked with several universities. I know that you've got connections with some folks in Georgia and more recently you're working with some folks at University of North Texas in the Dallas-Fort Worth area. What have you seen in terms of just basic simple applications to solve real world needs? Because I think some of the parents are just interested in, I don't necessarily have the bandwidth right now to take on a heavy lift when it comes to perhaps using AI and going through a learning curve, but I know you've come up with some things and some ideas that seem pretty straightforward and pretty obvious once you hear about them, but if you've never heard about them before, it's not that obvious. Do you have examples of one or two things that come to mind there?
Mike Courtney (04:09):
I think we've all observed throughout life that when we look at things with a different lens, we see things in a different way. And sometimes those of us that are in the weeds can't see the forest for the trees. So those people who are caregivers and/or individuals with disabilities say, "I'm just trying to keep my daily life working and things moving along." Somebody else, an outsider like me, looks at it and says, "Why hasn't somebody solved this, or why doesn't somebody just do that?" And sometimes people respond with, "Because that wouldn't work." But sometimes they're like, "Oh, we hadn't thought of that." And that's really the inspiration behind Daily Living Labs. Along the way, when I've done research for various clients, research into wheelchairs, research into blind or low vision, hearing disabilities, all these research studies I've done throughout my career, I ended up having little extra comments at the end going, "Hey, I know we just talked about this for a study, but by the way, I noticed this other thing," and I'd come up with ideas. And sometimes those ideas were valuable.
(05:06):
So what we've done with the universities this past year is some of the foundational research, just to try to see: does this work? Can we actually interview or observe people that have challenges and go, "Oh, we noticed this, this, this. What about this as a solution to a piece of it?" And so one of the things that we did this past semester with Professor Kasini and the Texas students was just that. When we interviewed you and Kay about your son Michael, you said, "Hey, he reacts really positively and is really engaged when he sees himself on a video screen." And so we got the idea: hmm, maybe we could use AI and technology to take his image and create images and videos of him doing things, maybe things that you want him to do, skills you want him to learn or be open to experiencing. And we created it.
(05:53):
Now, granted, you saw it's sort of version 1.0. But the great thing about technology right now is it's moving so quickly that version 1.0 can very quickly evolve to version two, three, four, five, six, and ten, and go from "It's okay, it's sort of rough around the edges" to "Wow" in a very short period of time, again, using the power of a community that's all growing in the same direction.
Mike Carr (06:15):
Yeah. And just so the folks listening get an idea of how this might work in version 2.0: all you need is a smartphone. That's it. And who doesn't have a smartphone in your household? And you need a picture, one picture, maybe a few pictures, but let's just say one picture of your special needs teenager or special needs adult child, and if you want to, maybe some audio, not a whole lot. And if they're nonverbal, you say, "Well, how am I going to get audio of my son or daughter who really doesn't talk?" And our son's sort of in that bailiwick, he has very few words, but over a period of a day or two of intentionally recording, we were able to capture enough of how he sounds and how he would speak. And you load this into this app that Mike and his students have developed, and now all of a sudden he can watch himself say words in his voice that he hasn't yet spoken.
(07:00):
So from a speech therapist's standpoint, teaching language and building the smarts into the app, so it actually can listen to him say a sound and make a suggestion, or repeat it back to him in a different way, or open the mouth so that he sort of can mimic himself in the way his face contorts or his mouth changes shape or his tongue is positioned differently, is sort of cool. And then the next step, Mike, that you showed us, which was also very exciting, was to take that into a video. So with text prompting, you can create a scene where your son or daughter is watching themselves do something that they haven't yet mastered. That skill might be folding clothes properly, which they've never really mastered, but that's an important skill when it comes to independent living down the road: that they can do basic dressing needs, how to put on socks properly, how to apply deodorant properly, how to brush teeth properly. And watching themselves do things that they've never really done before, but they're seeing themselves do it.
(07:57):
And then they're getting the cheers and the clapping and the encouragement in the background, which to us was really cool. And another thing you suggested to us, and I'd like you to talk a little bit about this, is the Aqara sensor, a wireless sensor that we now have installed above his bed. Do you want to share with the audience a little bit about that, or any other related tech in that space?
Mike Courtney (08:15):
This comes back to the idea that for the past year or two, I've been telling people that when they encounter a frustration or a challenge, or just something that they wish were better, the first thing we need to do mentally is just believe that it's already been solved. We just haven't found the solution, but it exists. It's out there. And don't be surprised that the majority of the time you find it's true. You look for something and you're like, "Wow, it does exist." And this was one of those situations. Somebody else had described, "Hey, I wish I had this or that." And I'm like, "Well, maybe it exists." And we looked, and lo and behold, it did. And that sensor is one of the things we looked for. Somebody says, "Well, I wish I could detect these things, but the individual in my life isn't very compliant, doesn't want to wear something.
(08:55):
They don't like something on their wrist or on their leg." And we're like, "Okay, well, maybe there's something that doesn't need to be worn that could still detect those items." And in fact, at this past Consumer Electronics Show, CES, which happens every January out in Las Vegas, and which I've gone to for decades, all the latest technologies are on display for all the buyers and retailers of technology. I go and I look around, and I saw things like that that are really amazing, Michael, in what they're able to do: sensors that can, at a distance, do things like detect your heartbeat, respiration, and blood pressure without having you wear a thing. So we've found companies that have these technologies. Again, most people don't know they exist. They're like, "Well, it'd be great if, but it doesn't exist." Well, actually it does. And so we can put all these things together and now we can create a system.
(09:42):
We can create a home that basically is a caretaker, is a caregiver, even though it's the home. The home basically comes alive and can watch things, remind, warn, notify, guide, communicate. Because maybe it does create a video of what looks like you as an individual to remind you: now put the socks on first, pull them up to here, then put the shoe on, then tie the shoe. It instructs them and reminds them. But what if in the future, in addition to that primer, that thing that teaches them how to do it, another camera would watch them actually do it and say, "Wait, wait, pull the sock up just a little bit more. Okay, you got it," just the way a person would? And technology is increasingly capable of doing the things that a person would do. You as a parent or as a family member could say, "Hey, Michael, pull the sock up." Nope, pull it a little bit farther.
(10:24):
Now put the shoe on. Wait, no, do this, do that. You can instruct. Well, AI can too, with the help of cameras and sensors and speakers, all these things that are relatively affordable, and when we put them together in a system, it's amazing what we're going to be able to do.
Mike Carr (10:38):
And you can start simple. I mean, one of the things that we did with the Aqara sensor that Mike was talking about, which is very basic, and the technology I think has been replaced and improved upon, is this. You say, "Well, why do you care about what one's heart rate is or pulse rate is or whatever without them wearing a device?" Well, we put it over his bed and it measures sleep, and any parent knows, any human being knows, that if you get a good night's sleep, you're going to feel better, you're going to have a more productive day, you're probably going to have fewer sessions of stress or concerns throughout the day. And with our son, who's nonverbal, severely autistic, has seizures, pickup behavior, all kinds of other things, we're trying to correlate: okay, good night's sleep, he has better behavior. Bad night's sleep, we better put the team on alert.
(11:21):
Hey, he may have a rough day. He didn't just not get his typical eight hours, nine hours. He woke up a couple times and it was a rough night for him. So it lets everyone know ahead of time how to set expectations and maybe how to make adjustments throughout the day, just based upon that one metric. And then what Mike's talking about, which is so exciting, is you couple it into a database of sleep patterns and how to mitigate or how to accommodate behavior, with the tool, the robot, the AI, your partner, if you will, making suggestions. So say he had a bad night's sleep, but the weather's gorgeous outside today, and your AI partner knows that, and also knows that Michael enjoys walking outside and playing basketball outside. So it comes back and says, "Well, based upon his sleep and maybe some other metrics that we're tracking, like what he ate and some other stuff, here's maybe a protocol or some adjustments to today's schedule that probably are going to make a successful outcome for the team much more likely, because it's nice outside.
(12:21):
So everybody wants to be outside and he had a rough night, so let's just have him walk more and do less mental exercises or vice versa. Mike, I know in some of the work that you've done with other parents and some of the things that you've talked about with the Texas students, you've got some visions and some dreams and some approaches. Are there any things that come to mind that, hey, even though we're maybe not there today, here's where I see the future going with an example or two that might excite some of the people that are listening or watching this.
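The sleep-to-schedule idea described above can be sketched in a few lines of code. Everything here is illustrative: the metric names, thresholds, and suggested activities are assumptions made up for the example, not clinical guidance or an actual Daily Living Labs product.

```python
from dataclasses import dataclass

@dataclass
class NightSummary:
    """Simplified nightly metrics a non-wearable bedside sensor might report."""
    hours_slept: float
    times_woken: int

def suggest_day_plan(night: NightSummary, weather_is_nice: bool) -> list[str]:
    """Rule-of-thumb schedule adjustments based on last night's sleep.

    The 7-hour and 2-wakeup thresholds are illustrative assumptions.
    """
    rough_night = night.hours_slept < 7 or night.times_woken >= 2
    plan = []
    if rough_night:
        # A rough night means lower demands and an alerted team.
        plan.append("alert team: expect a harder day, lower demands")
        if weather_is_nice:
            plan.append("swap a table task for a walk or outdoor basketball")
        plan.append("shorten mental exercises; add extra breaks")
    else:
        plan.append("follow the typical schedule")
    return plan
```

A real system would pull these inputs from the sensor and a weather API automatically; the point is simply that a single metric plus a couple of rules already yields an actionable morning briefing for the care team.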
Mike Courtney (12:47):
At the end of the day, we want to build a community, a community that is a place where people can go in search of answers, in search of help, things that they can't do on their own. And in some cases it will be just to help them find things that already exist. In some cases, the things they're looking for, the things they really need, don't exist yet. But the community is really diverse. If we look at all the people that have somebody in their life, a son or daughter, an aunt, an uncle, a brother, a family member, a friend, that has a challenge, and then look at who their community is, look at who their family is: we've got doctors and engineers and lawyers and designers and software and hardware people, you name it. The community has everything it needs to address the issues. They just have to connect those dots.
(13:29):
So I think the exciting thing is to build a community that can connect those dots, and not only be able to detect the normal patterns that influence what our day or night is going to be like and figure out how we can course correct, so we can say, "Hey, somebody had a bad night's sleep, so be aware, maybe do these things," but then eventually say, what led to the bad night's sleep? And can we nip that in the bud? Can we say, "Hey, this combination of things means that this person may not sleep well, let's course correct it before they even get to bed"? Oh, wow. I mean, because we all have those days, right? We all have those things where the straw that breaks the camel's back might be something relatively small. You're like, "Why did that ruin your day?" It's like, well, to be honest, it wasn't that item, it was the 10 things that led up to it.
(14:12):
So if we can eventually have all the data and be tracking all these things and say, "Hey, of the 10 things that normally lead to a bad night's sleep, we're at item number seven. We're three steps away from having a bad night's sleep. Can we stop it now and maybe have, maybe not a perfect night's sleep, but a better night's sleep?" Having all that data available so we can course correct and adjust is, I think, an exciting thing to be looking forward to. If we can build a community, have the technology, learn the patterns, and then take those solutions, those recipes, so to speak, and say, "Hey, it worked over here. Here are recipes that we can try over in this other environment for these other people." Some will work, some won't, but it's additional things to try without that family having to build it from scratch and reinvent the wheel, to just take it and say, "Apply it."
Mike Carr (14:56):
" Yeah. And one of the things you've mentioned several times that we've talked about a lot and I think parents are very interested in is you have to have the data. You're not with your son or daughter 24 hours a day, right? I mean, the therapists see them, you might have different therapists or different colleagues depending upon how much help they need, how much work they need throughout the day. And so no one person necessarily sees everything that's going on throughout that day. And even if they do, if there's a crisis moment in a day program and your son or daughter's with a bunch of buddies like our son often is and someone has a meltdown, well, the data collection, if it's manual, sort of stops at that point because it's all hands on board to make sure that the person doesn't pull their pants down in public or doesn't start screaming loudly and disrupt everyone at the restaurant or whatever it might be.
(15:42):
So one of the things I know you've worked on, and we've talked about, is making the data collection more automatic and more in the background. So there's the sensor that monitors and records heart rate and perspiration and respiration and pulse and blood pressure without you having to do anything, or the camera that's always on, whether your son or daughter's wearing a little pendant or glasses or just a button, and it's always hearing what your son or daughter is hearing and seeing what they're seeing, and then correlating that to spikes in heart rate or unusual perspiration or an event that happens, so that AI can then go back and parse the video. What did they see? The audio: what did they hear? Oh man, when they heard that, that heart rate really went up. That either caused tremendous joy or tremendous anxiety.
(16:38):
Let's let AI listen to it and draw a conclusion, and maybe draw our attention to it so the team can then listen to it and see if we agree, or let's see what our son or daughter saw. And oh my gosh, no one saw what they saw, no wonder they went off, or no wonder they ran over there, or no wonder they got scared. What are you most optimistic about, having gone to CES for so many years and just staying in touch with technology? What do you see on the horizon that probably is going to be available and affordable sooner rather than later, versus things that you're just as excited about, but they may be a little bit further on down the path in terms of timing?
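The correlation loop described above, flagging moments when heart rate spikes and pulling the surrounding audio or video for human review, can be sketched simply. The 1.3x-baseline threshold and 30-second review window below are illustrative assumptions, not values from the episode.

```python
def find_spikes(heart_rates, timestamps, threshold=1.3):
    """Return timestamps where heart rate jumps well above its average.

    heart_rates: bpm samples; timestamps: parallel list of seconds into the day.
    The threshold multiplier is an illustrative assumption, not a validated figure.
    """
    baseline = sum(heart_rates) / len(heart_rates)
    return [t for hr, t in zip(heart_rates, timestamps) if hr > baseline * threshold]

def review_windows(spike_times, pad_seconds=30):
    """Turn each flagged moment into a (start, end) clip to pull from the recording,
    so a human (or an AI summarizer) only reviews the moments that matter."""
    return [(max(0, t - pad_seconds), t + pad_seconds) for t in spike_times]
```

In a fuller system, the clips returned by `review_windows` would be handed to a speech or vision model to describe what the person saw and heard at that moment, but the detection step itself can be this small.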
Mike Courtney (17:12):
Sure. So a couple of things you said really resonate, and I want to touch on them. As we collect data with all these new tools and technology that are available, in some cases we can get confident enough to say, "Hey, when you see these things, here's what you're allowed to do." For the other things, surface it so humans can intervene: "Hey, escalate this so that a human can make the decision." Because at the end of the day, technology is there to serve us. We are the ultimate decider as to what it does or doesn't do. We can give it permission for some things, and for other things, no, let me know and I'll make the final decision. And I think that applies to things like healthcare and just what's going on around us, because there are so many things that happen in any of our lives, regardless of who we are and what our abilities are or aren't.
(17:52):
But I think one of the patterns we've seen over the past couple of decades is going from periodic measurement to continuous measurement. Many of us wear something today, an Apple Watch or some other device, that captures medical data that in the past you'd only get once a year at a doctor's visit, where they'd check your heart rate or your blood pressure once a year, maybe twice if you were back there for something. Now look at the benefits of just that: "Hey, I can see how well I slept every day. I can check my heart rate anytime I want." We now know more about what's going on, so we can fix it, and not just by saying, "Well, last year you were this, this year you're that." So I think that, applied to people with different needs, is really exciting, because every day can be a challenge, and the more we understand about the things that add up to a challenge versus result in a better day, the better. That's what we want for ourselves, and that's what we want for the people we care about: to be able to continuously monitor and surface the things that are important to a human, so we can intervene and help the people that we love that are in our lives have a better day.
(18:58):
So I think that continuous monitoring is going to be great.
Mike Carr (19:02):
Well, I want to interrupt right now because we're going to continue this discussion with Mike Courtney next week and we're going to talk more about his vision for the future, how to get ahold of him, some of the things that they're doing with respect to auditing the home and some other pretty cool things. So he's connected to multiple universities around the country and he's working with current students and families and looking for more families to join, no charge. So stay tuned next week. Come back if you're interested for part two of Daily Living Labs. Thank you.