
EDGE AI POD
Discover the cutting-edge world of energy-efficient machine learning, edge AI, hardware accelerators, software algorithms, and real-world use cases with this podcast feed covering all things in the world's largest edge AI community.
It features shows like EDGE AI TALKS and EDGE AI BLUEPRINTS, as well as EDGE AI FOUNDATION event talks on a range of research, product, and business topics.
Join us to stay informed and inspired!
EDGE AI BLUEPRINTS: Enhancing Urban Safety with Edge AI Approaches to Pedestrian and Traffic Management
What do rising pedestrian fatalities and the challenges of implementing intelligent transportation systems have in common? In this episode, we’re joined by Davis Sawyer from NXP and Mike "Witt" Whitaker from the City of Lakewood, Colorado, to unpack the complexities of applying edge AI to enhance urban safety. As Witt recounts his pioneering work with autonomous vehicles since the 1990s, we explore the fast-paced evolution of AI technology and its critical role in addressing real-world problems, such as the increasing need for pedestrian safety and efficient traffic management.
We take a hard look at the perplexing trend of fatal crashes involving impaired drivers despite a drop in overall traffic incidents. The conversation intensifies as we discuss the doubling of pedestrian fatalities, particularly at night, and the significant challenges posed by underreported impairment and the rise of fentanyl use. Our guests share insights into how collaborations with the police department and meticulous data analysis can reveal hidden patterns, driving the development of more effective municipal safety strategies.
Our exploration doesn’t stop there. We also discuss the advancements in detection technologies and the iterative process of refining systems like LiDAR and video to predict pedestrian movements more accurately. This episode offers a deep dive into the complexities and costs of deploying AI models in urban environments, including the importance of ML Ops, the challenge of model retraining, and the role of existing infrastructure in supporting intelligent systems. Join us as we navigate the intricate world of smart city solutions, focusing on the shared goal of safer streets for everyone.
Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org
Thank you. Okay, here we are, we're back. We're back, Davis. Welcome, Davis Sawyer from NXP. I'm Pete Bernard from the EDGE AI FOUNDATION. Welcome to our live stream today, Blueprints, right? So we're going to talk about some cool solutions. I wanted to do a couple of public service announcements for the folks that are joining us. Just as a reminder, this is a live stream, so you can put your comments in there, whether you're on YouTube or LinkedIn or Twitch, and we will see them here. And we're bringing Witt on from the City of Lakewood today, so we'll have some good conversations. But yeah, a couple of PSAs. One is coming up; I'll just show this up here. We have an event called Career Edge in January. If you hit that little doodah QR code, it's kind of mid-January. We're going to have kind of an ask-me-anything about how to build a career in edge AI, and Stephen Davis, who's an executive coach, will be there.
Speaker 1:We have a hiring manager from WeTech there, which is a cool startup in the AI space. It's hosted by 5V, so just kind of take a look at that. You can find it also on LinkedIn, but check it out on our Discord channel. So that's a little PSA there. The other thing is, if you have not yet registered for Austin 2025, that's February 25th through 27th, it's going to be great. It's our annual EDGE AI FOUNDATION event in the US: three days of presentations and demos and eating and drinking and arguing about edge AI. It's pretty awesome. Probably some good music there too, since we're in Austin.
Speaker 1:So, definitely go there. You can go to our website, you can register. There's a bunch of early-bird discounts, there's like student discounts, tons of good stuff. So now is the time to book. Don't wait until January or February to do it. Look forward to seeing you there. Cool, so those are my PSAs. So yeah, today we're going to talk more about pedestrian safety, right?
Speaker 3:That's it, we're moving forward. This is our second Blueprints show, so we had a bit of a hiatus. I mean, there's a lot of stuff happening in the world of AI at the edge, you know, computer vision, etc. So this is our second iteration of the show, and it's also our second installment in the smart cities genre. I think, after doing this Blueprints stuff for a bit, we've seen some challenges, some hot areas, and one of the things, Pete, that the working group has found is that there are some top verticals, some top industries, top markets where blueprints, that is to say, commercially deployed or near commercially deployed solutions focusing on a specific problem, are starting to mature to a point where we can talk about them. So I think that's exciting, and it's good to be back for a second show here today. And thanks, everyone, for joining us.
Speaker 1:Cool, yeah. And so let's bring on Witt. His full name is Mike Whitaker, but there's a whole backstory about why we call him Witt, so I'll let him do that. He's from the City of Lakewood, Lakewood, Colorado, and interestingly, he used to live in Bellevue, where I currently live. So, interesting factoid. Why don't we bring Witt here to the stage? Hello, sir, how are we doing today? Good, good to see you, Witt. Excellent. So you're in Colorado right now, Lakewood, Colorado, correct? Excellent. And you mentioned you used to be in Bellevue, so you've been doing this for a little while. You're kind of like a veteran of the whole city technology world, right? How long have you been in Colorado?
Speaker 2:I've been in Colorado about 11 years. I seem to have a history of staying 12 to 14 years at a job.
Speaker 3:They need you other places.
Speaker 2:Move on. I've been doing started out Washington State DOT and got early on into what we call intelligent transportation systems. I remember going down to San Diego and watching Buick try to, and got early on into what we call intelligent transportation systems, and I remember going down to San Diego and watching Buick try to drive cars autonomously down the freeway. You know, back in the early 90s, and so I've been fortunate to deal with a lot of cutting edge stuff.
Speaker 3:So, because you've been doing this for some time, and the ITS acronym was actually new to me on our first show, but it did come up when the City of San Jose talked about some of their work here: when did you first come across the term, or what we'd now call, edge AI? When did it start being branded this way? When did you see some of these solutions emerging, coming across your desk?
Speaker 2:I would say it's only been in the last couple of years, really, and it's kind of been interesting because I don't know if there's a real hard definition for it. It was machine learning, and then it just kind of rolled into AI, and I'm not exactly sure what the distinction between those two is, you know.
Speaker 3:Fair, yeah, and I think that's an interesting point because, as we were just talking about before jumping on, this technology changes so fast. At some level, people need this parlance, these common terms, to generally refer to something that maybe they both understand, but, you know, where there's enough commonality to describe it.
Speaker 1:Some of the things that don't change are the real kind of hard, vertically focused solutions, terms like ITS, and every industry has its jargon, but I think that's where edge AI is best adopted: when it's solving a problem that people don't describe as edge AI, if that makes sense. Right, yeah, exactly. And I can imagine in the municipal space, and you'll talk more about this, and we were talking a little bit about aerospace before this, the development cycles and deployment cycles can be quite long, and the technology moves really quickly. So it's sort of like, how do you find a solution that doesn't become obsolete too quickly, that kind of fits your timelines? I guess that's another challenge.
Speaker 2:I guess that's another challenge yeah, figuring out, uh, what your deployment is once you're trying to solve, um, as you'll see, we're dealing on a pretty complex problem here in in lakewood, but it's across the front range here in colorado, and going back to my time there, um, for those people in the puget sound area, um, you know, it's essentially aurora, a street street similar to Aurora that you guys have. If you're from Portland.
Speaker 2:If you're from Portland, Oregon, it's Terwilliger. If you're in Orlando, it's kind of Orange Blossom Trail. Every metropolitan area seems to have one. I know San Jose, which we talked about a little bit, is where one of the devices we looked at was originally beta tested and developed, and then we kind of reused them to test here. And as we talk more and more to other engineers across the country, it seems everybody's kind of fighting a similar problem in their area.
Speaker 3:Interesting, right? I mean, that's ideally the magic of Blueprints when it comes to these solutions, right? You have a transferable solution to specific domains that has some commonalities, but it's a repeatable process and, ideally, really can scale. And one thing I will say, Witt, and hopefully we'll find from the Q&A today, is that one thing I know about this community is we love complex problems. So I think you're in the right place, and our audience has come to the right place.
Speaker 1:Yeah, we love complex problems, so this is the place to be. Cool. Why don't we bring up your slides? And for the audience, Witt's kind of gonna go through some stuff, so please throw questions in as we go; Davis and I will be sort of hanging in the background and will jump in with questions as they come in. So I'll bring up Witt's stuff here, Lakewood, Colorado, and then maybe Davis and I'll hang back and give Mike the floor. Awesome. Sound good?
Speaker 2:All right, everybody. For those that don't know Lakewood, Colorado, we're a suburb essentially between Denver and the mountains here. This brownish area, which is really brown, is pretty much the plains and the cities, and then the Rocky Mountains. We're the fifth-largest city in Colorado, at just under 160,000 residents in our area. To get a little bit into why we're here: I've always looked to technology in my career to help us solve problems. I'm only as good as one person, but if we get systems out there that are doing stuff, it just scales the solutions and really addresses a lot of things. So I know we've got a lot of smart people here, and we're going to be looking for your professional expertise. But even outside of the work side, y'all drive, y'all ride a bike, y'all walk at some point, and hopefully what you see today will change some of your habits and you'll help contribute to solving the problem, even with your personal habits. And for some of you, it might make you think of some friends or colleagues. It's kind of an interesting time in our society, with a lot of political discord, a lot of stress with, you know, finances and stuff, and, as you'll see, people are looking for ways to escape. So that's kind of my tale of how the world is changing, and hopefully you become part of the solution, both professionally as well as personally.
So when we look at crashes in Lakewood over time, the green line at the top is our average daily traffic volume total. It had been increasing up to about 2016, and average daily traffic continues to increase, but the bars are actually our annual crash totals. So you'll see we're just under 4,000 crashes a year for a city of 160,000, and that was also increasing with ADT, which is pretty typical.
Speaker 2:Matt Duncan and I were hired in 2014. It took us a couple of years to figure out how things were being operated and what we wanted to change, and so we made some pretty drastic changes. As you see, starting in 2017, we're starting to get our crashes to trend downward. Obviously they dropped with COVID, but even as traffic has rebounded and continued to increase, we're still getting our crashes overall to decline, which is what we're hoping for.
Speaker 2:If you look at injury crashes, it's basically the same trend line as all crashes in total, which is what we expect. However, when you look at fatal crashes, it's not following that same trend line, and this really got us perplexed: why? How are we driving down crashes, how are we driving down injuries, but our fatal crashes are still increasing? So when you look, the orange part of the bars is impairment; whether it's alcohol or drugs, some type of impairment was involved in the fatal crash. We started to look, and our fatal crashes themselves, actually the blue part, where they're sober, aren't doing too bad and look like they're declining a little bit, but it's the impaired crashes that continue to climb, and so we had to really start digging into what's going on here. We looked at five-year blocks before and after 2017. Our non-injury pedestrian crashes are about stable, and pedestrian injuries have gone down, so that looked good. But then you start looking at pedestrian fatals, and we almost doubled, so something's definitely going on with the pedestrian fatals as part of our equation. So we continued to dig, and we started to look at day and night. Daytime pedestrian crashes are pretty steady. However, nighttime, once again, has almost doubled, and so you start looking at those crashes at night.
Speaker 2:Luckily, we've got a really good relationship with our police department, so we have access to all the police records, unredacted, and see all the information. We can go over and talk to them. We sometimes meet at their muster before shifts and can talk one-on-one and get the information flowing back and forth. If an officer has been to the same intersection several times for crashes, he's like, you know, I don't know what's going on, but I seem to always be going to this one intersection for a crash lately. So we're just trying to increase that feedback to us and get it faster, instead of waiting for crashes to happen and then, a year later, looking at our annual crash report to see where things happened in the last year. We're trying to get a little more proactive instead of reactive.
Speaker 2:So one thing you guys should know: you'll see a lot of data out there in the world, and when it comes to crash forms, most states require those to be submitted within a couple of days of the crash, so the officer is going to put down the suspicions they feel pretty confident about on the crash form. And what we found out is, at least in the Denver metro area and other metro areas we've been talking with, the impairment data is vastly underreported. They're not sure at the crash scene, especially if the person's unconscious, unless there's lots of paraphernalia or beer bottles, those types of things, visibly there, and so they'll go in and they might not mark any impairment suspected. But toxicology comes back weeks later from the hospital, or the coroner if it's a fatal, and you'll have that information as an addition to the case.
Speaker 2:But most systems right now have become so complex and leery of legal challenges that usually only the officer that was investigating can make edits to a case, and that officer has obviously moved on and has been working on other things for weeks. There's not really a good incentive for them to go back and edit their original reports, and we've made it really complex bureaucratically, so very few officers are going back and adding it. The information's in the case if you really want to dig for it, but that's not the information that gets sent to the state DOTs and disseminated to the federal level. So when you see, say, Mothers Against Drunk Driving looking at national stats, to me it's not good stats; it's garbage in, garbage out. They're not seeing the true stats that go with that. And so we've worked really hard with our PD, Matt's got a really good relationship with the coroner's office, and the Colorado Department of Public Health has become involved because of the hospital aspect of it, and we're all trying to prevent these serious injuries and deaths. So we started to dig more and make some connections.
Speaker 2:So when you start figuring out what's going on: if you notice the graphs of use of different drugs in the Jefferson County area, which we're part of, these are what we call unintentional drug overdoses, from the county coroner that tracks this stuff, and everything has kind of been going up a little bit for different drugs, but not too bad. Then you notice the green line in 2017 is fentanyl. Fentanyl hit hard, and the overdoses are happening. Those of you that know police officers, or are in areas where drug use is pretty rampant, you can always tell when a new drug has hit the street; it just changes behavior with some people, and it's been interesting. But you'll notice that in the pandemic, all the drug use just went exponential, and, as I talked about, we're starting to think that's a lot of the political discourse, financial problems, people not feeling good about themselves. People are looking to escape and feel good for a little bit, and this is happening in all parts of society, really. So we'll go to a real-life example to give you kind of a vision into what we're seeing.
Speaker 2:This was a crash on a state highway called Kipling, 40 to 45 mile-an-hour traffic, at 11:30 at night. This mother has basically left work and is going home to her family. She doesn't notice this pedestrian in the roadway. Right here at this time stamp, the shoe is just starting to get illuminated. About 0.6 seconds later, a lot of the lower torso is illuminated; still no brake lights. 1.2 seconds later, you're starting to see most of the body, and the pedestrian is actually now directly in front of the car. At 1.9 seconds, essentially, the crash happens. At two and a half seconds, you see the brake lights finally come on. And so this is classic perception-reaction time for a lot of adults: in an unexpected event, you're just driving down the road, you're not expecting anything to be in the road, you're just, you know, listening to your tunes and driving home. For your body to perceive that there's something wrong, then to decide what you're going to do, and then to get your foot down on the brake, it's that two to three seconds for most adults. So one of the things we started looking at is, is there a way we can make people aware? Because the difference between an unexpected event and an expected event is about a second, and so we might be able to change some of these fatals to be just injury crashes, or maybe they can be avoided altogether, if we can get that person to realize there's something ahead they need to be alert to. You're going to travel, at whatever speed you're going, probably at least a hundred feet for most speeds before you can actually even do anything, and so that's how we're trying to deal with it. You know, it's kind of like people that live in wildlife areas, where you don't expect deer to run out in the street, or elk or bears or different things that happen every once in a while. We're trying to get these unexpected events to be more perceptible and warn about them.
Speaker 2:about. On the right You'll see where what was in put into the crash report from the, the officer you know didn't expect anything, and then when you get the results back from the coroner, you know this person's at 380 nanograms per milliliter of methamphetamines. You know two to 600 nanograms If you're a traditional meth abuser, you're exhibiting violent and irrational behavior. You're exhibiting violent and irrational behavior Similar to this person. He was on the center line stripe when he probably noticed the car and then he actually turned and looked at the car, coming straight at it, which is totally irrational. But that's what some of these drugs do to you is. You don't understand why they make some of the moves they do. Um, you know, we've heard from different people like you know. Well, you know, you know this is persons, obviously a long time habitual drug user. This guy actually had a pretty long police record, so the police officers knew him by sight when, when they arrived at the
Speaker 2:scene. Um, which is what it is. Um, but the driver, she's just a middle-aged a woman going home to her family. She's traumatized, you know, imagine, you know, killing somebody, um, and it's just you and the other person on the road. So we're trying to stop it from both sides, from the person that's the victim, but also from the other person that's kind of an innocent party, didn't do anything wrong other than they just didn't perceive something ahead in time to react to it. She doesn't seem to be distracted, she was just listening to music and going home and unfortunately that's the results we're kind of dealing with and it seems to be getting worse. So we start looking at where all our auto ped crashes, since that seems to be the majority of our increase in fatals, and so we do heat maps, a lot of you probably familiar with, where you just map, you know your locations and look at where you're getting about the most occurrences of
Speaker 2:them. This section up here is a little over a one mile section of Colfax which is a state highway and for a while there, in some of our years, 90% of our ped fatals were happening in that same section of road. There's a lot of reasons for that, we'll get into it, but this is probably the lower socioeconomic area of our city. It's part of it. But there's some other issues as well. You know, when you look at this section of road it looks like a lot of other cities. You know grid, street system. You know you got the infamous Casa Bonita restaurant on this section, walmart's, home Depot's, you know, nothing really stands out too much as something different compared to a lot of our other highways in the city. You start digging into the records even
Speaker 2:more. One trend we started to notice is there's a lot of cheap motels on this section of highway and there's also a lot of liquor and wine stores and bars. And we started to notice that a lot of the crashes were happening going between these two, between liquor stores or bars and these motels. And so we started focusing on that with several ways to try to figure out. You know, are there ways these impaired pedestrians are just like wildlife, you know? And Matt kind of took that comment and really ran with it and we're like that's true. Can't we essentially identify somebody in the roadway that shouldn't be there and turn off some signs to warn drivers that hey, there's something in the road ahead? You know, slow down, watch
Speaker 2:out. And that's kind of how this project was born, that we're working on, at least on the intelligent transportation side, is to detect pedestrians in the roadway and warn drivers. Right now we're looking at signs and lights. We already push a lot of our signal timing down to some of the newer cars with connected vehicles, and so obviously that'll become a safety message there, but we just don't have enough percentage of cars in the fleet that can really take advantage of that. But that's something that will be probably a following step. So one of the things we're looking at doing is we actually have a physical project we're doing improvements on as well, understanding that we have all these pedestrians that are jaywalking into the roadway and getting hit, either going to or coming from their origin and destination, and so we're redesigning the section of highway to essentially separate the vehicles and the pedestrians as much as we
Speaker 2:can. Right now the sidewalks are right next to the curb. If there is a sidewalk, sometimes it's just asphalt into a parking lot, and so we're doing what we call a detached sidewalk, so there's a six to eight foot space. Usually, a lot of places call them tree lawns. We're not going to put trees in there, because we don't want anything to hide behind a tree and make them less visible, but it gives an area where if somebody's in there, you know they're not where they're supposed to be and so hopefully another visual clue to the driver that that person might enter the road, to the driver that that person might enter the
Speaker 2:road. In the median we're actually putting down some fencing so you can't cross or make the median about two and a half feet tall, so it's more of a barrier so that the pedestrians can't just cross the road wherever they want. We're also adding a bunch of what we call these Z crossings at about every quarter mile and so we're trying to funnel the pedestrians to these Z crossings. They go, you walk down the median. So as you're walking down the median, you're looking at the cars coming at you and the cars coming at you can also see if you're headed their way, even if they don't wait for the signal. But it gets people to be in the right orientation where they can see and it's predictable if they're going to enter the roadway or not, and also if you're on the side street, since there's a median in the middle here where you can't cross. These are all right turns so you're not in conflict with the crosswalk, so it's just safer for everybody all
Speaker 2:together. We're also adding more lighting. Obviously, most of our crashes are at night, so we're adding pedestrian scale lighting on the outside to really illuminate the pedestrians on the outside. We're doing the traditional roadway lighting on taller mast arms with brighter lights in the median, with the idea that if we can get light on the pedestrians in two different directions, they're going to be more visible, even if they are wearing darker clothes and things that aren't helping make them stand out. We've also started timing this corridor with our traffic signals to control speed and so and so yes, we actually do call the side street up, and so if you're doing more than two or three miles an hour over the posted speed limit, all you're going to do is arrive at the next signal early and get stopped, or at least have to slow down for it, where, if you basically drive the post and speed limit, you'll make it through all eight signals without having to stop at all. And so, testing that out, the drivers get tuned in pretty quick. We use countdown pedestrian heads and so, as you're driving down the road, you can see, you know there's still five seconds in the flashing, don't walk and so people automatically slow down and start behaving the way we'd like them to. So that's been a big improvement already. But this physical construction, you know, takes years to build and so that's the
Speaker 2:downside. So we start looking at our intelligent transportation detection system. It really comes down to reliability for us. You know, can we predict pedestrians that are entering the road? Originally we went in thinking we wanted to to predict, and so we wanted to look at their speed and their direction, so essentially their vectors, you know to to determine if, if they're going to enter the road or if they're just running alongside of it and we don't have to worry about them. The fallbacks is not a lot of. The technologies out there could do that predictive based on vectors and you know speed and direction Well, just like we do with most vehicles or pedestrians is. You know if they're in a zone, can we detect them. And so basically we started out you know if they step a foot into the roadway, can we detect them. You know as they hit that Um. And then once we have those, those tree lawns, um, without trees, but once we have that open space, it's just grass landscaping. That'll be part of the detection system as well. So once somebody enters that, it can set off the system and then, knowing from traffic signals and other detection systems, you know, is video reliable enough in all weather
Speaker 2:conditions? Lidar has been making a big push into the arena, both for detection and obviously on automated vehicles and figuring out what's going on around them, and so that was another thought. So then we started looking at what is our desire for accuracy. You know, if you have a system that's always going off saying, hey, there's something in a road and you keep driving through and you never see anything, pretty soon you stop believing the system and so we're looking for that higher. And we started out thinking we wanted at least 95%. Obviously, um, better, it'd be better. Um, we'd also didn't want to miss anybody. Obviously, missing people is probably worse than having it flash when nobody's there. So missed events, um are really detrimental. Um, and then false
Speaker 2:positives. We talked about um, late detection is, you know, can we detect somebody? But how long does it take to detect them? And with some of these systems, you know that was a concern, you know, because if we're only talking about two and a half seconds for somebody to react unexpectedly and crash, I mean you're really trying to get that down to a. You know, a second half, a second detection so we can get them out. And obviously we're looking at different weather light conditions, sunny, rain, snow, nighttime. So we put out a call for partners that were interested. Initially Alistair LIDAR, velodyne, lidar Dirk, the Verdea Analytic Company and Bosch responded and we kind of went through the design and what we were looking for and started figuring out how many detectors they wanted and how we're going to wire the system. And so we picked a one block section of Colfax from Ingalls to Harlan and instrumented it facts from Ingalls to Harlan and instrumented it. I will say Bosch never got past the design phase into the actual installation phase. They just kind of faded away and so I don't have good numbers on their system. But I think once they got into what we're looking for I think they might've realized their system wasn't at that point yet. I will add that Leopard cameras and Sony, after they were done with San Jose we were partway through this evaluation they approached us and so we added a different section of Colfax down by Teller that you'll see here in a minute for their system to test
Speaker 2:it. So traditionally you know your Li, your lidar unit, you're going to have an edge appliance to help process the lidar information and and react to it. And you know, and for our case, if we're just trying to turn on a contact closure, that turns on a sign to do that process. So you end up with some detectors, some boxes on poles for the edge appliance and then obviously we got to get power and communication to them. It's pretty standard when you get to the, the sony it was a kind of all built in to the, to the one unit, so installation was a little easier. You'll see here in a little bit. So this is kind of what our instrumented area looks like. On the. You'll see here in a little bit. So this is kind of what our instrumented area looks like. On the left you'll see a couple different of the LiDAR detectors mounted on the same signal pole. As you get down you have a lot of the edge devices in these NEMA boxes we call them. Put on the side A little radio to get the signals back to our main controller to do stuff and test camera systems, whether it be a dome camera or a fisheye camera or just a normal forward-looking view camera. We left all this up to the vendors. We told them what the problem was and the area we wanted to detect and they're the ones that came up with how many devices where the devices were based on the poles that we had access to in power, to figure out how they thought their system would work best. So we kind of led it all up to them and so you'll see here in a minute, well, a couple
Speaker 2:slides. But when we did a study, we did an uncontrolled study and a controlled study. So the controlled study, you can see, we picked an hour or so and we basically had staff go out and do certain maneuvers. We'd walk straight across the road, we'd walk down the road and then cross, we'd have two pedestrians and we'd split and go different directions, we'd jog, we'd do different things that we wanted to see how the the systems reacted to. And then the uncontrolled study is just letting the existing situation happen. Um, you know we picked these studies because these were the areas we're having the most crashes and so there's things every night that would happen and we'd come in in the morning and go into the dashboard and we'd see these events happening. And you know there could be 10 to 15 events every single
Speaker 2:night. It was really a shock to most of us here. We knew we had problems. We knew we were getting crashes out there, but we had no idea how many near misses were happening every single night. That's been probably the biggest thing. Where we were deficient was to realize, you know, crashes are a small fraction of what happens out there, and how many near misses there are before an actual crash happened was was was shocking to most of us that have doing this a long
Speaker 2:time. This is where I talked about we did different scenarios, you know, having a group cross the roadways, then taking different paths, doing loops, everything we thought we could do to kind of mimic some of the actions we saw in the real world, but also things we wanted to test the detection system on how they could follow that person or determine that they're still in the roadway or not. One of the things we had talked about is what if an object laid there, you know, went into the roadway and just laid down or fell down, passed out, whatever what it says, and all the systems would detect it for a while, but at some point, when there's no movement, it's going to, you know, become background and you're not going to be able to be detected, warn people around and then, if they went off the backside of the detection. Hopefully people already knew about it and had stopped and aren't just driving around somebody who's in the roadway. Sometimes we have too much faith in society. This is kind of what we talked about with controlled experiments and uncontrolled. So we obviously got a lot more data in the uncontrolled, because the systems were out there for months at a time and we definitely would look at when we had rain, when we had
Speaker 2:snow. Parts of the year there's a lot of glare as you're looking east when the sun's rising so we wanted to see how they did in the glare situations and so we just tried to cover most of these. So this is out of the DERC. But it started to give us heat maps where people were crossing, and so you know when it's a crosswalk. That's pretty understandable. You know, this was a little different, where they're kind of going up the street a little bit, but then there's some areas where we didn't expect them to be crossing at all, and so that's when we started looking at these origin destinations and starting to figure out why are people crossing there instead of, you know, walking a hundred feet down the street to a crosswalk that's signalized. And then when we got into the project development. You know how do we do things to prevent people from crossing there that we don't want them
Speaker 2:to. This is another thing where you can do near-miss evaluations, and so post-encroachment time is what they call PET there at the bottom. But basically you set a slider. We usually set it at about two seconds. If two objects occupy the same space within two seconds of each other, we basically call that a near miss and so you can see where all these near misses were happening. On the left side you can see on their dashboard where you can filter it down to certain movements if you wanted to. If you're only interested in, you know, east-west cars or east-west pedestrians, you can filter it down to all that you can see at the top. It kind of also gives you a tally of, you know, like there's one pedestrian versus a bus, but 11 bike versus cars in the timeframe we're looking at, but 11 bike versus cars in the timeframe we're looking at. And so a lot of data and that's one of the key things is, you know, is make the data actionable, make it easy for people that are using it to figure out what they need to react to and how to use that data. This is the Sony Leopard site down at Teller Put a camera up on the backside of a mast arm and basically we were able to detect about 290 feet with that
Speaker 2:camera. You'll see there's a little clip-off where, because of the camera angle, we didn't quite get all the way towards the stop bar in the one corner, but as an evaluation it was pretty good. You will notice that I'm sure all the vendors like you know is this the best situation or the worst situation? It was noted during part of this evaluation that the street light wasn't working so it was darker than normal. Unfortunately, those of you that have driven around on a lot of highways and arterials, that's not an unusual situation. We have about 7500 street lights in Lakewood and we do analysis every so often. There's always about 5% that are out at any one time. So having a system that can work even with a street light that's not working is pretty
Speaker 2:good. When you get into the detection you know they have the same thing where they can show where pedestrians are crossing and those usually line up with driveways or doors to businesses and people. If there's not a lot of risk in their mind, you know we'll just cross instead of going down to where we professionally like to have them cross. Same thing with heat maps, kind of showing where you got pads crossing the road but how they're crossing the road as well, and then obviously you can have you know where's the worst problem that type of stuff, and then obviously you can have you know where's the worst problem, that type of stuff. An interesting side note here is when we did the controlled study here I think there was six didn't just want to be walking around out in the roadway at 530 in the morning, but as long as you know, it was light traffic that we just had them stay off to the side, didn't have them block, block the road, but off in this right corner there was one of those, I think it was a day labor position and there must've been 20 guys all lined up there waiting and they're just watching all these crazy. You know six people walking all these weird ways across the street back and forth and there's police officers sitting in the corner of the parking lot watching them. I'm sure, um, I would have loved to be on their their messages to figure out you know what's going on and what they were
Speaker 2:thinking. Um, a couple of things we learned, um on the video side. Um, obviously your frames, how often your frame rates are being pulled, is going to have a big effect on how quick you can detect something. And so we've had phase one in all our analysis. And then we allowed people to go to phase two, which was make tweaks on their system, you know, retrain the AI if that's what they wanted to do. And then Alistair, for instance, put in their latest LIDAR detectors and got rid of the earlier ones. Uh, dirk went with a whole different camera, they went with a fisheye camera to to try to improve their detection systems. And so it's kind of learning on both sides what was working, what wasn't, and so you can see. You know, you know five frames per second, the person's pretty much going to be in the road before they're detected, possibly at 10 frames, it depends on how the movement was. But you start getting to that 15 frames, 30 frames, you reliably detect them when they hit the road, and this changes over time, obviously, as your system and your software gets better as
Speaker 2:well. So then, kind of a summary of our detection systems. You know our phase ones. You know Velodyne, ouster, dirk, leopard was split into controlled and uncontrolled. You know, you can see. You know there's a lot of missed calls with the LIDARs that we weren't very positive about. But then we changed things up and went to their phase twos and all those numbers got really good. Um, but I will point out, um in the in the first part. You know Sony's edge AI had late detection, so there are more than five feet into the road before they were detected. With some retraining of the AI and changing frame rates they basically were able to hit everybody. And so that's kind of the system we're looking to deploy first, most likely Just because you know it's hard to beat. You know no misses, no false positives, hitting everybody before they are five feet or more into the road Pretty good
Speaker 2:results. But I think the bigger thing here is, even though we knew what we wanted at the start and everybody knew that, they tried first and it's kind of an iterative process. You got to let it, test it, try it and be willing to make changes, whether that's new detectors, new software, retraining the AI, those types of things. An interesting thing with the Leopard camera was initially in phase one it was detecting things further away better than it was closer to the camera, and so we think it was just the size of the objects were so much bigger close that the AI wasn't picking up that that was still a person. It's just, you know, twice the size of somebody at 250 feet away. So things always get better, you know, and don't be afraid, it is kind of iterative. It's one of the big things we pulled out of
Speaker 2:this. You know, technology is changing so fast. You know, retraining the AI can lead to hugely different results. One of the problems we have is we got about a $30 million project to redo the medians, put in street lighting, put in sidewalks. You know that's a three year from now. You know we found out even in six months of technology is vastly different than what it was six months ago. And so when you look at these systems that are going to be deployed on top of something physical, you know we realize that technology is going to be vastly different. So we're probably going to have to relook at it. But so what we're looking at is we have some systems where we have similar problems or locations, and so we're looking to start deploying this technology right now, in the next six months, at some of those locations that we can have some benefit and we're ready to deploy. So, going back to requests, you know we'd love to get to the point where we can predict somebody entering a protected area, instead of just putting in a detection zone and saying, hey, once something's here, tell me. It's like we should be able to be smart enough to say hey. It's like we should be able to be smart enough to say, hey, this guy's running at 20 feet per second and he's running straight for the road. We should be able to detect that he's going to enter it before he actually gets to that, five to six feet from the edge of the road
Speaker 2:Data. Um, for me, as a as a as a human that's been doing this a long time, I find myself migrating to video. I can see a lot more context and and understand it a lot easier. The same events were in the LIDAR systems and you could follow the dots and and kind of see the same pathway, but you didn't get to see any interaction and what the person was doing. Were they bent over? Were they in dark clothes? You just had dots, and so to me, I found that I find myself migrating more to the video systems. When we're hearing about something that happened the previous night. We want to look it up. We just go to the video instead of the dots, and I think maybe that'll change over time. But I think for the people that are working on the LIDAR side you know that's something they have to work with as well is is how to make that data more understandable and usable for for those humans.
Speaker 2:Hey, y'all drive, walk, or bike; we talked about it. Take some defensive driving courses. Pay attention out there. You know, if you slow down, you're probably going to save a life, and everybody gets home: that's what we're trying to get to. So with that, I think we're ready for questions. Here's my contact information if you want it.
Speaker 3:Awesome, Witt, that was a deep dive. I mean, right out of the gate, it's a serious topic, and there's no other word than tragedy for what you guys are dealing with. But the way you walked through this interdisciplinary problem and came to some good common sense as one conclusion, I think there's a big rabbit hole to go down, especially with some of the questions people have asked about the AI. I mean, Pete, I don't know about you, but the big takeaway, the kind of aha moment for me, was when Witt showed the phase one to phase two comparison, and I wondered whether there were any hardware changes in there to get the higher FPS, or whether it was from software iteration alone. I think that's the name of the game: how can you improve that iteration time to get to that better result faster?
Speaker 2:Yeah, they actually didn't do any hardware changes. They just changed the frame rate they were capturing at, once we realized the issues from phase one, and they changed the way they wanted to save the data. Obviously they were saving it locally, so if it were real time, with communication set up and a permanent installation, it'd be a little bit faster. But yeah, they just changed the frame rate and then retested it.
Speaker 1:Gotcha, interesting. Yeah, we had a question from Chris that came in related to that: the timeline between phase one and phase two to train the AI to become more accurate. I assume when they increased the frame rate, they also went back and tweaked the AI models too?
Speaker 2:I assume when they increased the frame rate they also went back and tweaked the ai models too yeah, I would say, from what I remembered it was on the on this, the leopard camera system, we gave them usually about two weeks where they pulled data and looked at it and analyzed it. You know, it would usually be that two to three week range before they come back with a new try um on the original systems, the lidars and DERC, we tested it, you know non-controlled and controlled kind of gave them the results and then they all went back and looked at it and so by the time we did our analysis and gave them the results back, it wasn't near as active. So there was probably a three-month window there that we had tested things, processed it and gave information back. Then they made some changes, whether it was new detectors or different camera and changing their software on the back side.
Speaker 1:I think, to me, one of the takeaways here, by the way you were explaining it, was the importance of what we talk about with MLOps and DevOps: the manageability and programmability of these devices. Whenever you have an edge AI deployment, the ability to go out there and touch it in the field remotely, update it, and take the data back in is just super critical, and that idea that we're now talking about updating AI models as well as other parts of the system is a crucial part of the solution, for sure. I had a quick question on cost. Did you have a target on, like, deployment cost per camera solution? Was there a big difference between the LiDAR, like the two-piece solution, versus the integrated one? I mean, what was the difference there, roughly speaking?
Speaker 2:I'm gonna go back; I may have hidden that slide. But originally it was interesting: Ouster and Velodyne combined companies halfway through our analysis, but previously they had totally different views. So Ouster went out there with eight detectors for that one-block section, where Velodyne thought they could do it with four.
Speaker 2:And then, when they upgraded their detector in the second phase, they did it with two. So the detectors are getting so much better, more powerful, more accurate, that the costs are really coming down. In that first deployment, you know, we were thinking it was going to be about $5,000-ish. I think the other interesting thing is that the hardware is only one component of most of the systems out there.
Speaker 2:There's an ongoing AI-type processing charge for a lot of them, and so you end up with more of a subscription, software as a service, which, I'll be the first to be honest, in municipal government a lot of us are just getting over that hurdle. We're used to buying something and owning everything, and software as a service is something that in the last decade most of us have become more accustomed to. Like the DERQ system, I think it's about $10,000 a year to run their system all year long. With other systems we're looking at, sometimes it's more about the storage: if you want all the data stored, this is what it's going to cost you; if you want it condensed, they have a better cost. So it goes back to that iterative question: what do you really want, what's the cost associated with it, and can you negotiate to something?
Speaker 1:Now, I guess there are deployment costs too. I assume these are all, like, Power over Ethernet or some kind of wired solutions, right?
Speaker 2:All of these were. We either pulled power off the photocells of the streetlights, or we had a separate wire on the traffic signals that was dedicated to power. Where you have the edge devices, we're bringing power into that box, and then it could power them. The camera usually was PoE; the Leopard was Power over Ethernet, but we still had a box to capture the data locally and pass it on through the internet so they could access it remotely.
Speaker 1:There you go. I see we had a couple more questions, Davis. Maybe we can get to those.
Speaker 3:Yeah, let's bring those up. If you want to highlight the next one that came in, it's along the lines of the AI models: are you using or testing AI-driven gait detection to detect gait markers of harm or irregularity, and if so, how? That one's from John Rizzo.
Speaker 2:You'll have to help me. What's gait detection?
Speaker 3:I believe it's the unique way in which someone walks or holds their weight, so it's like a marker of their hips, the way they move, and it's supposed to have a unique signature of different behaviors or personality.
Speaker 2:Yeah.
Speaker 3:So it sounds like no.
Speaker 2:We didn't get that far. We were mostly looking at their paths, and in our systems, really, it was just a zone. Once they stepped a foot into it, it was detected.
Speaker 3:It was just, is there matter here, matter there.
Speaker 2:Yeah, that's what we're trying to get to with the vectors. If you can tell, like you say, whether it's jogging, running, whatever, that's where we'd like to get to, but the systems we were testing just weren't there yet.
Speaker 1:Yeah, and building on the model evolution, maybe some of those gateways too at some point. So, yes, there's no cloud involved. We had another question from Chris here about connectivity: how is the data being transmitted? Is it through the city network, or are you doing, like, 5G? What's your backhaul communications?
Speaker 2:So these were a test. We just had them, the companies, set up their own LTE because they wanted it. We do have license plate readers and other devices out on our network that we go to the trouble of VLANing through our network to get up to the internet and back to the cloud. So long-term, when we actually do a solution, that's what we'll do.
Speaker 2:But for these evaluations, it was just so much easier for the companies to put a cell modem out there to get their data.
Speaker 1:Yeah, well, the nice thing about getting all the processing done on the edge like that is you're not really using that much data, right? I mean, I guess you have to send some data back here and there, certainly the metadata. I don't know, maybe there would be a legal issue where you have to capture video and send it back to the cloud in case there's an incident or something.
Speaker 2:The PII, yeah. So I think we could escape most of that, because we have over 400 cameras around the city, so we already had these areas covered with our normal cameras, which go into a Milestone system that's recorded. That's kind of how we did some ground-truth verification: we had what the systems would give us, and then we had the recorded video, and then Vijay got to sit down in a room and watch a lot of video.
Speaker 1:Cool. Hey, we're coming up on our hour, and Witt, I really appreciate the time. I think everyone got a good education today on the challenges you guys are facing, and it's really impressive, the kind of data-driven approach you're using to solve some of these issues. So again, just really appreciate the time. Folks can follow up with us if you have more questions, but it's been great.
Speaker 3:Yeah, thank you, Witt. Great story. Thanks for being here today.
Speaker 1:Thank you.