The Bossy Bees

Champion inclusivity: A look inside the insidious world of AI with Albert Myles

July 21, 2021 Albert Myles Season 2 Episode 5

Albert Myles is a Knowledge Program Manager in customer content services. His passion for and curiosity about technology have led him to explore artificial intelligence (AI). Albert is eager to expand the horizons of his own understanding, and his work depends on sharing knowledge driven by data!

For Albert, as a Knowledge Program Manager, the current path of AI has raised concerns. There are large gaps in the data sources this AI depends on, and with so much incomplete data, applied AI can have deadly consequences. Encouraged by the potential but wary of the risks, Albert explains the danger lurking if we continue on the path of unchecked AI development:

  • The inevitability of technological unemployment
  • The harm of homogenous creators: Who controls the creation of AI?  
  • Are the creators of AI leading us toward a path of mutually assured destruction?
  • Let's stop unthinking! How do we turn these growing pains into wins?

Albert brings compelling evidence and experience to understanding artificial intelligence. He advocates for AI when developers are creating a good tool. Myles says, "AI can do amazing things; we need to find the balance before we let it go."

Support the show (https://www.patreon.com/thebossybees)


Stacy Whitenight:

Thank you for joining us at The Bossy Bees. We're grateful to have you here with us today for our fifth episode of The Bossy Bees podcast in our second season. I'm your host, Stacy Whitenight. I'm sitting down with Albert Myles today to talk about artificial intelligence, or AI. We're excited for all the amazing capabilities this technology will bring, but we're talking about some of the insidious ways in which it can be applied, given the lack of diversity and inclusion in data. Don't forget to check out The Bossy Bees on Patreon for exclusive content on this podcast. Alright, let's get started.

Albert Myles:

Want me to go ahead?

Stacy Whitenight:

Yeah!

Albert Myles:

Let's go for it.

Stacy Whitenight:

Yeah, let's do this.

Albert Myles:

Hi, Stacy. My name is Albert Myles, and I am what they call a Knowledge Program Manager in customer content services for a large tech company located in RTP. That's a fancy way of saying that I am responsible for ensuring that the knowledge that's captured in support, in development, and inside of customer content is transferred to other areas effectively and efficiently. At the end of the day, I tell people, I try to help our company learn what it already knows. I try to help us organize what we already know, and then I help us distribute everything that we know. And I'm doing that through a whole bunch of different projects and programs and stuff like that. It's a very, very new program, but I'm having fun getting it launched.

Stacy Whitenight:

And that's where we started together, and you've taken it miles and miles away from where it started. You know, I really dislike you putting that title on yourself, because you do so much more than that; your knowledge is far beyond it, and it really does come together nicely with your experience in your job. I think the reason your program has gone so far is because you bring so much experience, like what we're talking about today. You have such an affinity and inclination for technology that it brings a lot to the table. We're talking about artificial intelligence, or AI, today, and it's married to something that you and I are both pretty passionate about, which is diversity, inclusion, justice type of stuff. That's where some of our listeners know where we're going today with this. But you are the expert here.

Albert Myles:

I wouldn't say expert as much as enthusiast. How's that? I actually started taking an AI certification; IBM offers this thing through Coursera, and it's called Applied AI. I got interested in it because it's a big deal. If you look at any list of jobs that are going to be important in 10 years, AI is always there. And AI is embedded in everything we do, anywhere from the voice assistant on your phone to your car. Every car uses AI nowadays. I mean, my car, if it sees me not stopping fast enough, it'll blow up on the screen saying stop, stop, stop, and if I still don't stop, it'll brake for me. That's all artificial intelligence. So my concern has become that artificial intelligence is made by a certain group of folks, because the tech industry is basically mostly white men. And that's just because of historical reasons, right? You know, 30 years ago, you never saw women in the tech field. You never saw people of color in the tech field. It just never happened, because it was seen as, "this is important work, and this has to be done by people who have these long lines of experience and education." And that's carried on. So we're just now starting to see where people of color and women and other intersectionalities are starting to get into the pipeline. But the problem is that the developments are happening now. When you're creating AI systems, it's not just the algorithms that you have to worry about; it's the training. So when it's being trained by, basically, all white men, it's going to show their biases, it's going to show what they know, and it's going to show their worldview. And then you end up with stuff like what happens when you search.
When you do a reverse photo search in Google of a black person, it'll tag it as "gorilla." And if you go and read the news of the past three or four years, HP had a problem with their desktop computers where you couldn't log in if your skin was too dark; it wouldn't recognize you. Because,

Stacy Whitenight:

You know, it's interesting, we were talking earlier: I started reading Invisible Women, and that is about the data gaps with gender. I'm not done, so I don't know that it really goes into any further intersectionality, but just out of the gate, we're looking at hundreds of years of a data gap just between men and women, or different genders. And the world is far more open now to acknowledging that we have more than just two genders, so imagine how much further that goes. To be flat-out honest, it really is just men, and then everyone else is the other. So right out of the gate we're looking at a huge, huge gap in data just there. And you're talking about Googling, where people of color, darker-skinned people, are showing up tagged as gorilla. It made me think of how Invisible Women opens up talking about how the data gap is a life-and-death matter. You kind of awkwardly laugh, like, oh yeah, that's not great, but this is getting into life and death. Because of these data gaps, if we're looking at only white men and thinking about heart disease: my signs of getting heart disease are different than my spouse's, who is a white man; the path to get there is different. And then you dig a little bit further: well, Albert, yours is very different than what John's might be. So while we're talking about AI, if that's just the data our doctors are looking at, right, they look at a sheet of paper and they're like, okay, you fit into this category. But what happens when that's the only data we're putting into our AI? We're gonna miss a lot of people.

Albert Myles:

And that's what's happening, right? So I want to be clear, before your listeners jump on that: it's the reverse image lookup. It's not that if you type "gorilla" into Google, black people show up; it's that if you show Google a picture of a black person, it would tag it as gorilla. I think they've fixed it by now, but still, it was there. So the problem with having bad training data is the same as the data gaps you were just referring to, like who's considered likely to have a heart attack, those types of things. Let's take the Apple Watch, for example. Apple is doing a really good job of trying to make sure that they cover everyone. It's to the point now where it's reading the gait of your walk, it's gathering your heart rate, your oxygen levels, all of these types of things. And people are kind of relying on the Apple Watch to warn them if something's going wrong, for better or for worse. I'm not saying that's a good thing or a bad thing; I'm not gonna pass judgment there. It's just another tool, right? But even the famous Apple Watch, at the beginning, was having a hard time reading people that had darker skin. And that's because, when they were creating it, how many people with darker skin were they testing it on?

Stacy Whitenight:

Yeah,

Albert Myles:

Right. If you Google... I remember there was a point where if you Googled "wedding dress,"

Stacy Whitenight:

Yeah... you had me do that!

Albert Myles:

You don't...

Stacy Whitenight:

None of these people look like me!

Albert Myles:

They definitely look like my wife!

Stacy Whitenight:

I mean, that's what I said. It took me about five or six page scrolls to get to someone that had darker skin, and that was a light-skinned black woman, so she looked a little bit more like me. But if you're Asian... I didn't see many Asian women in that search. And certainly, if you're looking for something more traditional, like if I were looking for a traditional Dominican wedding dress, that's not even coming up for me. There are all types of cultures with different traditions in their wedding attire. And let's not even touch on the fact that there wasn't anything specific for LGBTQ people getting married; there were no alternatives. It was very homogenous.

Albert Myles:

And if you talk to Google, to the folks that run the algorithms, they're saying, hey, we base our rankings on what people find popular, or what people find useful. That was the whole idea of PageRank: they look at what you're clicking on, and then how often you click back from that, and that helps rank the content. But again, you're feeding people what they should be eating, so of course, like you said earlier, you had to dig several pages down; most people are going to stick with the first page of results. So if Google's feeding you that, it's a self-reinforcing loop: the first page is going to be the most popular because it's the first page.
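The self-reinforcing loop Albert describes can be sketched in a few lines. This is a toy simulation, not how Google's actual ranker works: ten invented results start with equal click counts, users click near the top of the list far more often than further down, and the list is re-sorted by accumulated clicks, so whatever reaches the first position early tends to stay there.

```python
import random

random.seed(42)  # deterministic toy run

# Ten hypothetical search results, each starting with one click.
clicks = {f"result_{i}": 1 for i in range(10)}

for _ in range(10_000):
    # Re-rank by accumulated clicks, most-clicked first.
    ranking = sorted(clicks, key=clicks.get, reverse=True)
    # Position bias: the chance of a click halves with each rank down.
    for position, result in enumerate(ranking):
        if random.random() < 0.5 ** (position + 1):
            clicks[result] += 1
            break  # the user clicks one result and leaves

top = max(clicks, key=clicks.get)
share = clicks[top] / sum(clicks.values())
print(f"{top} captured {share:.0%} of all clicks")
```

Whichever result happens to lead after the first few iterations ends up with more than half of all clicks, purely because being first earns more clicks, which keeps it first.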

Stacy Whitenight:

So I guess we run into that problem looking at other things, too. Some people may be like, okay, who cares, it's just a wedding dress. But what about when someone's applying for a job?

Albert Myles:

You know, there are companies right now that will have candidates record video interviews. They will ask you to answer a series of questions, and they will look at your body language, look at how you answer the question, listen to what you're saying, and then they will score you. I'd fail, right? See, that's me: I have a hard time hiding my emotions; I wear my emotions on my sleeve. So I would be done. But that's AI, and they're training it with a mostly homogenous group of people. Remember how, in television, they used to say that the perfect accent came from the Midwest? They used to say that. So it makes me wonder: if the "perfect person" comes from Silicon Valley, then you're going to have their sensibilities, their values, and stuff like that built in, or even the East Coast's, in the case of some of the East Coast tech.

Stacy Whitenight:

You said something pretty interesting right there, which is important to really take a look at. You and I are based out of RTP, and certainly, working for a software company, we have a lot of companies out here: Microsoft, Lenovo, SAS, Red Hat, IBM, a lot out here. But we're not really the hub. So where is this technology mostly being created?

Albert Myles:

It would have to be the Bay Area first, and I would say maybe the Boston area would be second place. What about other countries? My understanding is that overall, Europe's not doing much in the field. They're doing stuff, but they're not leading. Now, I could be completely wrong, but in the book I put in our notes, The Big Nine, the author basically relegated Europe to the side, because none of their companies are these huge companies doing a lot of work in it; they've got some government research going on. The other big place is China. Okay, so now wrap your head around that, because China is going to be extraordinarily homogenous, right? They're building their AIs around, literally, a very narrow group. I mean, yeah, there's a lot of variance there, but it's nowhere close to as diverse as the United States is,

Stacy Whitenight:

and they're pretty open in their discrimination practices. I mean, you look at what they've been doing to the Uyghurs; they're not going to be included at all. The US has its own issues as well, generally with non-white people, and almost the equivalent is our own indigenous people that live here, who the United States is very open about discriminating against. So that's not good. We're excluding a lot of people.

Albert Myles:

That's the scary thing. You've got these two, and we're basically evenly yoked. They've got just as much money going into their AI research, plus they've got their government pushing it hard. Our government is basically staying out of most of it; you've got DARPA doing some work and some of the other government stuff, but it's really Google, IBM, Microsoft, Apple; those companies are doing most of the work. So what is my problem with it? It's like you said earlier: so what? Okay, it's fine when you're talking about wedding dresses, and it's "fine" when your computer won't log you in because of your camera and stuff like that. I made a comment about going into the bathroom, and the sensors: you wave, wave, wave, and it doesn't work, but then a white person walks in front of it and it's fine, and then you walk up behind him, and you wave, wave, wave, and it doesn't work. So those things are trivial. Okay, but the next stage up is facial recognition on police surveillance cameras. What's the error rate when they're misidentifying folks? And we have studies that will tell you that, yes, people of color are misidentified a lot more. Again, it all goes back to the training set; it's not the algorithms. The people who write the actual code are just doing what they're told, and so it's how you tell it what's right and what's wrong. So it all wraps back around: you can't really blame John Doe sitting in a lab feeding the stuff, because he's only feeding what he knows, right? It all comes down to the higher-ups at IBM, the higher-ups at Apple, and the higher-ups at Amazon; they have to make conscious decisions that we want to do the right thing.
And I mean, we've seen in the case of the big search company, Google, they're having problems with diversity, big problems. Even when they hire people to try to fix their problems, they mess it up. I don't want to see that; I'm hoping that's not happening at other tech companies, but I'm sure it is. We have to take it seriously, and not only for safety reasons. You know, cars, we were just talking about cars: if you have cars that automatically stop if someone walks in front of them, what if someone is invisible to the sensors? Does it stop? We're doing a lot more of outsourcing our responsibilities to these computers, so we want to make sure that they're doing the right thing. And then there's the next piece, which is the unemployment, which is a really weird reinforcing loop, right? Because one feed into this loop is: how do we get marginalized people into the loop? They're just now getting into the pipeline, right, and AI is pretty advanced, so we're always going to be chasing that. And those are the jobs that are going to be around, because technological unemployment is basically what happens when, depending on the book, 20 to 30 to 40% of the population is unemployed, because there's no work for them to do; it's all been automated. And so
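The error-rate question is concrete enough to sketch. The snippet below disaggregates a misidentification rate by demographic group; every name, group, and number here is invented for illustration, but it shows why a single aggregate accuracy figure can hide exactly the gap the studies Albert mentions keep finding.

```python
from collections import defaultdict

# Hypothetical evaluation records: (group, true identity, predicted identity).
records = [
    ("group_a", "alice", "alice"), ("group_a", "bob", "bob"),
    ("group_a", "carol", "carol"), ("group_a", "dan", "erin"),
    ("group_b", "fay", "gail"),   ("group_b", "hal", "hal"),
    ("group_b", "ivy", "jo"),     ("group_b", "kim", "lee"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for group, truth, predicted in records:
    totals[group] += 1
    if predicted != truth:
        errors[group] += 1

# The aggregate rate (50% here) masks a 25% vs. 75% split between groups.
for group in sorted(totals):
    print(f"{group}: {errors[group] / totals[group]:.0%} misidentified")
```

Reporting only the 50% aggregate would make the system look equally unreliable for everyone; splitting by group shows one population bears three times the error of the other.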

Stacy Whitenight:

We see this at the grocery store already. You know...

Albert Myles:

Oh yeah! McDonald's.

Stacy Whitenight:

Yeah! And I went to a baseball game last night, and there were three registers and one person standing there. I was extremely irritated with that, because, one, the ordering system was not efficient, so people, especially older people, were like, I don't know how this works, or it's not working for me. And it just put two people out of a job.

Albert Myles:

Well, and I mean, things got really crazy with COVID, right, because that tossed everything up into the air. A lot of businesses were like, well, I can't find anyone to work, so let me automate this and just see how it works out. And hey, look, it worked. It kept them afloat, maybe, right?

Stacy Whitenight:

I think people were looking at quantity over quality. The quality, in my opinion, is not there. And no, my Tesla does not recognize some people when they're walking across the street,

Albert Myles:

Don't tell me that. That's terrifying.

Stacy Whitenight:

And you know what? It wasn't a person of color; it was a child. So, people who are smaller in stature. I was like, uhhhh... that's exactly what I needed to see.

Albert Myles:

So that's really terrifying, if you think about it. That's exactly what you needed to see,

Stacy Whitenight:

right? Or my dog behind my car when I'm backing up; it doesn't really seem to pick up the distance.

Albert Myles:

Oh, man.

Stacy Whitenight:

Yeah.

Albert Myles:

So all of the jobs are kind of starting to be funneled into these high-touch types of jobs. That's not just computer programming, but designing the algorithms for these AI systems. Anything that has to do with working with people, so psychologists, that's not going away. A lot of the medical specialties, those aren't going away anytime soon. Lawyers are nervous, because things like legalzoom.com are starting to take their jobs, but trial lawyers are going to be around for a while, because there's the human experience. So anything that has the human side, those are going to still be around, but they're going to be heavily, heavily assisted by AI, and the people that aren't comfortable with that are going to be the ones to have the problem.

Stacy Whitenight:

Well, I want to back up here a second, because some of the intake that I've experienced over COVID, like if my kids are sick, or I'm sick, some of that is being automated. And if we're missing data on intake, if we don't have all the data we need, we're not gonna be able to properly assess somebody or get them admitted. If we're missing data on the front end, we're not doing a good job once they get into the office and the doctor is being provided this information. So, yeah...

Albert Myles:

So that brings up a good point, and it really wraps back around to what you were talking about at the beginning: in general, we're missing data. Yeah. So, you know, you might be presenting all kinds of factors for a heart attack or stroke or something major like that, but we haven't collected enough data, for all myriads of reasons. We all know that women tend not to report things in the same way. At least in my family, the women are less likely to say something; basically, us men are wimps, but women are much less likely to say something when they've been experiencing an issue. Plus, these things present differently,

Stacy Whitenight:

right? Or it's expressed in a completely different way. Like, you know, with the COVID vaccination, they didn't even think to look at women's menstrual cycles.

Albert Myles:

When I read that story, I almost fell over, right?

Stacy Whitenight:

It happened to me, and I was like, what is going on? It would have been nice to have known that.

Albert Myles:

So we need people in there that think about that stuff, right, in order to process that stuff. But if we don't have anyone in there that can process that stuff... You know, so

Stacy Whitenight:

the data is missing. When the data is missing, it's...

Albert Myles:

And let's be clear, everyone listening: AI is nothing without data, right? Without big data. You have to know your numbers, you have to process those numbers, and you have to do it correctly. Without that, you have no artificial intelligence.

Stacy Whitenight:

It's a half story. It's a half story. Leaving people as an "other"... it can't be white men plus the other. Everyone else can't be the other. Those intersectionalities matter; it's drilling way further down than just male, female. It's got to be way more than that.

Albert Myles:

And I want to be clear, I'm not sitting here casting blame or saying, you know, "bad white men shouldn't do that" type of thing. It's a fact of life. We just have to correct it.

Stacy Whitenight:

Yeah, I think it's like unthinking, right? It's not intentional.

Albert Myles:

right?

Stacy Whitenight:

It's not intentional. I mean, you could argue that; I think some people would probably argue that it is intentional. But that doesn't get you anywhere, sitting and looking back at history. It's about forward progress. And I think you and I are pretty similar in the fact that, hey, we know that this is happening, great. As a program manager, I'm not one to just sit there. It's good to understand the challenges, that's one thing, but really, what it's about is: how do you make it work for you? How do you turn it into something better, something positive, like a win? How do you make the wrong right, without dwelling on the wrong? Because sitting there and blaming a group of people who were just not thinking is not gonna get you there.

Albert Myles:

And I think that's going to be the case for any group. It's the whole idea of unconscious bias anyway, right? You don't know that you're biased when you're in the middle of it, until someone points it out. And to be completely fair, it's a very tough thing to come to grips with. I mean, recognizing my male privilege and male bias was brutal, because I'm sitting here thinking, I don't have any privilege, what are you talking about? And it literally took an hour. So I get it, I completely understand, and I'm empathetic with it, but maybe it's time to take off the gloves, you know? I hate to say it like that, but being nice about it is causing lots of problems. And the deal with technology anyway is that it's exponential. You don't notice anything changing, you don't notice anything changing, you don't notice anything changing, and then literally the next day, the whole world is different. That's how exponential technology works. That's exactly how the internet went: no one touched the internet, barely anyone was on the internet, and the next thing you know, everyone was on it. That's just how the hockey stick works. So the worry is, that's how AI is gonna work. We see it everywhere, but it's not really in front of everyone's mind. Next month, we could turn around and, wow, everything just changed. And then who gets left behind? Right,
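The hockey-stick point is really just arithmetic. In any steady doubling process, the latest doubling is roughly equal to everything that came before it combined, which is why exponential change feels like nothing happening for ages and then the whole world shifting overnight:

```python
# Twenty periods of steady 2x growth, starting from one arbitrary unit.
adoption = 1
history = []
for _ in range(20):
    history.append(adoption)
    adoption *= 2

total = sum(history)              # 2**20 - 1: everything that ever happened
last_share = history[-1] / total  # the final period alone
print(f"the final doubling is {last_share:.0%} of all growth to date")
```

Nothing looks dramatic from one period to the next, yet the final period alone accounts for about half of all the growth in the entire history.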

Stacy Whitenight:

and we're missing a lot of data.

Albert Myles:

Right. So, I mean, that was the whole premise of that part of the talk. And then, you know, who's doing the work, and what happens to people who don't have work to do? You talk about things like universal basic income. And okay, there are experiments; I put links in the document. There are experiments going on in different places. There's one right here in North Carolina; Durham might be looking at one. Yeah, that's fine, but that's not going to help us if we had another COVID event and, next thing you know, another 20% of jobs are gone. What happens when people are ready to go back to work and they can't, because everything was automated? What are we going to do? Do we keep doing these subsidies that we've been doing, the stimulus packages? Do we keep doing that over and over again? Or why don't we formalize it into universal basic income?

Stacy Whitenight:

GET VACCINATED, PEOPLE.

Albert Myles:

Oh, don't get me started, Stacy. I was sympathetic in January, and in February, and in March, and in April. I was sympathetic, you know,

Stacy Whitenight:

Some people really cannot get vaccinated. But if it's an active choice, not getting vaccinated because you do not believe in it... look at the Delta variant statistics, like 100% of the Delta COVID deaths in Maryland were unvaccinated people. When you're one of those, it's hard to feel bad for you.

Albert Myles:

I was saying that to my mom last night, and I feel a pang of guilt saying it, but it's more for the people that are ignoring the science and just willfully don't get vaccinated.

Stacy Whitenight:

Spreading misinformation.

Albert Myles:

Isn't that natural selection? I mean, I

Stacy Whitenight:

hate... I really feel bad for their families, but, like, me too. If you love someone that's not getting vaccinated, get your people in line and take them to get vaccinated. That's all I can say. The science is there. Yeah, there are things that we don't know 100%, but it looks like 100% of the people who are dying right now are not vaccinated, at least in Maryland, and I think it's almost that much in North Carolina. And it's showing that way in other states, too. So... I mean, I don't know what to tell ya...

Albert Myles:

I went to the market a couple of weeks ago, the last time it ran; that's downtown Raleigh, Moore Square. And at the time, they were saying something like 30% of North Carolinians had been vaccinated. We went there, and including my wife and I, there were maybe five people wearing masks out of the hundreds of people that were there. And I'm sitting there thinking, hmm, statistically speaking,

Stacy Whitenight:

You know what, I feel so bad going into places. I feel awful going into places like the grocery store, with more and more people not wearing masks. I have a bracelet that says "I got vaccinated," so I don't feel so bad. I just don't want people to feel unsafe around me, like, did she get vaccinated? Does she look like a person that wouldn't get vaccinated on her own? I don't want people thinking that about me. So I don't

Albert Myles:

See, I've gotten to the point, and I said this a year ago, where I'm just gonna wear masks indefinitely. Because

Stacy Whitenight:

my kids aren't vaccinated, and kids are dying, getting sick. Anyways... anyways, back to AI.

Albert Myles:

Well, so they used AI to help develop the vaccines, right? Yeah. So hey, that's a way to wrap it back around, Stacy. So, you know, again, if it wasn't fed the right data... again, you start missing things like menstrual cycles,

Stacy Whitenight:

right. That seems pretty basic? But, I guess, when we're looking at data, not only do you have to look at who's putting it in, who needs to be trained, right? Like, we know we need diversity and inclusion training, and this industry needs it more than most other industries. Some companies are better than others in terms of that, and in understanding where they are in their D&I efforts. But it's also diversity in the data: making sure that we're including everybody in the data, and parsing out that data. I'm not a scientist by any means, but that seems pretty basic, like it should be one of the first things we would look at, right?

Albert Myles:

Well, you would think, and, I mean, it's not a simple thing. I know I'm sitting here talking about how they should just do this and do that. In an AI class that I was taking, they kind of take you through the process of doing the training and stuff like that. It's not like it's rocket science or anything, but it takes a little bit of effort, and there is a manual selection process. You have to feed the system the right types of things. For example, we were doing photo recognition. We'd feed in a bunch of photos, Watson would look at them and take a guess at what each one was, and we'd say, yes, this is right, this is not right, this is striped, isn't that right? Well, if all of my photos have narrow variance, then of course it's gonna get really good at recognizing those. But I'm not gonna have any pictures of people in Thailand diving off of a cliff or anything, so it's not gonna be able to test against that. So I guess my point of saying that is that we're hiding behind this excuse of, you have to have highly trained people to do this stuff, and you don't. I think that's just an excuse that they've used to keep

Stacy Whitenight:

It is. It's keeping a block on people getting into the industry by setting standards of education that are really not necessary to do the job.
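The labeling loop Albert describes, feeding in examples and confirming or correcting the model's guesses, can be sketched in a few lines. This is a hypothetical toy, not Watson's actual API: a one-feature nearest-neighbor "classifier" with made-up numbers, just to show why a narrow training set can never recognize what it was never shown.

```python
# Toy sketch of the "narrow training data" problem Albert describes.
# A 1-nearest-neighbor rule over a single made-up feature stands in
# for a real image model; all values and labels are hypothetical.

def nearest_label(feature, training_set):
    """Return the label of the training example closest to `feature`."""
    return min(training_set, key=lambda ex: abs(ex[0] - feature))[1]

# Manually labeled training set with narrow variance: mostly portraits.
training = [
    (0.20, "portrait"),
    (0.25, "portrait"),
    (0.30, "portrait"),
    (0.80, "landscape"),
]

# A photo much like the training data is classified fine...
print(nearest_label(0.22, training))   # portrait

# ...but a cliff-diving shot can only be mislabeled, because that
# class simply does not exist anywhere in the training set.
print(nearest_label(0.95, training))   # landscape
```

The second call can never answer "cliff diving" no matter how good the model is; the gap was baked in at data-selection time, which is exactly the manual step Albert says doesn't require a PhD to get right.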

Albert Myles:

And we're fighting that fight at the company we work for. Yeah. I mean, it's like, why does someone need 10 years of experience and all of this stuff? It's like,

Stacy Whitenight:

it's ridiculous.

Albert Myles:

But you can train them. I mean, you can train someone to program; Python's not that difficult to learn. You see what I mean? You can train someone to do all of this stuff. But that's the way the industry has been built up, right? We want someone with a Harvard education in algorithms. Which keeps us out.

Stacy Whitenight:

It leads to diversity and inclusion problems, and then it affects everything going out from there. You know, we're talking about sort of the growing pains of AI. What are other things that have come out of this, that you understand or have learned about, where we can turn these types of things into a win?

Albert Myles:

I think the big deal is going to be, well, awareness is going to be the big thing. Right now, most people are so painfully unaware of how this stuff affects them. So I think that if we just started to make people aware that AI is a thing, and that we need people to participate, that's going to be a huge step. And if we can just start doing that, then I think a lot of the other things will start taking care of themselves. Like, for example, we talk about diversity and inclusion a lot. But not a lot of people are banging down the door, either. You see what I mean? It's only been recently that most people of color have been interested in tech. When I was in school, we were very few and far between. So it's not like anyone's banging down the doors to get in. But I think that raising awareness will be a huge step. And once that happens, we'll start seeing people get interested: hey, how do I get into training? How do I get into open source? How do I get into programming? How do I get into all of this stuff? And then I think that's when we're really going to start seeing very effective things, like with police cameras, you know, the facial recognition, I think we'll start seeing a lot better results there. But right now, I think the industry is trying to force it. And it has to come from both sides. They have to engender interest, instead of just saying, we're going to hire a certain number of this. Okay, now you've got to make people interested in it. Like, when I was in school, I was called a nerd, just because I wanted to play with computers and stuff like that, because my dad worked for IBM. So how do we figure out how to make that stuff not be on the outside of fun, right? How do we make sure that the jocks like to do it, too?

Stacy Whitenight:

Right. Right, and making it accessible, too. And you said something like, we have to look, just to understand where to look, right? But so much of this, you know, we always hear on the news, like the SEC getting involved with stuff. Where does the government play a role in this? And how can they help turn AI into a win instead of a pain point?

Albert Myles:

So very interestingly, that Amy Webb book, The Big Nine. Really long story very short, she has these scenarios that she comes up with. One was completely disastrous, but the scenario she has that's the most successful starts, like, now, with the government stepping up their funding. Instead of paying for all kinds of other things that aren't as necessary, they really start to take AI seriously, and we do public-private partnerships. We try to say, okay, all of the stuff that Google and Apple and IBM are learning about AI, meshed with the stuff that the universities are figuring out about AI, plus the stuff that DARPA is figuring out; let's figure out how to implement that stuff and spread it out. And, you know, we have to do partnerships with China. I mean, yeah, they have to do the same thing. Because, again, if their AIs are sitting there saying, we have to do everything we can to preserve China?

Stacy Whitenight:

Yeah. Well, I think there are a lot of countries that do that, but China definitely

Albert Myles:

Well, but if it's an AI directive, then you know, that's, you know, it's just gonna keep going. And so then you don't, if you don't have the policies in place to put it in check, or, you know, or even the tools to put it in check. So I mean, not to go down that route. But I think that, yeah, it's going to take someone like Joe Biden to say, yes, we've been talking about all of this other infrastructure stuff. AI is important. Let's do it. And I think people, Republicans and Democrats, are gonna have to follow up on it. And they're going to have to team up with these AI leaders, and sit down and decide what's important. Gives me so little hope. You're not scared yet?

Stacy Whitenight:

Right? I don't think so. Or I think that they're in denial.

Albert Myles:

I think it's abstract, right? Global warming was kind of abstract 30 years ago; people still don't believe it. Okay, some people still don't. But you know what I mean, it was abstract 30 years ago. Everybody was like, you say the science is there, but I don't know. And now AI is sort of like that: yeah, you've been watching Terminator too many times, or you've been watching too many movies. And so, you know, it's not that big a deal.

Stacy Whitenight:

That's where the awareness comes in. When there's a general awareness, you know, someone who may not feel like they're encountering AI every day, like a farmer in the Midwest, they need to understand. It's just showing them and making them aware of how much of their life is tied into AI. You know, look at how seeds are made, the genetically modified seeds, and how that impacts their livelihood, and their children's livelihoods, and their community. So yeah, I think awareness is a really, really big part of it. And I don't think that that's being included in the data either.

Albert Myles:

Well, so, you know, AI is helping to create lab-grown meat. Okay, think about this: right now, in Singapore, you can buy a chicken nugget that has never had a heartbeat. My whole point is that we're going to start seeing lots of these re-engineered things, which is different; it's kind of a separate vein from GMOs. And this engineered stuff could be highly nutritious and, you know, highly effective. I mean, I don't think it'll ever catch on here, but if you go to a third-world country, it can be a big deal. Yeah, but do people get it? They don't have the data. Yeah, exactly, they don't have all the data. I mean, maybe there's a particular group of people whose genetics make them intolerant to a certain type of thing, you know? So we create all this food, give these people all this food, and they're all getting sick.

Stacy Whitenight:

I mean, Americans would never do that, like give people blankets with

Albert Myles:

never do that, ever. So we're seeing all kinds of things where this stuff is spreading out. But again, awareness, you know, instead of being scared of it, let's say, well, it's here. So we have, here

Stacy Whitenight:

right? What can we do? You know, we talked a little bit earlier about the screening process, when we're thinking about learning from growing pains, and when you have data that's missing and you're trying to get someone the medical assistance or medical care they need, and you're relying too much on AI. You know, if the computer is reading your radiology results, how are we going to make sure that people aren't falling through the cracks?

Albert Myles:

Well, so you and I had that talk about how, given a certain type of radiology report, computers can be extremely effective at it. But maybe they're really effective at reading, you know, a middle-aged white guy's radiology report, but not so effective at reading a younger Black woman's chart, you know what I mean? So it's sort of like, okay, you're sitting here advertising this: it's 98% effective, it's higher than anything else. And you don't even put in small print, "for middle-aged white men," right?

Stacy Whitenight:

It's 98% effective, 2% effective for the other category.
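The "98% effective" trap that Albert and Stacy joke about is easy to make concrete. Here is a hypothetical sketch, with invented numbers: when the test set is dominated by one group, a model can report a great overall accuracy while badly underperforming on an under-represented group, and the headline number never shows it.

```python
# Hypothetical accuracy audit; every number below is invented for
# illustration. 1 = correct radiology reading, 0 = missed finding.

def accuracy(results):
    return sum(results) / len(results)

group_a = [1] * 98 + [0] * 2   # 100 well-represented cases: 98% accurate
group_b = [1] * 3 + [0] * 2    # only 5 under-represented cases: 60% accurate

overall = accuracy(group_a + group_b)

print(round(accuracy(group_a), 2))   # 0.98
print(round(accuracy(group_b), 2))   # 0.6
print(round(overall, 3))             # 0.962, which hides group_b entirely
```

A real evaluation would break accuracy out over every demographic slice, not just the aggregate; the aggregate is what ends up on the marketing page.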

Albert Myles:

So you have these hospitals buying this software, saying, yes, we're going to do great, and they're expecting all these great results. And like you just said, people are falling through the cracks. So, a lot of people of color now are in med school, and a lot of people of color are doctors. Okay, get interested in AI. Just get in there. Get in there for the training, get in there for research. Every time you see something in that whole area, step up, because this is all our lives at stake. Yeah.

Stacy Whitenight:

I mean, that always sticks out to me, that AI really has immediate implications in terms of life and death, and where we really need a lot of balance going forward. Yeah, my next-door neighbor is a radiologist, and he reads these. So he's like, oh, God forbid the day they come over here and I have to go set up their printer.

Albert Myles:

Well, the thing is, a lot of the stuff that I've read, some of the books and things, they all come forth and say that AI is a tool to be wielded. So maybe, instead of handing everything over, take your radiologist neighbor: imagine that the actual files he's reading are just so much better and so much more clear. He still has the training, but the information he's given is so much clearer, easier to understand, and easier for him to interpret. Then that's where the magic comes in. So I don't want to sit here and poo-poo AI and stuff. I mean, there is definitely magic to be had. If you have

Stacy Whitenight:

AI can do amazing things. We just need to make sure that we're being inclusive and diverse in our data, right, and that we have safety nets for people, especially women, when we're talking about this stuff, and that we're balanced.

Albert Myles:

It's funny, my mom just discovered WebMD. No, no, no. And she's a bit on the hypochondriac side. I told her, I said, just take it all with a grain of salt, lady. But I mean, WebMD is a good cautionary tale, right? Because if it's done right, someone can research something and get pretty good information on what's wrong: whether they need to immediately go to the hospital, or they need to schedule a doctor's appointment, or if they could just wait till their next checkup. If they can get really good information about that, then great. But right now, it's like, on WebMD, everything leads to cancer, right? Exactly. So everything leads down that road. But again, you know, I know the companies that work with WebMD; I know people that went over there, and they were using a lot of AI, experimenting with trying to figure out, you put in symptoms here and it kind of gives you an idea. But if all those symptoms are trained on, again, John Doe, we're gonna have a problem. Because when my mom looks something up, she's looking at data based on John Doe. Right. And that's not good.

Stacy Whitenight:

Yeah. And again, it's not malicious intent, I really don't think it is. I just think it's not thinking about it, or, you know, what you said earlier, like training people the right way.

Albert Myles:

So, or the assumption that everyone is the same, that everyone has this Eurocentric American lifestyle, and using that as your norm. That's something that these companies have to re-evaluate. That's not the norm. I mean, that's actually the minority in this country right now; in the world it's definitely a minority, but even in this country, that's now a minority of people. And I don't think tech has wrapped its head around that. I think they're starting, don't get me wrong, I think they're starting. And I honestly think they're playing catch-up. Yeah, they're trying, but they're so far behind. And I think it's gonna take awareness, I think it's gonna take people applying. Like, how many people of color do I know, personally, that had no idea who the company we work for was? They didn't even know it existed. So, you know, everyone knows Apple, and everyone knows Google, and everyone knows Microsoft, things like that. But how many people know who NetApp is? How many people know who Cisco is? How many people know Boston Dynamics, who makes the robots? So we've got to get everyone knocking on the doors of these companies, asking. Because they can do all the work they want, but if people aren't trying to get in, it's not going to do any good. I

Stacy Whitenight:

mean, we've included a lot of resources at the bottom. And please include the course that you're doing on Coursera, too.

Albert Myles:

Yeah, yay. And

Stacy Whitenight:

People need to go start knocking on doors. Let's go work for these companies. You know, people inside the companies, I think, are screaming for diverse candidates to come in and be included in the process.

Albert Myles:

I mean, just in both of our areas, there's no one; it's like we're working on these islands by ourselves.

Stacy Whitenight:

But it'll get better. We know a lot of people, especially marginalized groups, are screaming for more representation, top to bottom. And we don't want slow change. So start knocking on the doors, people.

Albert Myles:

Yeah, and tech still isn't, like, it's not as sexy as being a lawyer or being a doctor or something like that. But my gosh, it's so important. I mean, still, if you go to a typical university, I still don't think you're gonna see a lot of people of color, or even women (women a lot more now), but you're not gonna see them in the high-tech stuff. They talk about STEM, but to a lot of people, STEM is biology and chemistry and physics and stuff like that. Okay, that's nice, but hey, you could do chemistry with a computer.

Stacy Whitenight:

So, alright, Albert, thank you so much for joining me today. And teaching me all of these things about AI. And I do have hope, because

Albert Myles:

I mean, you can just see the kind of stuff that we're doing in our company. It's not like they're not trying; they're opening it up, and the effort is there. And that's all we can ask for. Yeah. Well, have a good rest of your day. Stay safe. Take it easy.

Stacy Whitenight:

It is an honor that you've decided to spend some time with us here today. If you're enjoying this as much as we are, we'd be delighted if you left us a review. Don't be shy: head over to thebossybees.com, drop us a personal note, find more podcast episodes, and read our blog.