Sloanies Talking with Sloanies

Keri Pearlson

October 21, 2019 MIT Sloan Alumni

Christopher Reichert, MOT ’04, welcomes Keri Pearlson, Executive Director of Cybersecurity at MIT Sloan. They discuss the research MIT Sloan is doing in cybersecurity, including an effort to drive cybersecurity behaviors. Learn more at cams.mit.edu. 


Thanks for listening! Find more episodes on our website Sloanies Talking with Sloanies. Learn more about MIT Sloan Alumni on X (Formerly known as Twitter), Facebook, Instagram, and LinkedIn.

To support this show or if you have an idea for a topic or a guest you think we should feature, drop us a note at sloanalumni@mit.edu

© MIT SLOAN SCHOOL OF MANAGEMENT

Speaker 1:

[inaudible],

Speaker 2:

welcome to Sloanies talking with Sloanies, a candid conversation with alumni and faculty about the MIT Sloan experience and how it influences what they're doing today. So what does it mean to be a Sloanie? Over the course of this podcast, you'll hear from guests who are making a difference in their community, including our own very important one here at MIT Sloan.

Speaker 1:

I'm your host, Christopher Reichert. I'm with Keri Pearlson, executive director of the cybersecurity program at MIT Sloan. Welcome, Keri. Thank you. So today we're going to do things a little differently than we've done before. You're our first non-alum, uh, on the podcast. So welcome. Thank you. So, before I get to your work here at Sloan, I just want to give a quick bio of Keri for our listeners. Keri holds a doctorate in business administration in MIS from Harvard Business School, which we won't hold against her, an MS in industrial engineering, and a BS in mathematics from Stanford. So she's no slouch. She's held positions in academia and industry, including at Babson College, UT Austin, where I almost went for my undergrad, Gartner's Research Board, CSC, and AT&T, and she began her career at Hughes Aircraft Company as a systems analyst. Finally, she was the founding president of the Austin Society for Information Management, SIM, and was named the 2014 national SIM Leader of the Year. So welcome. Thank you. Good to be here. Well, so where do we begin? You know, the topic of cybersecurity is so broad and cuts across so many different areas of people's lives. Um, so I guess we want to focus on the research that you're doing at Sloan, but not so much the technical side, although I'm sure we'll dip into that a bit. Um, but really on the cybersecurity culture that you'd like to see more companies partake in, and also on how that affects managers and organizations and how they operate. Um, so, you know, it's ironic, I think, given that back in the early days of the internet, we all sort of thought that the self-regulating trust and openness and transparency would really herald a new utopian environment where we would not have the issues that we're facing today. And obviously those notions were rather naive. Right.
Um, so I guess the question is how secure are we? Are we

Speaker 3:

today? It's funny you ask it that way. That is the holy grail question today in cybersecurity. So let me just take a step back and tell you a little bit about who we are, because that might, uh, structure some of the questions and the way we have our discussion. So we're a research consortium at Sloan called Cybersecurity at MIT Sloan. We were founded by Professor Stuart Madnick, who many folks who are alums will know, who's been at MIT for, he likes to say, longer than recorded history, although I'm sure it's not that long. Uh, and he is one of the founders of, or one of the early researchers in, information security. So our research at what we call CAMS, Cybersecurity at MIT Sloan, looks at the organizational, managerial, governance, the business side of cybersecurity. MIT, of course, is a technical school. There are lots of places you can look at the technologies, and those are really important questions to answer. But, uh, we think equally important are the people questions, all the things around: how do you know how secure you are? How do you build a culture of cybersecurity? How do you manage the dark web? How do you know your supply chain is secure? Those aren't technical questions. There might be a technical solution, but, uh, research shows that anywhere from 50 to 90% of the breaches that a company experiences are people-oriented. It's somebody clicking on a phishing email or somebody leaving the keys under the mat. No matter how secure your house is, if you leave the keys under the mat, you've opened up a vulnerability. So there are lots of opportunities for the bad guys to get in, if you will, just because people do either overt things they know they shouldn't do or, more likely, inadvertent things that they didn't know they shouldn't do. So that's sort of the foundation of our research.
What we've noticed over the past couple of years is, of course, an increasing interest not only in how we keep ourselves more secure, but boards of directors now want to know the exact question you just asked me: how do we know how secure we are? How secure are we? And in many cases we can tell them numbers. We can tell them how many people failed the phishing email, or we can tell them how many endpoints we have and how secure they are. But just like in the old days of IT, those are technical answers. They don't necessarily tell you how secure you are. Those are answers that we can calculate, and so we do, and we use those as measures of an answer to how secure are we. But the real question is, how do we know how secure we are? And that's one of the questions we're researching right now.

Speaker 1:

Yeah, I can tell you from firsthand experience as a former CIO that the board was definitely asking that, at a fairly high-profile organization. And I would give some of those statistics: here's how many denial-of-service attacks we rejected, and here's how many times we updated our security protocols on the exterior endpoints. But I guess, I mean, what can we do to increase security as individuals? It's surely not just crazier passwords, right? Here, I'll show you my password, is that crazy enough for you? So, the wisdom on passwords is the longer the better, right? You can't remember long passwords, but you could remember a sentence. A sentence, that's right. And if you have a sentence, I'm sure as a CIO you coached your,

Speaker 3:

uh, your employees, your team to do that. But the longer the better. And if it's a sentence you remember, uh, you know, I work at MIT or something like that. You get all those,

Speaker 1:

those extra words, extra letters, extra characters, all that. You can remember it. Yeah. But the answer to your question is really at the heart of some of our research. And one of the projects I'd

Speaker 3:

love to talk to you about is our culture project, which happens to be the one I'm leading, so I know it best of all the things on our agenda. We have a research project, if you will, where we're trying to drive cyber-secure behaviors. Those are behaviors that you do because they're part of your job, and the things you do just because you're a good member of the community. So there are things that aren't necessarily written in your job description, but they're the right thing to do because of common sense, but also kind of, almost, peer pressure: we feel some responsibility for this organization because this is the organization we're part of. We call them in-role and extra-role behaviors, but we want to drive cyber-secure behaviors. We think that most organizations, most managers, think the way to do that is training: we send you through a training class. Everybody has that. You go to training, you learn how to set your passwords and what you're supposed to do and not supposed to do. You do it once a year because it's compliance training, and then you quickly forget most of it. You remember maybe something that you do during the year, but next year you go click that button, take that compliance training, and, you know, the university can report back that 95% of the people in the university have taken the compliance training. Again, not how secure are we, and not necessarily driving behavior. Our model takes a little different approach. We think that what drives your behavior is what you value. If you personally think cybersecurity is important, and building security into our organization is important, you're probably going to have more secure behaviors. So what our model postulates, or hypothesizes, I guess, is that the behaviors of the managers drive a culture, which we define as the values, attitudes, and beliefs that drive cyber-secure behaviors.
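The longer-is-better password advice from a moment earlier can be made concrete with a little arithmetic. Here is a minimal sketch in Python; the pool sizes are illustrative assumptions (94 printable ASCII characters, a 7,776-word Diceware-style word list), not figures from the conversation:

```python
import math

def entropy_bits(pool_size: int, length: int) -> float:
    """Entropy in bits of a credential drawn uniformly at random:
    `length` symbols, each chosen from `pool_size` possibilities."""
    return length * math.log2(pool_size)

# An 8-character password over ~94 printable ASCII characters:
short_pw = entropy_bits(94, 8)      # roughly 52 bits

# A 5-word passphrase from a 7,776-word Diceware-style list:
passphrase = entropy_bits(7776, 5)  # roughly 65 bits

print(f"8-char password:   {short_pw:.0f} bits")
print(f"5-word passphrase: {passphrase:.0f} bits")
```

The point of the sketch is that a memorable sentence of common words can carry more entropy than a short jumble of symbols, which is exactly the "you can remember a sentence" argument.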
So our research is looking at different companies, one of these projects is looking at different companies, and how do they build in, or how do they change, values, attitudes, and beliefs. And we believe that happens at three levels. Of course, at the senior management level: if you as the CIO, or even the president of the organization, do things that are cyber-secure, you're going to be sending a message that it's important to you. You make it a priority. You talk about it in your daily work. When you as the president or the COO, any senior-level leader, particularly a non-technology leader, a general manager, send a message through whatever your means of communication are, you're going to be sending a message that it's important to you. And that's going to show the value that you hold, which research has shown drives the values and the attitudes and the beliefs of others in the organization. You can do things like start a meeting with a security moment. You can say, you know, let's just take one minute. We know where the bathrooms are, we know how to turn off our cell phones, but let's also talk about how to be secure. What's one thing we can do to be more secure? That kind of behavior drives values, attitudes, and beliefs. You know, I'm trying to imagine some of those things in a previous organization. People would have looked at me like I had two heads or something. Like, what are you wasting your time for? Why are you making us all paranoid? Well, I guess you could look at it that way, but cybersecurity is in the press so much these days, and it gets down to people's individual concerns. I know when, for example, the Equifax breach happened, my girlfriends called me up and said, what do I need to do? You work in cybersecurity, what do I need to do? And they're not cybersecurity researchers. They're my friends, my social friends. Uh, they're as far from IT
leadership as you could possibly be, but it's on their minds. Yes. So I think today it's a little different world than maybe it was even five years ago, or certainly ten years ago. Absolutely. Everybody's kind of wondering about that, number one. Number two, we're all carrying these computers in our pockets. We have iPhones, we have Samsung phones, we have, you know, Google phones, we have computers in our pockets. So we're all a little bit more technologically savvy than we were a while ago, so we're all thinking about technology differently. It's up to the technology leaders in an organization, I guess, as well as the senior leadership, to take that on. You know, it may not be a profit center, but it certainly will mitigate being a loss center. Right. Well, I think that's true. I think people think about technology differently. And the other piece of our research, which we're just starting now, is to look at the maturity of your culture. So if you think about it, pretty much everybody thinks their technology will keep them safe. We can put in firewalls, we can build technology barriers, and you and I as technology people know that that's good only to the extent that it's good, right? We don't know where the next attack is coming from, and, as Stuart likes to say, the bad guys are getting better faster than the good guys are getting better. So we don't know where the next attack is coming from. The technology can only go so far. The next level of maturity is the CIO, the technology leadership: we think they're going to tell us what to do and they're going to keep us secure. And that's true, again, to a point. Um, but the next attack could be something that they didn't even know was on the horizon. So then it's managers: people think their managers will tell them what to do and help them be secure.
And that's the example I gave a minute ago of executives building it into their meetings, their security moments. But to me, the holy grail is where each of us takes it personally as our responsibility to be more secure. You know, in an industrial control environment, everybody takes safety very seriously. Physical safety seems obvious, right? It's not necessarily your job to be safe, but it's part of the culture. It's part of what you have to do. That's what you do as a good citizen and to keep yourself safe. Why don't we have that same kind of thinking in cybersecurity? That's really where we're trying to get to with our research. We think it's everybody's individual responsibility. Construction, for example, is certainly ahead on that. They have their signs up for the public to see: X number of days without an accident. So it's really built in; every time you come on and off the site, you're kind

Speaker 1:

of aware that safety is an important aspect.

Speaker 3:

Well, and you see those posters: my child is expecting me home tonight, you know, I'm not going to do anything that gets in the way of that. And that really gets at the personal side of it too. Personally, I want to be safe and get home; I've got a family that I care about. But in cybersecurity we don't quite see the same mental leap, because, first of all, it's hidden, right? You can click, and we actually have seen this happen. We've seen phishing emails sent out, we've seen executives who know it's a phishing email click on it. And we've asked them why, and you know what their answer is? Well, I wanted to see what would happen. Well, you know, in an industrial control environment, you wouldn't see a sign that said don't touch this boiler or chiller and then see people go up and touch it to see what happens, right? You don't do it; you know the physical consequence. But when it comes to technology, it's kind of hidden, so people aren't sure really what's happening.

Speaker 1:

And that might be a bad consequence of having too much faith in technology, thinking that, well, I've got an antivirus, or I've got something in my browser that might protect me,

Speaker 3:

right. So if you're of the mindset that the technology is keeping me safe, you wouldn't have any problem clicking on a phishing email that might do something bad. But if you feel it's your responsibility not to, you're more likely to report that phishing email to the security professionals. It's like the basic business model of bungee jumping, trusting the cord behind you. That's right. How much do you trust that bungee cord behind you?

Speaker 1:

So I think, um, one of the things that's come to the fore is that it's not just a 400-pound guy sitting on his bed, you know, somewhere, as a famous tweeter once said. Right, right. You know, there are state-sponsored security breaches, or attacks. Um, and I thought it was really interesting that one of the research areas you're looking into is using knowledge of the attacker's business model. And I was particularly intrigued by the Porter value chain model. I find it fascinating to think of the dark web attackers not just as that, you know, 400-pound basement person, but actually as very sophisticated, very organized, highly funded organizations out there. So tell me how that fits in. How does Porter's value chain fall into that?

Speaker 3:

Yeah, sure. Thanks for asking. So this is another research stream we have going in our consortium. We're looking at the business of the dark web, if you will, or the ecosystem of the dark web. So what we've noticed is exactly what you said: the hacker of today isn't necessarily a hooded, tattooed, pierced, uh, fringe individual sitting in a dark room trying to break into your company. It might be state-sponsored, or it might just be some business person with bad ethics, somebody who just wants to steal money or, you know, somebody who wants to do mischief. And they don't have to be a hacker anymore. There's a whole marketplace out there on the dark web of services, and you or me or any other business person could go out on the dark web with the right credentials and the right knowledge and pull together all the components that are necessary to build an attack vector. And if you don't even know exactly what you're doing, there's a service that'll help you build what you need and tell you what pieces you need to have. They're not necessarily well-funded state agencies or governments. This is the ecosystem. So once you start to look at the dark web as a well-organized, well-structured ecosystem, and we think Michael Porter's value chain model is a nice way to look at it, then all of a sudden you start to see a different way to protect yourself. One way might be that we might start seeing activity around a particular service, in which case we might be able to make some assumptions about the kind of attack vector that might be coming. Another might be that we might disrupt some of these services if we see activity around them. And that would give law enforcement, or the other companies that help try to keep us safe, some indication of where to put their resources.

Speaker 1:

So when you say service, you're not talking about banking services, you're talking literally at the sort of port-level services?

Speaker 3:

No, we're talking business as a service, anything as a service. So there are examples, and not all of the services were designed for bad guys. For example, there are services that take Bitcoin, and that's not necessarily designed for a hacker, but in the hands of a mal-intended person it could be used as part of an attack vector, a ransomware attack, for example. Or sending out phishing emails: companies send those out all the time to try to figure out if their company is safe, but in the hands of somebody trying to launch an attack, it could be knocking on the door to see where the keys are under the mat, where the person is who's not vigilant. So these services are well-formed components of the value chain, and that's why we like the value chain model.

Speaker 1:

And I guess there's something that's always confused me: the internet and the whole technology stack is a human construct. It's defined by protocols, and you know very much about it. And so in some ways we kind of know, I guess this is the confusing part for me. Phone numbers, for example, are assigned by an authority of some sort, a telephone company or whatever it may be. Same thing for IP addresses: somebody knows who assigned that IP address to somebody else. Now, I know there are all sorts of ways of, you know, Tor and hiding yourself in there. But isn't one of the ways to approach actually finding the source of the problem to unpack some of the way the whole internet is structured?

Speaker 3:

That's a really good question. Um, I think that would give one avenue of perhaps building defenses and protection. I'm not sure that it's easy to unpack the internet the way you're talking about. It seems to me that part of the value, if you will, the positive side, is that it's so distributed by its very nature. And actually, my brain is going another place, which is one of our projects on blockchain, which is also sort of the same concept, a distributed ledger in that case, but distributed information. To actually change something when it's that distributed is really hard, perhaps impossible. Perhaps technology in the future will be able to do that, but at this point in our evolution of technology, I personally don't know of any solutions for it. I thought when IPv6 came out that it would be kind of like, oh, let's add the security layer in there from the get-go, like a fingerprint. But I think we do ourselves a disservice when we think that technology is going to keep us safe. So there you go, I fell back into the trap. Yeah, see, there you go. I think it helps, and, you know, we're seeing a lot of advances in AI and ML also trying to unpack the information that we're seeing and trying to understand where the next vulnerability might be. But no matter, as I said before, no matter how good we get, we have to keep all the doors safe. The bad guys only have to find one open door. Right. So it just seems like an almost impossible task to keep all the doors safe. So, I've gone down the technical path, my mistake. No, not at all. I want to bring it back to the culture side of things. Um, so what's the kind of work that you're doing with organizations, and how do you recruit them?
How do you choose good candidates for it, whether ones with bad culture, perhaps, and I say that with air quotes, and others? And how do you measure success in the research, or even the findings? So, the culture project started about three years ago, two and a half years ago. CAMS has an annual conference, and in July two years ago we had a workshop where we brought together all of the companies that support us and we asked: how do you keep your people secure? That was just the question we asked in the workshop, and we broke into small groups and everybody talked amongst themselves. Then we reported back, and we realized that there were some patterns in what we were hearing from the hundred or so people in the room. And so that launched the project. Then we reached back out to our companies who are part of our membership base, and we've done some case studies on them. We've published some papers on them. Many of these are available on our CAMS website; some are publicly available. So if any of your listeners are interested, they can go to our public website and take a look at some of these papers. Give us the URL. Sure, it's cams.mit.edu. CAMS is Cybersecurity at MIT Sloan, C-A-M-S. Great. Uh, so we started out with what I would say are leading practices. We don't really know what's best, but we see some leading practices, and I can give you some examples from that if you'd like. We've documented several companies that have some interesting leading practices. We've then created a survey, and the survey has been given out to 150 to 200 different companies. We're in the process of analyzing that data right now to see if we can validate some of our hypotheses. And we are still collecting data right now. We're really interested in global issues.
So how does local culture play in, US culture being different from Brazilian culture, being different from Italian culture? We think local culture plays into how you're going to view cybersecurity. We think things like regulation, not just local, but any regulation, matter too. You know, if you're a bank, you're going to be subject to different kinds of financial services laws than, say, a consumer products company. If you're operating under GDPR, you're going to be much more aware of privacy issues than, uh, well, pretty much everyone is today, but it's going to be different for different companies based on that. So those are going to drive the values, attitudes, and beliefs also. But really interesting is your question about how you know you're successful. We've asked that of every company we've talked to, and nobody has a good answer. In fact, our case studies, which we use for teaching, end with that question: what should we use to measure? How do we know the right way to tell if we've been successful? We can't tell you we're successful just because we haven't had a breach, right? Because, again, just because nobody fails the phishing exercise doesn't mean you're more secure. We don't know the answer to that question yet. That's another research stream we have. We would love to have people fund us on that, by the way. If somebody who's listening is interested in answering that question, they should get in touch, because we really want to figure out that answer. How do we know, how do we answer the question, how secure are we

Speaker 1:

And how do people get more involved? Would they approach you to have their company be part of the research, or are there things that professionals can do to become better at it themselves, whether on the culture or the technology side?

Speaker 3:

Yeah, so that's a really good question, and thank you for giving me an opportunity for the commercial. CAMS is, well, we're like every other research center at MIT, we're self-funded. People join us, companies join us. So we have a number of companies that are sponsors of our research. Any of your listeners who are interested in that can again go to our website at cams.mit.edu to get information, or reach out to us. We also do sponsored research, like any other part of MIT. So if somebody has a particular question they want answered, we are totally open to seeing if any of our research answers it, or if they're interested in sponsoring research that would answer it. Of course we'd love gifts too, so anybody wanting to donate money, we'll take that. But individually, how can you raise your awareness? We want to solve that problem too. So we're partnered with Sloan Exec Ed programs, and there are two that are teaching from our research. One is an in-person program offered three times a year, and one is an online program offered, I think, more than three times a year. You can do the online program from wherever, and we have people all over the world who do that. We also have Sloan graduates and Sloan students who regularly just hang out with us in one way or another. And somebody who's really interested in getting involved can reach out, and I can talk to them about that also.

Speaker 1:

Excellent. Wonderful. There's so much to cover. I was thinking about IoT security and endpoint issues. You know, with so many devices being smart and removed from the premises, or put into circumstances that are impossible to predict. A phone could be lost, a computer could be lost; it could be breached in many different ways, at an airport or whatever the case may be. But even thinking on an industrial level, things are designed in one way and they're put into environments that are remote from the provider, and so, again, that control is an issue. What's the research that you're doing on endpoint cybersecurity and IoT components? Since I think that's really going to be a huge growth area for industry.

Speaker 3:

So, uh, we have another research stream. We have about 35 projects in various forms, so if your listeners are wondering why we're covering such a broad map, I should say that we have so many different projects that we're doing. We have a few priorities, but we mostly look at the strategic, governance, managerial, and organizational issues around cyber. And endpoint security is another one, because there are technical solutions, but the question really becomes a managerial question: how do we keep our endpoints secure? Most organizations put a defense-in-depth strategy in place, where they have multiple levels of security, if you will. One level, or maybe multiple levels, is around policy and people; some levels are the firewall and the technologies that we normally think about for security; and securing the perimeter is another piece of that defense-in-depth strategy. And endpoints, of course, are important. You know, as all the lights get connected, and, as Stuart likes to talk about, his toothbrush, I think his wife has a toothbrush that's connected to the internet. Now, you may wonder, why would you want to connect your toothbrush to the internet? Right? But there could be benefits to your dentist being able to monitor your habits and help you have better oral health because your toothbrush is connected. So we're pretty much seeing everything connected at some level. Then the question becomes, how do we make sure they're secure? It's really nice that they're all connected; we all see the benefits. But the minute it's connected, it opens up a vulnerability. So the particular research you're referring to is looking at whitelists versus blacklists. The space on these IoT devices is much smaller, the storage area, the amount of code you can put on there is limited. Whatever it is, right? Exactly.
So that project is looking at how we whitelist: instead of saying what shouldn't be accessing us, let's just make a small list of what should access us. And that's a management question, you know, how do you want to manage that? Of course, there's a technology component, and our researcher is building prototypes to show how to use whitelists versus blacklists. But the other piece of this is blockchain. People seem to think that blockchain is the answer to a lot of this, that you can use the blockchain as a means for securing endpoints. We have mixed thoughts on that right now. We do think the blockchain technology itself, just like other distributed technologies, has some play in the security world, but it's the system around it that gets in the way. I think one of our researchers wrote a paper about the 70 ways that blockchain's been violated; I think that's in one of the popular press magazines these days. But it's not secure in itself, because it itself sits within a system, and then all of a sudden you've got to be making sure that all the aspects of the system are secure. So those are two of the avenues we're exploring with our endpoint security. The other piece that plays into this is what I mentioned earlier, IT versus OT, operational technology versus information technology. We started out with a critical infrastructure focus, so most of the organizations we worked with were part of energy or oil and gas or telecommunications, critical infrastructure companies. And in those organizations, the operational technology people have a very different approach to information security, if you will, than the IT people, the stuff like you and I used to do in offices. So they already think about security in a different way.
And there are lessons, a few of which we've talked about, from the physical safety world that apply to the cyber world. We also have another research stream looking at cyber safety in OT environments. We have quite a few projects looking at energy delivery systems and keeping operators up to speed, and looking at the whole thing as a system, using system dynamics, of course a popular approach from MIT, to try to keep the whole operational system in industrial control systems safe.

Speaker 1:

Yeah, I was thinking about the way Tesla pushes out updates to vehicles, automated, right? So in some ways the end user doesn't have control; I suppose they may have given control to Tesla when they first got the car. But I can see how that would be a huge risk if you're relying on it, and it also relies on all sorts of other systems: GPS to keep you in your lane, distance sensing, radar. There are all sorts of ways there could be shenanigans.
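For what it's worth, the usual defense against a tampered over-the-air update is that the vendor signs each release and the vehicle verifies the signature before installing anything. This sketch uses an HMAC with a hypothetical shared key to stand in for the public-key signatures (e.g. Ed25519) that real update systems use; the accept-or-reject decision is the same:

```python
# Sketch of signed-update verification: the device installs an update
# only if its signature matches what the vendor's key would produce.
import hashlib
import hmac

SHARED_KEY = b"hypothetical-device-key"  # illustration only; real systems use key pairs

def sign_update(payload: bytes) -> str:
    """Vendor side: produce a signature over the update payload."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify_and_install(payload: bytes, signature: str) -> bool:
    """Device side: install only if the signature checks out."""
    expected = sign_update(payload)
    return hmac.compare_digest(expected, signature)  # constant-time compare

update = b"firmware v2.1"
good_sig = sign_update(update)
print(verify_and_install(update, good_sig))      # True: genuine update, install
print(verify_and_install(b"malware", good_sig))  # False: payload was swapped, reject
```

The constant-time comparison matters: comparing signatures with `==` can leak timing information that helps an attacker forge a match byte by byte.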

Speaker 3:

Yeah, autonomous vehicles are another avenue; we have a researcher looking at that. We actually haven't dived into Tesla per se. We've been looking more at unmanned vehicles used in mining, military, and other applications. It's the same technology, the same idea, right? Agriculture? Agriculture, yeah. So once these devices are unmanned or computer controlled, they can be controlled to do bad stuff just as they can be controlled to do good stuff, to avoid pedestrians or to hit pedestrians. It's a matter of code. It's a little scary. So we want to make sure we bring the MIT brainpower, if you will, to answering those kinds of questions too. But yeah, it is an issue when you start to look at distributed updates. You see that even in the IT world; I'm sure you get regular updates for your laptop. I do on mine. We just click and say yes, because we assume they're the right ones. You know, it'd be really insidious if somebody sent me an email that said, by the way, here's an update to your laptop, and I clicked on it and it was actually some sort of malware. Right, and that's the sort of phishing thing we talked about a bit earlier. And there are things people can do. I know there are the tests that you do, that people pass or fail, but are there some obvious tips we don't know about that you can pass on? Well, I'm not sure you don't know about them, but let me tell you a few tips that we regularly tell people. First of all, a phishing email is usually one from somebody that's either generic, like: I have a great uncle in Nigeria who just died and left me $10 million, and I want to give you a piece of it, send me your bank account. I think most of us know that's fake.
There are stories, by the way, of people who string them along for a long time. Yeah, that's true, that's true. You could have a lot of fun with them. Yeah. But there are people who unfortunately do fall for that, and they're usually elderly, or uneducated, or going too quickly, just getting through their email quickly. So obviously there are things at one end of the spectrum that we all recognize, and things at the other end of the spectrum that we would have a really hard time recognizing. And there are stories, you can do Google searches on some of these, where you get an email that looks like it's from somebody you know. We call these spear phishing, because the attackers know something about you, so they can write the email in a way that makes it sound plausible. We've heard stories and documented examples of executives who were out of touch, and I'm using air quotes, sending an email to somebody: please wire $10 million, and don't tell anybody, because I'm working on this secret deal in China and I need this money tomorrow. And it turns out that was spear phishing. So those are two ends of the spectrum, and of course there's everything you could imagine in between, and things you can't imagine in between. We can't even think of them all. So how do you stay safe? First of all, anything that looks suspicious, investigate. Don't just quickly look at your email and answer things. Don't quickly click on links in your email that might be suspicious. Err on the side of suspicion in your email, only because right now the technology can't keep up with all of the bad emails that do get through to your system. So that's number one. Number two, if there's a link in your email, hover over it. Most email systems will tell you where that link is going before you click on it. It might look like Amazon but be something else. Look carefully at it. It might say microsoft.com dot something else. Right?
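The hover-over check, looking at where a link really points before clicking, boils down to extracting the true hostname and testing whether it actually belongs to the domain it appears to be. A simplified sketch, with made-up URLs:

```python
# Sketch of the "hover over the link" check: pull the real host out of a
# URL and test whether it genuinely belongs to the expected domain.
from urllib.parse import urlparse

def real_host(url: str) -> str:
    """The hostname the browser would actually connect to."""
    return urlparse(url).hostname or ""

def belongs_to(host: str, domain: str) -> bool:
    """True only if host IS the domain or a genuine subdomain of it."""
    return host == domain or host.endswith("." + domain)

legit = "https://www.microsoft.com/account"
lookalike = "https://microsoft.com.evil-site.example/login"  # hypothetical phish

print(belongs_to(real_host(legit), "microsoft.com"))      # True
print(belongs_to(real_host(lookalike), "microsoft.com"))  # False
```

The lookalike fails because a hostname is read right to left: `microsoft.com.evil-site.example` is a subdomain of `evil-site.example`, no matter how much of `microsoft.com` it puts up front.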
You know, so it looks like it's Microsoft, it looks like it's Google, it looks like it's Amazon, but if you look carefully, it isn't. We regularly send emails amongst ourselves saying, I got this one, do you think this one's real or not? And we jokingly try to catch each other. None of us are clicking on them; we use them as discussion amongst ourselves. But hover over it. And then, as a last resort, send it to your IT people. Almost every company has an IT mailbox where you can send spam to be investigated before you click on it, or call the person it's from and see if they really sent you that email. I get this one email regularly from a former colleague from my SIM chapter. His email for some reason is consistently hacked, and I keep getting these emails. The last one said: I need some help, I need a debit card from Walmart, I need $100 worth of stuff, could you help me? Could you go to Walmart? I'll pay you back when you bring it to me, or something along those lines. And I called him and said, did you know? And he didn't know, so he didn't even know to start dealing with it, because it's just silent. Yeah, so the silence. Exactly. Silence is the deadly part, silence and fear: fear, uncertainty, and doubt, right? Don't let that get in the way. If you see something that looks suspicious, it's that "see something, say something" idea. If you see something, ask the person; bring it to the surface. Don't just assume it's good or bad, right? The more we talk about it, the more aware we become. Absolutely. Well, before we go, I have two questions for you. What's your personal definition of success? And it doesn't have to be about work; it could be work, it could be anything else.
You mean success for me? How do I feel successful? I feel successful when I see that aha moment in somebody's eyes, that they get this new idea we're talking about and they want to engage in discussion about how to make it even better, or how to implement it in what they're doing. I think I'm a teacher at heart. So bringing these new ideas to the surface, communicated in a way that people understand, something they hadn't thought of, perhaps outside the box, but that makes a difference in the way they view the world or go about their daily business: for me, personally, that's success. What's the last thing you really geeked out about? Or maybe, given your work, besides this discussion? I geek out regularly [inaudible] opposite of that. The last thing I geeked out about: well, personally, I have a textbook that I've written. It's in its 20th year; the 20th anniversary is this year. Thank you. We're updating the seventh edition, and I geek out regularly about that. And an ironic story: when I wrote the first one, back in 2000, I was not doing this cybersecurity research. I was a researcher in IT leadership generally. The book is about the managing and using of IS, or information systems. It's written for MBAs who need to be knowledgeable participants in IT discussions, because of course back in 2000, you might remember this, business people just thought we were taking care of all the technology for them, and we'd make decisions that impacted their business opportunities. The textbook was originally based on a course I taught at the University of Texas at the time. But over the years it's evolved, and in this latest edition the security chapter was super important to me, because now I'm in cybersecurity. So I was geeking out with one of my coauthors about what to put in there.
How do we really talk about cybersecurity? It's more than just awareness and training. I really believe it has to be part of the ether, part of the environment, what we were just talking about earlier with culture. So I geeked out about that.

Speaker 2:

And the good news is, I think people are getting more sophisticated in the conversations they're having. It's far beyond just passwords and firewalls. It really is, as you say; it really needs to be put into the DNA of an organization's culture. So, sounds good. I'd like to thank Keri Pearlson, Executive Director of Cybersecurity at MIT Sloan, for joining us on Sloanies Talking with Sloanies. Thank you. Thank you. And I'm your host, Christopher Reichert. 'Til next time.

Speaker 4:

Yeah,

Speaker 2:

Sloanies Talking with Sloanies is produced by the Office of External Relations at MIT Sloan School of Management. You can subscribe to this podcast by visiting our website, mitsloan.mit.edu/alumni, or wherever you find your favorite podcasts. Support for this podcast comes in part from the Sloan Annual Fund, which provides essential, flexible funding to ensure that our community can pursue excellence. Make your gift today by visiting giving.mit.edu/sloan. To support this show, or if you have an idea for a topic or a guest you think we should feature, drop us a note at sloanalumni@mit.edu.