Bee Cyber Fit: Simplifying Cybersecurity for Everyone

Why Reporting Suspicious Cyber Activity is Everyone's Responsibility: Insights from Yale's CISO Jeremy Rosenberg

May 02, 2023 | Season 2, Episode 4
Wendy Battles, James Tucciarone, and Jeremy Rosenberg

Do you know that everyone at Yale University has a role to play in keeping Yale data and systems safe?

In today's episode, we welcome back Chief Information Security Officer (CISO) Jeremy Rosenberg. 

He reminds us that a simple action we can take is to report suspicious or unusual cyber activity right away - it's everyone's responsibility.

Listen to this episode to get the insider scoop on incident reporting from Jeremy. 

You'll learn:

▶️ The most common cyber threats at the university and the impact of AI
▶️ Why reporting unusual activity helps thwart bigger issues
▶️ Why we have to assume that incidents are urgent even when we're not sure
▶️ What happens when you report a phish using the button in Microsoft Outlook
▶️ What to say/do should you be involved in an incident

Plus our buzzword of the day - "malware"

And lots of other insights, information and inspiration to protect the confidentiality, availability and integrity of Yale data and systems.

*********
Calls to Action:

Ready to build your cyber muscles, outsmart cybercriminals and hone your incident reporting skills?

Here are several simple actions you can take:

  • Read Jeremy's April message about the Bee SAFE, Not Sorry campaign and what you can do.
  • Review Yale's Report an Incident page about how to report suspicious behavior.
  • Register for Bee SAFE, Not Sorry events in April and May.
  • Complete our Bee SAFE, Not Sorry puzzle to build incident reporting awareness. Submit your answer by May 31 for the chance to win a prize pack.

Learn more about Yale Cybersecurity Awareness at cybersecurity.yale.edu/awareness

Never miss an episode! Sign up to receive Bee Cyber Fit podcast alerts.

Jeremy Rosenberg: We are all stewards of the resources that we manage at Yale, and making sure that you understand what your responsibilities are is important, and this is one of them. We can only do so much centrally. We have to find a balance at Yale between providing the freedom that people need to maintain their academic pursuits and being able to keep the place secure from attackers. The only way we can do that is to rely on people to be partners in this. And so, really, everybody is an extension of the security team in that way.

[theme music] 

Wendy Battles: Welcome to the Bee Cyber Fit podcast, where we're simplifying cybersecurity for everyone, cutting through confusing cyberspeak to make cybersecurity simple and easy to digest. I'm one of your hosts, Wendy Battles. 

James Tucciarone: And I'm James Tucciarone. Together, we're part of Yale University's Information Security Policy and Awareness Team. Our department works behind the scenes to support Yale's mission of teaching, learning, and scholarly research. 

Wendy Battles: Ready to get cyber fit with us? 

Hi, everyone. Welcome to another episode of the Bee Cyber Fit podcast. We are so happy you're here. This is the place to come for information and inspiration to keep our Yale community and beyond safe online and out of the clutches of online thieves. This is part two of our series on reporting security incidents at Yale. In part one, we talked a bit about spotting and reporting suspicious cyber behavior, and we also talked about our Bee SAFE, Not Sorry model. 

James Tucciarone: In our episode today, we're talking with Yale's Chief Information Security Officer, Jeremy Rosenberg. We're going to ask him about things like the most common threats at Yale, why it's so important to report suspicious cyber activity in a timely manner, and why we shouldn't be scared or embarrassed about reporting. 

Wendy Battles: Before we jump in, let's hear about our buzzword of the day. 

James Tucciarone: How would you refer to malicious software intended for compromising our computers and devices? Perhaps you thought of the terms virus or ransomware, and you could be technically right in either case. But did you know both would fall under the umbrella of malware? Stay tuned to learn about the differences between malware, ransomware, computer viruses, and a few more. 

Wendy Battles: James, I'm so excited because we are welcoming back our very own CISO, Jeremy Rosenberg, to the Bee Cyber Fit podcast for a conversation about something really important, about reporting, about how we report security incidents. So, James, what do you have to say about all this? 

James Tucciarone: Wendy, I am thrilled that we have Jeremy back again today. I know that last season, he was one of our most popular episodes, and I think some of the information he's going to share with us today will make this one of our most popular episodes as well. 

Wendy Battles: I agree. I bet it's going to be a highlight of Season 2. And, James, you are right. Jeremy's episode was, I believe, the second most popular of Season 1, second only to our intro episode. So that's a great sign, and it means that he is going to be sharing engaging information with us today to help keep us safe. Jeremy, it is great to welcome you back to the Bee Cyber Fit guest chair. 

Jeremy Rosenberg: Sure, but no pressure. 

Wendy Battles: Ah, never any pressure. I mean, if anyone can handle the heat, it's you, Mr. [crosstalk] with the background in radio. 

Jeremy Rosenberg: Yeah, well, I guess, yes, that's right. I guess this is my job to handle the pressure but thank you. I had a lot of fun last time and I'm super impressed with how the podcast has been going, you two are doing a fantastic job and I appreciate the chance to come back and talk about what we do again. 

Wendy Battles: Thank you so much. And I know you have a lot of wisdom to share with the Yale community about security incidents, things that happen at the university. We're going to talk a lot about that. We've got a lot of questions for you because we ultimately want to help our community better understand what we mean by security incidents and how to report security incidents. We know you can fill in a lot of that with some really helpful background. So, we thought it'd be great to start at a really high level and ask you about what you see as the most common threats that occur at the university. 

Jeremy Rosenberg: Unfortunately, the most common threats are not all that exciting. I mean, it's just a high volume of things like phishing attacks. People sending emails, trying to trick you into giving over your username and password. It's actually amazing how many computers are out on the internet just constantly hammering away at other computers, trying to break into them. And so, by some definition, the most common threat is the millions and millions of hits that our firewalls block every day from computers trying to attack us. But I don't even count those anymore because they're kind of low effort. It's really these social engineering attacks, where people are trying to trick Yale users, members of the Yale community, into handing over credentials, downloading some malicious software, or letting them install remote control software on their computers. 

Less common, but also quite concerning, is stolen devices. I think in the last six months, we had 22 stolen laptops, and depending on how well those laptops are configured and secured, those can be problematic. So, those are the run-of-the-mill things. Like I say, it's trying to contain those things so that they don't turn into much bigger deals. That's why we talk about having people report these things, so that we can help make sure a phishing attack they may have answered doesn't turn into something much bigger. 

Wendy Battles: Yeah, so, really being proactive and identifying things and reporting them sooner rather than later. So, to your point, it doesn't become this bigger thing. You mentioned social engineering, and it comes in many different forms. Are there any particular social engineering things you've seen at the university? Any particular phishing attempts that stand out to you that people may or may not be familiar with? 

Jeremy Rosenberg: Well, what's interesting is it's a real cat and mouse game that we play. We're constantly making improvements to our technology and our education. The awareness team does a great job of helping people spot phishing attacks, and that sort of causes our adversaries to up their game. One example of that is multifactor authentication. When I first came to Yale three years ago, we were not requiring multifactor on everything, including email. If somebody got hold of your username and password, they could access your email. Since then, we've deployed multifactor everywhere. You need a username and password, and then you have to tap your phone or put in a code or something to get to your email. Well, that's caused the bad guys to up their game. 

We've recently seen a new type of attack where they will phish somebody and get them to give over their username and password, but also their cell phone number. They'll say it's some kind of form that they have to fill out. They'll pretend to be from the Yale helpdesk needing their information for some reason, and then they'll actually wait a week or two. What they'll do is use that username and password to start a login, and then use the phone number to send the person a text pretending to be from Yale, and convince them to hand over the little code that they need to get past the second factor. So, that's an example of how we're starting to see more sophisticated attacks. 

On the one hand, it's good, we're making them work harder, but we're also seeing that they're willing to put in the effort. Now with this explosion of these large language models in AI that can have a conversation that sounds a lot like a person, I worry that they won't even need to have an actual attacker having these conversations with people to convince them to hand over their codes. They'll actually just have AI bots doing it and we'll be back at dealing with this at a large scale. 

James Tucciarone: So, Jeremy, am I correct in thinking that these multifactor authentication attacks, these phishing attacks, lead to a security incident when we act on them? A question we get a lot is: what is a security incident, and what should we be looking out for? 

Jeremy Rosenberg: It's a great question. We talk about an event. We try to keep it nice and generic. An event is anything that happens that could be weird. So, the fact that you answered a phishing message and even if you clicked on a link, or even if you put in your username and password, that's an event. Whether or not it's a security incident depends on whether or not it results in some kind of unauthorized access to a Yale system or data. We look at three different things in security. We worry about the confidentiality of data, availability of systems, and integrity of information. If one of those three things is compromised, then we have a security incident. If somebody has read something or seen something they shouldn't have, then it's a compromise of confidentiality. If something becomes unavailable, then it's obviously an availability issue. If somebody changes something they shouldn't have, it's an integrity issue. So, those are how we define a security incident. 

Anybody can spot an event. Anybody can tell something funky is going on here. We need you to report it to the security team, so that a security engineer can take a look at it and they will determine whether or not it's defined as a security incident, and then we can run one of our specific plays. That's why it's really important to involve experts. And Yale has made big investments in having our team available to do these sorts of things, and everybody should be comfortable taking advantage of that. 

James Tucciarone: What are some of the common red flags that people might see if their data has been compromised? 

Jeremy Rosenberg: Good question. Sometimes it's subtle, and I don't want people to be alarmed; just because their computer is running slow doesn't mean they've been compromised. But a computer that's acting weird, that is way slower than it usually is, could mean that there's some kind of malware running on it. We have software that we can run to detect that. If you're using a Yale-managed computer, then they all come preloaded with a tool called CrowdStrike. I strongly recommend everybody get their computer managed by Yale, because this is a very powerful tool for not only detecting, but preventing your computer from getting compromised. 

I was dealing with one not too long ago where somebody's colleague said, “Hey, did you get that email I sent you?” And they couldn't see it anywhere. They're like, “No, I didn't.” “Oh, but you replied to it. I can't find it.” There was just something strange going on. They actually ended up sharing each other's screens. They could see that person B received an email from person A, and they could see it in person B's inbox, but there was nothing in the sent box of person A. I apologize if I'm getting this all confusing, but basically, the person who allegedly sent it couldn't see it in their sent box. It turns out that somebody had compromised his email and put in some weird email rules to delete emails as they're being sent, or to automatically move emails from this one person out of the inbox. They do this so that they can carry on a conversation with somebody pretending to be you, while you continue to use your email and don't know it. 

Anything that seems weird, like that, where you could have sworn you sent an email. At one point I believe this person actually saw an email come in from their colleague and it immediately disappeared because it takes a second for the email rule to kick in. So, stuff like that, you shouldn't hesitate to reach out to the security office if something seems out of the ordinary, it could be nothing and that's fine, and then we'll very quickly help you sort that out. 
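The hidden-rule trick Jeremy describes, where an attacker plants mailbox rules that delete or divert messages, can be sketched as a simple audit. This is a hypothetical illustration only: the rule format below is invented, not a real Outlook or Exchange API, and a real audit would pull rules from the mail server.

```python
# Hypothetical sketch: flag mailbox rules that look like the attack described
# above (auto-deleting or hiding mail to cover an attacker's tracks).
# The rule dictionaries are illustrative, not a real mail API.

SUSPICIOUS_ACTIONS = {"delete", "move_to_rss", "move_to_archive"}

def find_suspicious_rules(rules):
    """Return rules that delete or hide messages as they arrive or are sent."""
    flagged = []
    for rule in rules:
        hides_mail = rule.get("action") in SUSPICIOUS_ACTIONS
        # Rules scoped to a single correspondent are a classic sign of
        # business email compromise: the attacker hides one conversation.
        targets_one_sender = bool(rule.get("from_address"))
        if hides_mail and targets_one_sender:
            flagged.append(rule)
    return flagged

mailbox_rules = [
    {"name": "Newsletter cleanup", "action": "move_to_folder", "from_address": None},
    {"name": ".", "action": "delete", "from_address": "colleague@example.edu"},
]

for rule in find_suspicious_rules(mailbox_rules):
    print(f"Review rule {rule['name']!r}: {rule['action']} mail from {rule['from_address']}")
```

Attackers often give such rules an inconspicuous name like "." or a single space so they blend into the rule list, which is one more reason a periodic look at your own mailbox rules is worthwhile.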

James Tucciarone: It sounds like some really good advice is to just go with your gut. 

Jeremy Rosenberg: Yeah. 

James Tucciarone: Right, if something seems off, maybe think about it, maybe take a second look at it. 

Jeremy Rosenberg: Yeah, I mean, you live on your computer all day long, you know when something's out of the ordinary. 

Wendy Battles: Jeremy, I heard you say two other really important things. One of them is, when in doubt, reach out for help. That's a simple action people can take. So if they trust their gut feeling and something feels off, instead of just doing nothing, are you saying to take action and call your support provider or the helpdesk, whoever might be able to help you navigate that thing that seems a little strange? 

The other thing I heard you mention a little earlier was that if you don't have a managed workstation, you can request one; our managed workstations have CrowdStrike, which is this really powerful tool to help us. What would people need to do to get a managed workstation if they don't have one? Is that a pretty simple process? It seems like a very simple action that someone listening right now could take. 

Jeremy Rosenberg: This is not just at Yale; universities are interesting places. Some folks at Yale have specialized equipment requirements, and that means they have a lot more responsibility to maintain their computers to minimum security standards. Whereas if you have fairly standard computing requirements, sometimes people aren't aware that Yale offers a service through central ITS to provision them with a computer. It's not only that it runs this special software (and we're not selling the CrowdStrike product; it just happens to be the one we chose), they also maintain upgrades and encryption and things that you need. You can reach out to ITS. 

I'm sure you can go into our service catalog and find the service. They actually have a number of options, and they're adding new options all the time for what kinds of computers you can run. You can even get a virtual desktop, which is a computer that lives in the cloud; you just log in and use it remotely, and it is secure, but you only need it when you're accessing secure things. So, they have great options. Having a professional maintain your device, if you don't have any extraordinary needs, is a great way to avoid even getting into these situations. 

Wendy Battles: That's really great advice. We'll link to some of these resources in the show notes so people that are listening can easily access them if they're not already doing that. One of the things that you're talking about, it feels like, to me, is that reporting incidents is really everybody's responsibility, that we all have a role to play in that. Would you say that's accurate? 

Jeremy Rosenberg: I'd say it's part of your job if you work at Yale. We're a prestigious institution, people have high expectations for not only the education they'll receive, the resources they have access to do their work, but also that their data will be taken care of. We are all stewards of the resources that we manage at Yale, and making sure that you understand what your responsibilities are is important, and this is one of them. We can only do so much centrally. We have to find a balance at Yale between providing the freedom that people need to maintain their academic pursuits and being able to keep the place secure from attackers. The only way we can do that is to rely on people to be partners in this. And so, really, everybody is an extension of the security team in that way, and so we have to act as a team. 

James Tucciarone: Jeremy, I love that you talk about this idea of partnership and that reporting security incidents is really part of our job, because I think there is often this stigma around reporting security incidents, that we might get in trouble. Or people are worried about what the repercussions would be if they maybe did something wrong, or did something that caused the security incident, and are afraid to report it. What would you say to somebody who might be a little hesitant or nervous to report a security incident? 

Jeremy Rosenberg: Yeah, I don't know if people are scared or embarrassed or if they just don't realize it's important, but people shouldn't be scared. I mean, once you're at the point that something's gone sideways, it's already happened. You can't fix it. I have seen people make it worse trying to cover their tracks and stuff, and that just ends up ending badly. Like I say, it's part of our jobs, you can't get in trouble for doing your job. You're being a good steward of Dell's-- Dell [chuckles]. Yale's data and systems. Dell's data and systems too. I mean, Dell's has every right to protect the data too. [Wendy laughs] But you can't be scared to do your job, and this is part of your job. 

I think there was probably a time when this was not as well understood and it can be embarrassing. I mean, it's not fun to admit that you clicked on a link and gave your password out to the wrong person, but it happens to the best of us. I'm not reliving my own shame, but go listen to Season 1, Episode 3 to hear my tale of shame. You have to understand the people who are perpetrating these crimes against us, it's all they do. They're professionals. There's no expectation that you, who have a job that you're working hard at should be able to recognize it before it happens. It's just, what can you do to help us mitigate the damage once it's happened. 

Wendy Battles: Now, that's great advice. Thinking of it from that perspective, that it's just something important we should do and not be so worried about that if people have that in the back of their minds, which some might. So, Jeremy, I'd love to dig a little deeper into this idea of security incidents. You define them for us and what an incident might look like. Some of the different manifestations of that. I'd love for you to share with the audience when we should consider an incident to be urgent versus non-urgent. What are those things that we should drop everything and do something about this immediately versus not? 

Jeremy Rosenberg: I would actually step back and say that the important thing is that you're going to need a security expert to make that determination. So, I would encourage people not to try and look at this and go, “Well, this is urgent. This is not urgent.” We're going to look at the sensitivity of the data and we're going to triage it accordingly. But let a security expert make that decision because you'd be surprised the things that you don't think of. You're like, “Well, somebody has access to this folder. Big deal.” Our security engineers are trained to think like the attackers and think about what could they actually do with that access that they have. I'm going to sidestep your question and say, don't worry about whether or not it's urgent or not. Just assume it is and we'll let you know otherwise. 

Wendy Battles: I think that makes so much sense, and I appreciate you sharing that distinction. Plus, you also pointed out something that's happening behind the scenes: there's a whole team of people in the information security office, and part of their job is to research these incidents, to look at them, to analyze what's going on, and to make that determination. So it's not just all these tools we have, which are an important part of this protection, but also the actual people in the office doing this work to try to understand what's going on. 

Jeremy Rosenberg: Yeah, and they're forever refining their processes and deploying new technologies. Right now, when you get a phish and report it by clicking the phishing button in the email, there's actually an analyst who looks at it and does an assessment. We're implementing automated tools to do some pre-analysis so that we can start to get people responses almost immediately, within a minute, that say, “We've done these assessments on it. Thank you.” We're not going to say it's not a phish. We're going to say, “This is a harmless link. Don't worry about it.” That'll very quickly filter things out so that our analysts can look at only the ones the automated tools believe could be dangerous. So, we're constantly refining that. Every time you send us a report of some kind, it goes into helping us improve those systems. So, even if it's a nothing burger, it still helps the team get better at what they do. 

Wendy Battles: I think it's such an important point too again for the community when we're not sure to report those things because sometimes it's hard to tell. Sometimes it might be really obvious this seems malicious, other times we don't know. So, I appreciate that people reporting it. And I like the automation that's coming because I also think it's helpful for the community to be able to get a response really quickly to know, “Oh, this was nothing, but thank you very much for reporting this,” because I think it helps to reinforce that behavior, that it's important to reinforce things even if we're not sure, or maybe especially when we're not sure. 

James Tucciarone: Exactly. 

Wendy Battles: I think a great corollary to this is that it's great when people report things, and we're encouraging people to report even when they're not sure. What happens if we don't report suspected incidents? What are the ramifications? 

Jeremy Rosenberg: There are a few different ways that can become problematic. First of all, if there is some kind of a compromise, when somebody compromises a computer or an account, they do what we call a pivot. They're always trying to pivot to something more desirable, trying to escalate their privileges to something else. So, even if all they do is get hold of your computer, they're going to take control of it and use it to try and access the finance system. They're going to use it to try and access clinical data. So, ignoring a first sign of an attack can just allow it to grow. When our team gets involved, we're going to do an analysis, but the first thing we're going to do is contain the incident, and make sure that whatever system was compromised, it doesn't go beyond that. If you don't let us know about it, it can really start to spread before we can get a hold of it. 

The other problem is there's a legal obligation if certain types of data are exposed. This is a good thing; the government is taking this very seriously. If there's student data involved, we need to notify the Department of Education. If there's patient HIPAA data involved, we have to notify Health and Human Services. We have to notify the State Attorney General under many different circumstances. The thing about those notifications is that they come with a time limit. So, for example, 72 hours after you become aware that data has been compromised, you have to notify whatever agency is involved. 

Well, that clock starts ticking when anybody at Yale knows about the compromise, not when the security team knows about the compromise. So, if you know about it and don't tell anybody for two days, you just burned 48 of our 72 hours. If we don't report on time, then there are significant fines. So, we're talking big money that it could cost Yale if we sit on an incident without addressing it. I don't mean to be melodramatic, but you never know where it's going to go. Those are some of the consequences of not notifying. 
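The ticking clock Jeremy describes can be made concrete with a little arithmetic. The sketch below is illustrative only; the 72-hour figure is the example from the conversation, and actual deadlines vary by regulation and data type.

```python
from datetime import datetime, timedelta, timezone

# Illustrative window; real notification deadlines depend on the regulation.
NOTIFICATION_WINDOW = timedelta(hours=72)

def hours_remaining(first_known_at: datetime, now: datetime) -> float:
    """Hours left to notify, measured from when *anyone* learned of the compromise."""
    deadline = first_known_at + NOTIFICATION_WINDOW
    return (deadline - now).total_seconds() / 3600

# A staff member noticed the compromise two days before telling security:
first_known = datetime(2023, 5, 1, 9, 0, tzinfo=timezone.utc)
reported = first_known + timedelta(hours=48)

print(f"{hours_remaining(first_known, reported):.0f} hours left")  # 24 hours left
```

The point of the sketch is the anchor: the clock starts at `first_known_at`, the moment anyone in the organization became aware, not at the moment the security team was told.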

James Tucciarone: So, Jeremy, I'm glad that you talked about some of these really high-risk data types or some of these high-risk elements of our data. When would you say that a security incident becomes a breach? 

Jeremy Rosenberg: When a lawyer says it's a breach. That's not the case everywhere, but at Yale, only the Office of General Counsel can declare a breach. So, a breach means that some kind of regulated data has been exfiltrated or exposed from the university: data that has some kind of regulatory requirement behind it. So, there's a law that says this data needs to be kept private. Once legal declares that something is a breach, then we have to treat it as such. That's when all of these very important legal timelines kick in. Now, what will happen is that if there is a major data breach, then there will likely be litigation. People will want to sue, potentially because their data was not properly cared for or something. This is a sort of worst-case scenario. When that happens, all of the communications around it will be discoverable. Any emails you sent to your friends could show up in court. 

So, that's why you want to let the security team know and then just keep the information to yourself while we do our investigation. We're not trying to hide anything. Yale has always been very good at following the laws. We, as an institution will own up to whatever security compromises we have because we know we're doing our best, just like we know you're doing your best. But we do need to follow the processes we have in place, and that's the most important thing. Frankly, that's what regulators want to see. That's what the people who entrust us with their data want to see, that we're doing our best and that we're following the steps that we set out for ourselves and we're making good on the commitments that we have promised. 

James Tucciarone: I picked up on that too, that you said "keeping quiet," and I'm sure there are practical and legal reasons to keep quiet. I'm curious, what can or can't we say to our coworkers or to other people if we feel that we are part of a security incident? 

Jeremy Rosenberg: It really depends on what your role is. If you're a person who's responsible for a system that may be experiencing a security compromise, it can be really awkward for you and I get that. We're going to ask you to just sort of make a blanket statement that your system is unavailable for unscheduled maintenance or something like that. It's going to be important that you not say anything about it being a security incident. People are going to get frustrated with you, like, “Why can't you keep your system up? What's taking you so long?” We're going to kind of need you to take one for the team and just hold that line. If it's your own personal device that we've taken away, then it'll probably be okay for you to include your supervisor in the conversation. 

You really should talk to us before telling anybody anything. But if you mention it to your colleague, and your colleague mentions it to somebody, then all of a sudden it gets out and the newspapers are calling, and that'll happen eventually. We have a very good public relations team who will handle the public relations for the incident, but we don't want to make their job any harder. That's why we say, try to keep a tight lip. You become an extension of the security team at that point, and the security team needs to be very focused on the investigation, on containment and remediation, not on bringing a whole bunch of extra people into the conversation. Pretend you're like a secret agent at that point. 

James Tucciarone: I love that. 

Wendy Battles: So, Jeremy, talking about this whole idea of reporting incidents, doing it promptly, really engaging our community to ultimately keep our data and systems safe and secure. I'd love to know if you could share an example of when someone reported an incident promptly, or perhaps didn't report it promptly, and it made a difference. Do you have any examples you could share about how impactful that is? 

Jeremy Rosenberg: I had an interesting incident once. It was actually at a different institution. We had several reports, actually, of accounts being compromised. So, people noticed that somebody was using their account and it wasn't them. They were seeing weird emails show up, and we had two or three of them, and they reported them right away. Our analysts were able to look at these accounts. The first thing they realized was that all the accounts getting compromised had names starting with A or B, which tipped them off that somebody might be going through the alphabet. They were able to use that to look at the authentication logs. 

What they noticed was that somebody was going in alphabetical order, trying all these different accounts and hitting every so often, getting one correct, which led them to believe that there was a list out there of usernames that this person was going by. And sure enough, they went to the dark web and found a leaked list of credentials. It wasn't from the organization I was at. It wasn't their list; it was an external website that had had their usernames and passwords compromised. Some of those usernames and passwords were the same ones being used at our institution. That's why there were only some hits in the list. Well, once we had that list, we were actually able to run a script and very quickly see which ones on the list would have worked if the attacker had gotten to them. And we quickly locked those accounts. 

So, as a result, we were able to prevent everybody from the letter F on down from getting compromised because A, B, and C were quick to report. I think that's a good example of where, because of the specific way this attack was happening, our detections did not catch that they were making all these attempts. They were getting technical about it, basically changing their IP address every four attempts or so, so it wasn't triggering our detections. But thanks to the community responding really quickly, we were able to get the information we needed to get ahead of that one. So, I was quite proud of the team that day and of the community for letting us know right away and giving us the information we needed. 
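A check like the one Jeremy describes, running a leaked credential list against local accounts to see which reused passwords would still work, might look roughly like this hypothetical sketch. Every username, password, and data structure here is invented for illustration; a real system would compare against salted hashes and lock accounts through its identity platform.

```python
import hashlib

def sha256(text: str) -> str:
    """Hash a password for comparison (real systems use salted hashes)."""
    return hashlib.sha256(text.encode()).hexdigest()

# Simulated local credential store: username -> password hash.
local_hashes = {
    "abbott": sha256("sunshine1"),
    "baker": sha256("correct-horse"),
    "chen": sha256("hunter2"),
}

# Simulated leaked list from an unrelated external breach.
leaked = [
    ("abbott", "sunshine1"),   # password reused locally -> at risk
    ("baker", "letmein"),      # different password -> safe
    ("chen", "hunter2"),       # password reused locally -> at risk
]

def accounts_to_lock(leaked_creds, local):
    """Return usernames whose leaked password also works locally."""
    return [
        user for user, pw in leaked_creds
        if user in local and local[user] == sha256(pw)
    ]

print(accounts_to_lock(leaked, local_hashes))  # ['abbott', 'chen']
```

The point of the sketch is the speed Jeremy describes: once the leaked list is in hand, the at-risk accounts fall out of a single pass, so they can be locked before the attacker reaches them.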

Wendy Battles: That really is a commercial for why we should report right away and why it's so important, because sometimes we don't know the extent of things; we don't know what it means. So, being able to inform people about this so that they can research it, I see how that makes a difference. I see how what is a relatively simple action, reporting something that seems unusual, can have a significant impact both for ourselves and also for other people.

Jeremy Rosenberg: Because you don't know who else is seeing this. That was the point of that one. The fact that we had four or five that were in the same pattern, that was the difference maker, because a single compromised account happens all the time, not so much since MFA, but at the time it was a big deal. But when we see a pattern, then we know. And so, that's the takeaway there. 
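The detection gap Jeremy mentions, where per-IP thresholds miss an attacker who rotates addresses every few attempts, can be illustrated with a hypothetical sketch that counts failed logins across all source IPs in a sliding time window. The thresholds, timestamps, and IP addresses are made up for illustration.

```python
from collections import deque

def sliding_window_alert(events, window_secs=60, threshold=10):
    """events: list of (timestamp, source_ip) failed logins, sorted by time.
    Return True if any window_secs span holds >= threshold failures,
    regardless of which IP each attempt came from."""
    window = deque()
    for ts, _ip in events:
        window.append(ts)
        # Drop failures that fell out of the time window.
        while window and ts - window[0] > window_secs:
            window.popleft()
        if len(window) >= threshold:
            return True
    return False

# Attacker rotates IPs every 4 attempts: per-IP counters stay low,
# but a global window still sees 12 failures in about half a minute.
attack = [(i * 3, f"198.51.100.{i // 4}") for i in range(12)]
print(sliding_window_alert(attack))  # True
```

The design point is simply which key the counter is grouped by: counting per IP lets a rotating attacker stay under every individual threshold, while counting across the whole population surfaces the pattern the analysts spotted by hand.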

James Tucciarone: So, Wendy, I don't know about you, but this has been incredibly insightful. I've definitely learned a lot from this. As we wrap up, Jeremy, I want to ask you one final question for our audience, and that is, what would you like the audience to take away from this? 

Jeremy Rosenberg: Well, obviously, I want you to notify the information security office when you believe there is a security incident going on. But I also want you to make sure that you provide as much detail as you can. Some of the important things to think about are: What time did this thing happen? What time did you notice it happened? What are you seeing? What kind of data is involved? Don't make a lot of assumptions about what is or isn't important. Just give us all the information you have and please try to stay available. Sometimes what will happen is we'll get a report with some of the information, and then we can't get hold of the person to follow up. So, I know we're all very busy, but just be available to us. 

And while I appreciate that this is an important message for Yale, this is information that is valuable for everybody, regardless of where you work: report security incidents to your information security office, or if you don't have one, to your IT department, whether you're at work or at school, even to your Internet service provider. Some big companies do a better job than others of responding, but you shouldn't be ashamed. You can really make a big difference for yourself and your family and for your colleagues by being proactive about this. So, thank you. 

Wendy Battles: That's really helpful. When people are like, “I need to report this,” what is the best way for people to report something, Jeremy, so that your team gets it? 

Jeremy Rosenberg: I mean, email is actually pretty good for us. We monitor information.security@yale.edu 24/7. There is a phone number that eludes me at the moment, but go to cybersecurity.yale.edu. There is a link to report an incident right there on the front page and that will take you to all the phone numbers and all the information you need to quickly report an incident or come running down the hall in 25 Science Park and grab me and we'll figure it out. 

Wendy Battles: [chuckles] I love it. Thank you. And all of that information that you just mentioned is in the show notes for easy access. So, we encourage you, of course, to report incidents as quickly as you can. 

Jeremy Rosenberg: Appreciate it. 

Wendy Battles: Jeremy, this has been so awesome. Thank you so much for joining us today, for sharing your wisdom and this very valuable information about what our community can do, to really expand their understanding of security incidents, why they're so important, why we need to take action quickly, and some of the ramifications if we don't. So, understanding that, I think will help all of us. One, be much more aware. And two, really be motivated to take action to change our behavior so that we are doing those things that will help ensure that we take the best care of our Yale data and systems. And, as you mentioned, also our personal information, when we're outside of work, being proactive like that does make a big difference. Thank you so much for your time today. 

Jeremy Rosenberg: Thank you. You two are always a pleasure to chat with, so I hope I get to come back for Season 3. 

[theme music]

James Tucciarone: Here's the buzz on malware. The term malware is a combination of the words malicious and software, and it's most often used to describe software that's harmful to a device, network or server. As an umbrella term, we can also break malware down into more specific terms that represent individual attack strategies. Let's consider a few examples. 

Spyware is a malicious program that collects data without the user's knowledge. A Trojan, appropriately named after the infamous Trojan Horse, is malicious software that masquerades as being legitimate and desirable. Keyloggers are programs designed to log a user's keystrokes. Returning to our two earlier examples, ransomware is a type of malware that encrypts and effectively locks our data or devices, with the scammer ultimately demanding a ransom. A computer virus is software designed to infect other applications on the device, typically to some destructive end. And there's even more. 

The good news is, while it might be useful and maybe even interesting to know the differences between all these terms, we really don't need to know or remember any of them. Knowing what malware is, malicious software of any kind, and how to avoid it is what's really important. So, how do we avoid it? 

Some of the recommendations we've definitely heard before: click with caution, be wary of unexpected emails, and never click suspicious links or open suspicious attachments. Use secure passwords, don't reuse passwords across accounts, and use multifactor authentication where possible. And apply updates: lots of malware is designed to take advantage of known security gaps that are sealed up by these updates to systems and software. 

More generally, it's a good idea to be vigilant in paying attention and making sure we're only downloading legitimate software and apps, and we should immediately report any suspected malware on devices with Yale data to the Information Security Office. Here's one more good idea: keep listening to the Bee Cyber Fit podcast, where we simplify cybersecurity and help you to be aware, to be prepared, and to be cyber safe. 

Wendy Battles: Well, this has been a really interesting, impactful episode, James. One thing I came away with was the idea that we know our work best. We know if something seems off; we're in our systems all day, we're using our laptops or devices, and if something seems awry, we are the ones that would know about that more than anybody else. Talking about this idea of being suspicious if something doesn't feel right, and trusting our gut, just like we trust our gut with other things we might do. That really resonated with me: I am the person, I have to be responsible, I know my work, and it's up to me to take action when something doesn't feel quite right. 

James Tucciarone: Wendy, I absolutely agree. And I think we are the ones who are most likely to recognize that something is wrong because we are the ones who know our work best. And that goes hand in hand with the biggest takeaway that I had from today, which is when we asked Jeremy why we shouldn't be scared or nervous about reporting, he said, "It's part of our job." Just like we said, we know our jobs best; that's why it's part of our job to make sure we're reporting things that seem off, because we're the ones most likely to see them. 

Wendy Battles: Absolutely. So, this idea of personal responsibility when it comes to cybersecurity is an ongoing theme, both recognizing that for ourselves, but clearly helping our community recognize that, too, helping them understand where they fit into the picture. I think that this conversation with Jeremy helped to elucidate that for me and I hope for other people as well. 

So, James, let's talk about three calls to action for our listeners based on today's episode. Number one, our Yale community received a message from our CISO, Jeremy, whom you heard today, about the kickoff of our Bee SAFE, Not Sorry campaign. We want to remind people about that. If you're listening but you didn't have a chance to read the message, we have linked to it in the show notes and encourage you to give it a read. Second, we want you to check out our Report an Incident page, which has lots more information about how to report incidents as well as defining our SAFE model. And finally, we have a fun puzzle for you. As a simple yet interesting way to build your knowledge about reporting, we've put together our Bee SAFE, Not Sorry puzzle. It's a cryptogram. 

When you complete that puzzle, you can be entered to win a Yale Cybersecurity Awareness prize pack. We've got some really cool stuff to give away. So, we encourage you to complete this puzzle; you have between now and May 31 to do so. So, ample time for the rest of this month. 

James Tucciarone: Wendy, I love that all of those things are simple actions that people can take to keep building their cyber muscle. 

And that's all the time we have for today. So, until next time, I'm James Tucciarone, and I'm here with Wendy Battles. And we'd like to thank everyone who helps make this podcast possible. We'd like to thank Yale University, where this podcast is produced and recorded. 

Wendy Battles: And remember, everyone, it only takes simple steps to Bee Cyber Fit. 

[Transcript provided by SpeechDocs Podcast Transcription]