Cybersecurity Mentors Podcast

Lessons Learned From the Australian National University Breach with Suthagar Seevaratnam - Part 1

Cybersecurity Mentors Season 5 Episode 3

In this episode of the Cybersecurity Mentors Podcast, Suthagar Seevaratnam, a former CISO at the Australian National University, shares his journey into cybersecurity and the challenges faced during a significant data breach. He discusses the importance of addressing organizational trauma, effective leadership during crises, and the human element in cybersecurity. The conversation delves into the details of the breach, including the attack vector, the role of phishing, and the impact of legacy systems. Suthagar emphasizes the need for calmness, compassion, and effective communication in crisis management, highlighting the lessons learned from the incident.

ANU Breach Report

Suthagar Seevaratnam’s LinkedIn 

ACILearning.com/SimplyCyber



Suthagar:

John, I will never, ever, ever forget that day. It was probably one of the most challenging days of my life, certainly in my professional career. I rang the Vice-Chancellor, Brian Schmidt. I rang his office and I said, I really need to talk to him immediately. And I got a slot in there pretty much within the hour. And I remember walking into his office and sitting down and saying, Vice-Chancellor, this is the worst conversation you and I are ever gonna have.

Speaker:

You could teach me? First learn stand, then learn fly. Nature rule, Daniel-san, not mine.

Speaker 6:

I know what you're trying to do. I'm trying to free your mind, Neo. But I can only show you the door. You're the one that has to walk through it. What is the most inspiring thing I ever said to you? Don't be an idiot. Changed my life.

Speaker 5:

We aren't here to waste your time with buzzwords. In IT and cybersecurity, what you know and what you can do makes all the difference. We are ACI Learning. Training built for novices and pros who need measurable results. Hands-on labs, real-world checks, courses that get you certified and ready for what's next. Build confidence, strengthen defenses, achieve more. Visit ACILearning.com/SimplyCyber to learn more.

John:

Welcome to the Cybersecurity Mentors Podcast. In this episode, we have a special guest with us, Suthagar Seevaratnam. I think I said that right. Close enough. He is a former CISO of the Australian National University, and we're going to talk about a very interesting data breach that happened during his time there. They did a great job with the after-action report they composed, and I think most organizations should think about this, because at most places you don't get or understand the details of what happened. But first, let me give him a chance to tell us a little bit about his background, what got him into cybersecurity, and a little of his story.

Suthagar:

Thank you, John, and it's really lovely to be here. By way of background, believe it or not, I didn't start anywhere near cyber, and where you start is not where you end. I started off as an economist. My whole jam was gonna be finance and economics. So I'm a very, very long way from there. But I always had this underlying passion for tech and the sort of transformative components of tech, so it was always there as a personal passion. And it wasn't until I got to work with the Australian Signals Directorate, at the Department of Defence at the time here in Australia, that everything really ignited for me. There was a connection between all the technology things I was doing and the security things I'd been doing, and they just gelled. And I was incredibly fortunate to be around people who had the same passion and the same drive. And I realized that this was probably a more natural home for me, and it was something I was very passionate about because it's really about protecting people. I realized technology could be a force for good, and it could be a force for doing harm. And I was particularly inspired by the origins of the word cybersecurity. Cyber comes from the Greek kybernetes; it means to steer or guide. And security, from the Latin securus, means freedom from care and anxiety. And I thought, wow, this whole job is about guiding someone to a place where they are free from care, so that they can do amazing things. And that was such a calling to me. So I ended up being the CISO at the Australian Bureau of Meteorology, which is an incredibly complex network, and they had just had a cyber breach of their own.
And I didn't actually join as the CISO; I joined as the general manager for IT operations, but I quickly dove into becoming their CISO. And that was my first real taste of a senior position in cybersecurity. I'd done a lot of tactical-type things before that, but this was the first time I could see the real big picture, how all the pieces came together, how the technology came together. But more importantly, more than anything else, it taught me it's about people and how the people come together. It is always a people-centered narrative for cyber. And so again, that was another thing that really resonated with me. So from there it was just a natural career of, I don't want to sound like I was following crises around or anything, but for a brief moment I thought, hmm, maybe I've done enough security, maybe I've done enough for king and country, and maybe I should just go back to doing technology. And I was all signed up to be a CIO somewhere. And the Australian Cyber Security Centre rang and said, actually, we'd like you to go to ANU. We think they need some help. And I thought, really? Student hackers? Bit of overkill, isn't it? But I was wrong. I was very wrong. And it didn't take very long before events unfolded. It was only about six months after I got there.

John:

You were there, you were their first CISO, right?

Suthagar:

I was their inaugural CISO, yeah. I had never worked at a university before; I'd never even stepped foot on a campus in such a fashion since finishing my degree. So it was interesting. For the first few weeks I wasn't entirely sure what I was doing, but a natural rhythm eventuated, and the rest is history, as they say. But I'm happy to dive into that.

John:

Yeah. Well, let me ask you this. As their first CISO, how did they receive you? Were they happy you were there? Or was it like, oh no, there's a new sheriff in town? How was that?

Suthagar:

Yeah. It was an interesting mood, because I came on the back of a breach they'd already had. And I must have been the only person in the country that didn't know about it. I was so fixated on the job I was doing back at the bureau that I didn't really pay a lot of attention; an offer was made to go there, but I didn't have a lot of the background on what was happening until after I got there. And it was a very somber mood. Really, really somber. People weren't talking about the breach. That was the first thing that really hit me. Nobody wanted to talk about it. There was this sort of veil of secrecy; it wasn't being allowed to breathe. And, you know, I spoke to some colleagues who were involved in that incident, and I said, is there any reason we can't talk about this? And they said, why do you want to talk about it? And I said, well, breaches are like an act of organizational trauma. And to heal from trauma, you need to talk. You can't sit there in silence. You need to talk about it, you need to let it air, and you need to ask questions, and maybe they take you to some uncomfortable places. And maybe there are some things you can't answer for good reasons. But in the main, you should acknowledge that it happened and work out where you can go from there. And I was in the process of getting that sorted. I was in the process of saying, we are going to talk about this, we're going to open this up, and we're going to have a conversation about this. I was writing up a process to do that for the first breach. And wherever I go, the first thing I try to do is baseline the environment. I want to know what I'm dealing with. I had no staff at the time, so I brought in a couple of external companies to just do a baseline for me, like a baseline threat hunt.
Tell me what I'm dealing with here so that I can understand what to do next and what strategies I need to put in place. They'd done some audits, they'd done some post-action reviews and all the rest of that. And they were going through very fundamental things, I shouldn't say basic, but fundamental things that should be done: putting in controls or compensating measures and a whole bunch of things that really needed to be put in place. And they were making some progress against that. But I wanted to know, in real time, what was I dealing with and where did my first, second, and third priorities need to be.

John:

Yeah.

Suthagar:

Well, as it so happened, during that threat hunt, they found the second breach, which was not what I was expecting. So I kind of shelved the ideas about talking about the first breach and leaned very heavily into what was unfolding. And I will never, John, I will never, ever, ever forget that day. It was probably one of the most challenging days of my life, certainly in my professional career. I rang the Vice-Chancellor, Brian Schmidt. I rang his office and I said, I really need to talk to him immediately. And I got a slot in there pretty much within the hour. And I remember walking into his office and sitting down and saying, Vice-Chancellor, this is the worst conversation you and I are ever gonna have. And I remember opening with those lines. And he's looking at me, and I said, well, we've discovered another breach, and it's bigger than the first one. And I think they're still here. And I said, look, we've split the team into two. One is going to go and look at how they got in, and the other one is going to look at where they were going and what they're currently doing. We had enough people on the ground to do that. So that's what we did. We started.

John:

Sorry, was that external folks? You just had the external people you had brought in?

Suthagar:

Yeah, just to do the threat hunting. And we brought them in. They had the expertise, and they had some good reach-back to the US as well, so they brought in a couple of people to help. And I had a couple of people seconded from the IT area as well, because we needed people who knew the local environment. I called in just about everyone we could lay our hands on, just to start processing this out and managing it as an investigatory exercise. It wasn't even an incident response; it was trying to work out, what are we dealing with here? But I remember also leaving, and this is really the human story that isn't in the report. I remember leaving Brian's office, and I was walking past the main library building, and I just needed to clear my head. And it was around mid-semester exams, so there were a lot of students around, looking really, really stressed out, as students do, and you could see all the flurry of activity. And these are young adults, for the most part. And it hit me viscerally: who would want to attack these innocent people? You know, nation states sort of have a go at each other. That's been so since time immemorial, right?

John:

Yeah.

Suthagar:

And the threat actors and cyber criminals will go after targets of opportunity and targets that give them revenue. But here you've got students. Here you've got people who are just trying to make their lives better. ANU, like many universities, had a really strong international cohort as well, people who'd come from all over the world. And it sort of struck me: who would attack these people? It was sickening to me. It just felt sick, and I almost started to feel ill. I was almost on my knees because I was short of breath, thinking, who would do this? Who could do this? And I had to lean against a tree just to keep myself from falling, because the enormity of it hit me.

John:

Did you know how much at that time? I mean, it sounds like you knew how serious it was, but you didn't know the scope or the impact yet. But you had a feeling.

Suthagar:

So there were some initial indications, and I want to stress they turned out to be false in the end. It was a bit of a false lead. But there were certain indications that some of the people who were affected were vulnerable students, you know, students with possible medical issues or financial issues or things of that nature. And so they were the most vulnerable among us. And to me, that's like kicking a puppy, right? It was awful. It just felt so vile that anyone would do this. Now, we know subsequently that that wasn't true. There was an account associated with that area that was one of many that were compromised, but it wasn't unique to that area, nor were they targeting that specific area. But at the time we didn't know very much. It was still the fog of war, very, very early days, and it's natural to go through a million what-if scenarios, right? You've got to let yourself do that and then let it go, because it's idle speculation.

John:

Yeah. Well, let me ask you this. With the vice-chancellor, with Brian, what was his response? You know, obviously it had to be a shock.

Suthagar:

No, no, okay. I was watching him really intently. We hadn't known each other that long by that stage, like six months. And look, if you spoke to him for ten minutes, you would realize he's such a self-effacing, humble human being, this incredible intellect, but at the same time he carries it very lightly. And he just listened more than anything else. He listened, he asked some questions about what I knew and what I didn't know. He didn't seem shocked to me. I think if anything, the emotion that I saw on his face was sadness. He's a really compassionate and empathetic person, and what I saw was sadness. Later on I saw a lot of anger from him towards the threat actors; he was really outraged. But at that time, I think he was just really sad for what had happened. And he tries to always put on a very professional face as well, especially when you lead with the words, this is the worst conversation we're ever gonna have. He took it in his stride. He wanted to know how I could be supported. He said, you tell me what you need; it's there for you, and we will do whatever you need us to do. And I said, right now I need to ground everything in truth. I need to find out what is and what isn't. There is no point speculating about this sort of stuff. We need facts, as much as we can get them. We're not gonna get all the facts, but we're gonna make some educated calls, and then we're gonna move forward into an incident response, and then we're gonna move into compromise recovery, and then we're gonna make sure that they're evicted and that we're secure.
That's what we're gonna do, so that I can turn around to you in however long and say we have regained control of our network and we're satisfied that we've removed that threat. And then we can do the postmortem on what happened and how it happened and all the rest of it. He understood that really well, and he marshalled people together to support me at a senior level. And here's the other moment that really hit me. I went around briefing all the seniors, the ones that Brian had nominated to be part of this, and they pulled together their crisis management team. I briefed them, and we were giving regular updates about what was happening, what we were doing, how we were going to do it. And I remember the Deputy Vice-Chancellor (Academic), Grady Venville. She and I were walking out of one of these meetings. It was fairly late, probably early evening. And she turned around to me and said, just off the cuff, it feels different this time. And I said, well, I wasn't here last time, so what do you mean? What's different about it? I was kind of expecting something speculative about, you know, maybe the motivations or something. And what she said was beautiful. She said, it feels like we're in control this time. And I said, that's the way it needs to be. We control what happens from here on in. This is something that has happened to us; we are the victim. But we are in control and we will regain control, and whether it's the external comms, whether it's how we message internally to our staff and to our students, how we handle the incident, this is in our control. The threat actor doesn't get to dictate that. So, you know, you're right.
We're gonna be measured, we're not gonna panic, we're gonna do this methodically, and we're gonna maintain focus at every stage. There's a lot of frenetic activity under the hood, but as leaders, let's just stay really focused. And Grady is one of those incredibly unflappable, graceful human beings. When she said that to me, I thought, yeah, that's good. So they're the two moments that really stick out for me in that whole chain of events.

John:

That is good. Those are great. And we've talked about on our podcast before about leadership under duress and stress. And you had previous experience, where you'd been through kind of the aftermath of another breach. But what do you think helped you be able to handle it? I've been through these situations, not at the scale that you have, but you definitely have that cold-sweats moment of, okay, this is for real. And just being able to be calm, steady, measured, controlled, but also empathetic, those kinds of things. Are there things that helped you be that way in that situation?

Suthagar:

Yeah. I remember going to RSA, you know, the RSA conference, and I went to one of their CISO boot camps, and I remember a CISO from a very large American firm saying, one thing you've got to remind yourself of all the time is you will have bad days. In this job, you will have bad days. And I still have this notebook somewhere on my bookshelf where I've written in really big black texta, you will have bad days. And then I crossed days out and put in weeks, months, possibly years. And I remember looking at that so many times, thinking, this is one of those bad days, possibly months, right? And there's no point in panicking; it doesn't serve you well. You have to remain calm, you have to remain collected, because there are people who are going to throw emotion at you.

John:

Yeah.

Suthagar:

Right. As soon as we turn around to our stakeholders, our students, our staff, they're gonna be angry. They're gonna be confused, they're gonna be upset. There's no point in you taking on or reflecting any of those emotions as your own. You have to be the calm center. And the people around you have to be the calm center. It's really, really important that you have that gravity around you, that very quiet dignity: this is something bad that's happened, but we're gonna deal with it. And we're gonna get through it. Some way, we are going to get through this, and we're gonna get through it together. And we're never gonna lose our compassion, because that's what differentiates us from the threat actor. And it doesn't matter who they are; in my humble opinion, they are cowards. They're bullies and they're cowards, and they hide in shadow. We work in daylight. And that's the difference between us and them. And when you work in daylight, you have to be seen as someone who's guiding, as I mentioned before, guiding someone to a place free from care. We will get there. This has happened to us, no one asked for it, and it's nothing that we did to deserve this. We're still the victim. There's a great saying: weakness is not an excuse to be attacked. And we're not weak, but somebody did attack us, and now we need to reclaim what we lost. And you don't do that by panicking or by showing anything other than compassion. That was a firm belief. And you've got to hold that for yourself. You've got to be compassionate to yourself and to the team, because they're doing the best they can. They're gonna miss things. We definitely missed things. There's gonna be plenty of scope for miscommunication, because you're in that sort of modality.
There's plenty of misinformation and misdirection because of the nature of the threat actor. When all of that is going on: calm, level, reset when you need to reset. Make sure that you've got the energy management of the team, really important, the team welfare. Look, I've been in a couple of other cyber incidents, not as major as that one, and the most I could do was buy them pizzas, make sure they got home all right afterwards. That's my value add, because they know what they're doing, and your job is to make sure that they're enabled. So that's the sort of mentality you have to build. That's the muscle memory you have to build. Anyone in my family would tell you I'm not like that in any other walk of my life, but when you're in that mode, you switch, and you have to be that calm, steady voice.

John:

Yeah, excellent. So from there, you're diving in, you're finding out, okay, this is bigger, and it's probably like pulling back the layers: oh, there's more, oh, we've got these pieces. So if you want to talk about that a little bit, of how you're starting to connect the dots, and, like you said, there's stuff you're gonna miss, but you're also understanding pivot points. And bringing that back, I'm really curious about the email that was sent. That stood out to me as interesting. I have a hunch on what it may have been related to, but I'll let you explain it. So start from kind of peeling back the layers, if you could, back to, oh, this came from an email.

Suthagar:

Yeah, so I remember us having a whiteboard in my office, and we were mapping everything out on this whiteboard, right? And because of the network forensics, in terms of following the breadcrumbs that they had left, we discovered that all of this started with a phishing email. It was one of the places that we looked. You know, mail servers are just a natural place. It's a vector; we know this. A good hunch paid off. It's one of those places you should look when you've got a lot of ambiguity, right? In a lot of breaches, you know what's happening, you know the vector, and it's sort of contained. This was not like that. This was happening three or four months after they'd gotten in. And so we weren't sure just where to start looking and what to start looking for. So we followed the breadcrumbs. Sometimes a lead was false; we went back, started looking at a few other things, and correlated different pieces of information. At that time, the university didn't have all the things that it has today. We didn't have any of the sort of coverage that the university has today, since all the investment that was put in there. It was a very, very different environment, very decentralized. Some of the research schools had their own little IT bubbles. And so it took us a long time to map out an environment that we were not familiar with, I certainly wasn't, using local knowledge to say, well, let's look here, let's find a breadcrumb here, let's look over here. And we slowly stitched that picture together. And once we'd gotten the signature, I guess that's the wrong word, let's say the footprint that the threat actor was using, it made it a little easier for us to follow them around.
And we were able to map all of that out, which you see in the report. But one of those tendrils led to an email. And it was the email of someone fairly senior. I won't say whom; it's not fair to them. And it was an email that was forwarded to that person, so they weren't even the original recipient of that email. It was forwarded, and then somebody else had deleted it. They didn't open it or anything. They just thought it should go to this person, and then they deleted it. But that person had an EA. And all they were doing was previewing those emails. But when you preview and it renders, it's enough to trigger the code. People made a lot of hoo-ha about the fact that it didn't get clicked on, and they all jumped to, oh, it's some kind of new vector that doesn't require you to click on an email. Well, not really. If you preview the email, it doesn't register as being clicked. It doesn't register as being opened. It just registers as being previewed. But it was enough. It was enough to render the malicious code that was embedded into that email and let it do what it needed to do. That's all there was to it. And at the time, we didn't have some of the email controls for macros. We didn't have macro blocking at the time. Somewhere in the past, someone had turned off macro blocking on mail. And that's all you needed. There wasn't anything particularly unusual about it. That's why it looked like it hadn't been clicked. And that's all I was trying to say in the report. In hindsight, I probably could have written that a lot better. I got a lot of questions on it.

Speaker 7:

Sure.

Suthagar:

But all it was, it wasn't some weird new thing. It was, you know, hover over the email, it renders, it opens. And that's all that was needed.
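For readers curious what that preview-time vector looks like in practice, here is a minimal, hypothetical Python sketch of the kind of check a mail gateway could run: it parses a message with the standard `email` library and flags HTML bodies that carry content executed or fetched at render time (inline script, remote resources, UNC paths that would trigger an SMB callback). The patterns, function name, and sample message are illustrative only; this is not ANU's tooling or the actual malicious email.

```python
# Hedged sketch (not ANU's actual tooling): flag emails whose HTML body
# carries active or remote content that could fire on *preview* alone,
# since preview renders the body without registering a click or an open.
import re
from email import message_from_string
from email.message import Message

# Illustrative patterns for content that executes or fetches at render
# time; a real gateway rule set would be far more thorough.
RENDER_TIME_RISKS = [
    re.compile(r"<script\b", re.I),                  # inline script
    re.compile(r"\bsrc\s*=\s*[\"']https?:", re.I),   # remote resource fetch
    re.compile(r"\bsrc\s*=\s*[\"']\\\\", re.I),      # UNC path -> SMB callback
]

def preview_risks(raw_email: str) -> list[str]:
    """Return the risky snippets found in any HTML part of the message."""
    msg: Message = message_from_string(raw_email)
    hits: list[str] = []
    for part in msg.walk():
        if part.get_content_type() == "text/html":
            body = part.get_payload()
            for pat in RENDER_TIME_RISKS:
                hits += [m.group(0) for m in pat.finditer(body)]
    return hits

# Hypothetical message whose HTML body references a UNC path, the kind
# of thing that renders (and phones out) on preview alone.
sample = (
    "From: a@example.edu\r\nTo: b@example.edu\r\n"
    "Subject: Invoice\r\nContent-Type: text/html\r\n\r\n"
    '<html><body><img src="\\\\203.0.113.7\\share\\p.png">Hi</body></html>'
)
print(preview_risks(sample))
```

The point of the sketch is the one Suthagar makes: nothing here needs a click. Scanning for render-time content on the way in, and blocking macros and remote rendering in the client, closes the gap regardless of whether a user ever "opens" the message.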

John:

And not even an attachment in the email? It was just the email itself, right?

Suthagar:

Yeah. From recollection, I think there was a Word doc that also had malicious stuff, but the code was actually embedded into the body of the email. So as soon as it rendered itself, it started. That's all that was required. And the techniques are out in the open now, right? There's a really good write-up of our attack by Dr. Richard Gold, who was working at Digital Shadows at the time. He mapped our entire attack chain against MITRE ATT&CK and explained how it happened. And I couldn't thank him enough for it, because we couldn't do that. Not because we lacked the skill or anything; we were writing for a very different audience. We were writing for a bunch of students and staff members who were not cyber people. We weren't writing for the general public or for cyber researchers. We were writing for our stakeholders and our community. And we wanted to make it readable to them, and we wanted to make it easy to understand, because they had every right to know what had happened to their data. So it was written in that way. I assure you, the first drafts of that thing were uber technical, and we massaged it, massaged it, massaged it, until I remember the head of HR at the time, she read it and said, hey, I understood that. Score, excellent. That's right. And so we were writing for that audience, and Richard took that and reversed it out and mapped it against MITRE. And he said, is this what happened? And I said, well, you got 98% of it. There's only a tiny bit there that we can't talk about, for privacy reasons and because we don't want copycat attacks or anything. But you got it. You got it in one.
And after that, I should explain how that report came about, but before I do, that's what happened. That's what happened with the email. There wasn't much more to it than that. It was a very innocent thing; somebody was doing their job. The controls should have been there to block it. They weren't there at the time; I'm sure they're there now. There's nothing unusual about it. It didn't require a click, didn't require anything else; it was enough. And one of the things it did when it rendered: one of the servers inside the environment had been previously compromised, a web server, and they'd already put some of their tooling down on it. So when that email opened up, all it was doing was calling back to something that was already on campus. As far as anything was concerned, it was a trusted source; it was internal. That's all it was doing. So there was no magic to it, but I found the speculation quite fascinating, that people went in a very different direction. I hope that answers your question.
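Dr. Gold's write-up essentially did what Suthagar describes: reconstruct the chain of observed behaviours and tag each link with a MITRE ATT&CK technique. As a toy illustration of what that mapping looks like, here is a hedged Python sketch. The technique IDs are real ATT&CK entries, but pairing them with these specific events from the episode is an illustrative reading, not the official mapping from either the ANU report or Dr. Gold's analysis.

```python
# Toy model of mapping an observed attack chain onto MITRE ATT&CK
# techniques, the exercise Dr. Gold performed against the ANU report.
from dataclasses import dataclass

@dataclass
class ChainLink:
    event: str        # behaviour observed during the investigation
    attack_id: str    # MITRE ATT&CK technique ID (real IDs)
    technique: str    # technique name

# Illustrative pairing of episode events to techniques (my reading only).
CHAIN = [
    ChainLink("Parking server exploited from the internet", "T1190",
              "Exploit Public-Facing Application"),
    ChainLink("Tooling staged on the compromised web server", "T1105",
              "Ingress Tool Transfer"),
    ChainLink("Email rendered on preview, code fired", "T1566",
              "Phishing"),
    ChainLink("Outbound SMB leaked credentials", "T1187",
              "Forced Authentication"),
]

def render_chain(chain: list[ChainLink]) -> str:
    """Format the attack chain one step per line, earliest step first."""
    return "\n".join(f"{i + 1}. [{c.attack_id}] {c.technique}: {c.event}"
                     for i, c in enumerate(chain))

print(render_chain(CHAIN))
```

The value of this kind of table, as Suthagar notes, is that none of the individual links has to be novel: the analysis shows known techniques chained together, which is exactly what made the attack recognizable once mapped.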

John:

No, no, it does. The thing that I thought about was, and I couldn't remember what the timing was, there was an attack type where you could extract hashes, like NTLM hashes, through email. And they would go out through your perimeter if you were not restricting SMB, that kind of activity going out. But it sounds like you're saying they didn't even need to do that.

Suthagar:

They were sending it to an endpoint internally. There was a callback to an internal server that was already compromised, from a previous attempt. So we know they started much, much earlier than that email, but they'd only got so far. Then they used this email to do the main part of their attack. So there were many stages to this, and we had to forensically go back in history to find the first telltale signs. And I remember one of the very first things that happened when I was at ANU, well before this particular attack, was something I thought was innocuous at the time, but it turned out to be related. Someone had compromised the parking infringement server, the thing that handed out parking tickets.

John:

Ah, the notorious parking server.

Suthagar:

It's internet facing, right? And I thought, ugh, okay, we need to lock that down. To fix it, I'll have to take it offline for a bit. Nobody complained about me offlining it. But we now realize that was one of the first things they hit, because it was vulnerable and had some exploit against it, and they were able to park some of their tooling there. And clearly it had the ability to link up with the email attack. But it also did do that outbound SMB reach, because the firewalls at the time weren't blocking that. So they were able to do exactly what you just said.

John:

Yeah.

Suthagar:

So a lot of things were happening. They were trying a lot of different techniques. Apart from one, which was a little unique, the rest of it we'd seen elsewhere before. And that's why Richard's analysis was really good: he was able to say, well, hang on, we have seen these sorts of attacks in the wild, and all they're doing is chaining all of these things together.

John:

Yeah. Well, let me ask you about the spear phishing. I mean, it seemed like spear phishing, and there were multiple phases of spear phishing in the report. But at least the initial one, was it very targeted? You said it was forwarded, but how many emails were sent? Was it a lot?

Suthagar:

So what happened, and I have to be really careful here, because I don't want to disclose people's names or anything, what happened was it was sent to someone who was affiliated with a completely different organization.

John:

Okay.

Suthagar:

But you know what universities are like, right? There are people who have multiple affiliations; it's not like an employee-type relationship.

John:

Right.

Suthagar:

It was sent to a particular individual who had a rightful, normal connection with this other institution. And it was sent to them.

John:

Okay.

Suthagar:

Uh purporting to be from that other institution.

John:

Gotcha.

Suthagar:

Or talking about that other institution. And whatever was in the contents of that email, this guy just didn't read it. He just went, oh, yeah. And that's all it took.

John:

Okay.

Suthagar:

I don't think that threat actor was trying to target the person, or the office of the person, who ultimately opened that email.

Speaker 7:

Yeah.

Suthagar:

It was sheer dumb luck on their part that it ended up with that person. They were trying to target something else. And then I think they realized, oh, wait a minute, we've got an opportunity here; we can go further. Sure, I'm speculating. Or it could be that this was just the first wave of a much broader campaign that they didn't need to run in the end, because they got what they wanted.

unknown:

Right.

Suthagar:

It could have just been that. So I don't want to confuse correlation with causation; it would be very specious thinking to assume that. Because we saw other phishing emails afterwards, as they were trying to do credential harvesting. We saw that. So clearly phishing is a weapon of choice for this actor, and they were purporting to be anything and everything. We put a few example emails into the report; there's no thematic connection. They were just finding real emails, mimicking them, and weaponizing them. They were just looking for credentials. It was a spray. That's what they were trying to do at that stage.
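One cheap heuristic against the kind of spray Suthagar describes, where senders purport to be a familiar institution, is to flag messages whose display name name-drops a trusted organization while the message actually comes from an unrelated domain. A minimal sketch; the domain names and trusted set are made up for illustration:

```python
# Illustrative trusted-institution set; a real deployment would load
# this from mail-gateway configuration, not hard-code it.
TRUSTED = {"anu.edu.au", "partner.edu"}

def suspicious_sender(display_name: str, sender_domain: str) -> bool:
    """Flag a sender whose display name claims a trusted institution
    but whose actual sending domain is outside the trusted set."""
    claims_trusted = any(
        d.split(".")[0] in display_name.lower() for d in TRUSTED
    )
    return claims_trusted and sender_domain.lower() not in TRUSTED
```

This catches only the crudest mimicry (a real gateway would also check SPF/DKIM alignment and lookalike domains), but it illustrates why "purporting to be anything and everything" still leaves a detectable mismatch between claimed and actual identity.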

John:

Yeah. Okay. So as you kind of pieced this puzzle together: they got these credentials and they started trying to utilize them. Another thing that stood out in the report was them setting up base. There were a couple of attack stations that they set up. One technique of theirs I thought was good was setting up their tools on virtual machines that they'd spin up on top of the main server. That makes it difficult. And I don't even know, did that server have defensive or EDR-type tools? But even then, it's a good idea from their perspective and very difficult to detect if you're running a virtual OS on top of the main operating system.

Suthagar:

Yeah, if you don't have network access controls, or something signaturing new devices as they pop up onto your network, or any other sort of layered defense around that stuff, no, it's a very, very good technique. And it also makes it very disposable. You can spin up something that's got all of your tools already there at the same time as well. This particular one, I think it's called attack station A in the report.
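One form of the device signaturing Suthagar mentions is checking a newly seen MAC address's vendor prefix (OUI) against known hypervisor OUIs, since a VM spun up on a compromised host often exposes a virtualization vendor's MAC. A minimal sketch; the OUI set below is illustrative, not exhaustive:

```python
# Illustrative hypervisor OUIs (first three bytes of the MAC).
# A real deployment would pull these from the IEEE OUI registry.
VIRT_OUIS = {
    "00:50:56",  # VMware
    "00:0c:29",  # VMware
    "08:00:27",  # VirtualBox
    "00:15:5d",  # Microsoft Hyper-V
}

def looks_like_vm(mac: str) -> bool:
    """True if the MAC's vendor prefix matches a known hypervisor OUI."""
    return mac.lower()[:8] in VIRT_OUIS

# e.g. triaging MACs newly seen in ARP or DHCP logs:
seen = ["00:50:56:ab:cd:ef", "3c:22:fb:12:34:56"]
new_vms = [m for m in seen if looks_like_vm(m)]
```

It's a weak signal on its own (attackers can spoof MACs, and legitimate VMs are everywhere), which is why Suthagar frames it as one layer among several rather than a complete control.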

John:

Yeah.

Suthagar:

That particular attack station was a server, part of a project that was supposed to be decommissioned. It was a trial that was being run. I can't quite remember what it was about, but it had run its course, and they hadn't turned it off. It was lying fallow, and it was really out of date. It had just been forgotten about; they simply forgot to turn it off after the trial ended.

John:

Yeah.

Suthagar:

So it didn't have any EDR or endpoint protection or anything like that. It wasn't locked down in any way, shape, or form. It was a sandpit; it was designed to be a sandpit for that particular trial.

John:

Like you said, dumb luck. They found that system.

Suthagar:

Yeah, I guess if you're looking, it's an interesting hypothesis, particularly with universities: somewhere in there is legacy infrastructure that you could probably own, right? It's a good bet. Universities have a long legacy tail. They're built for research; they're not necessarily designed to be like corporate IT environments. They're designed for something completely different. I think we've moved beyond that now, to a far better set of models around this. But at the time, it was convenience over security. And it's still a challenge.

John:

It's a challenge for us too. We still find old legacy systems that we're trying to go find, lock down, turn off. Do you need this? Oh, it's for my research laser; it can only run on Windows whatever. And you're like, oh my goodness, okay, how do we isolate this thing? Right. It is a real challenge.

Suthagar:

I still remember explaining this to Brian originally, and he said, oh, you've got to understand, some of our network reaches back to the dawn of the internet. Some of the first DNS servers were here. So some of this belongs in a museum, is what you're telling us. Yeah, it's got historical value. But this particular thing was woefully out of date, because people had forgotten about it. And I remember there was this point in the attack cycle where the threat actor had to patch the operating system, because their tools and whatever exploit they were trying to run wouldn't run on that version of the OS. So they patched it, to get it up to a certain spec in order to run their tools and what have you. Moral of that story: even threat actors patch.

John:

They gotta keep their tools running.

Suthagar:

Yeah, you've got to keep it current. That's right. So it was, I think, again a fortuitous find for them. But you're right, what was really interesting is they made that their base of operations. Long before we coined the phrase living off the land, they really were living off the land. They were owning that box and spreading out from it.

Steve:

Wow, what a story. I think that's a good place to pause for now. We've only scratched the surface of what happened during the ANU breach and what it taught the cybersecurity community. In the next episode, we'll dive deeper with Suthagar into the lessons learned, how the team rebuilt trust, and what every security leader can take away from this experience. So make sure to subscribe wherever you listen to the podcast and check out the next episode for part two. Until next time. And a huge thank you to our sponsor for season five of the Cybersecurity Mentors Podcast, ACI Learning. You can check out ACI Learning at acilearning.com/simplycyber. Thank you for tuning in to today's episode of the Cybersecurity Mentors Podcast.

John:

Remember to subscribe to our podcast on your favorite platform so you get all the episodes. Join us next time as we continue to unlock the secrets of cybersecurity mentorship.

Speaker 1:

Do you have questions or topics you'd like us to cover? Or do you want to share your journey? Join us on Discord at Cybersecurity Mentors Podcast and follow us on LinkedIn. We'd love to hear from you.

John:

Until next time, I'm John Hoyt.

Steve:

And I'm Steve Higaretta.

John:

Thank you for listening.