Tech Insights with Alisha Christian

Phishing 2.0: When Misinformation Becomes a Cyber Weapon

Mercury IT


A single believable story can do more damage than a thousand dodgy emails. We sit down to untangle misinformation vs disinformation and why that difference matters when you’re trying to protect people, money and reputation. From clickbait headlines to “credible” reposts, we talk about how false information spreads across digital platforms in Australia, and why even well-meaning sharing can create real harm.

Then we get practical about cyber security. Think “phishing 2.0”: attackers soften the target with rumours, social proof and urgency, so the later link or attachment feels safe. We walk through scenarios like fake merger news, “highly confidential” PDFs that deliver malware, and the very real risk of deepfake video or audio being used to pressure finance teams into making transfers. We also dig into everyday traps like paid search ads that sit at the top of Google, QR codes that hand over account access, and AI-generated summaries that confidently repeat the same misinformation you were trying to fact-check.

The big takeaway is behavioural and process-driven: don’t trust by default, verify by design. We share clear steps like out-of-band confirmation, risk-based checks, and building a culture of healthy scepticism without turning work into a paranoia fest. 

If this helped, subscribe for more Tech Insights, share it with someone who approves payments or manages staff, and leave a review telling us what verification habit you’re adopting next.

Welcome And Key Definitions

Alisha Christian

Good morning, Martin. Good morning, Chris. Good morning. Welcome back to Tech Insights. Good to have you both back in the hot seat again. So today we're here to talk about disinformation and misinformation. To get started, Martin, would you like to give us an overview of the difference between the two?

SPEAKER_00

Well, basically the difference is that misinformation is incorrect information, but it's not malicious. It's just people getting things wrong. Disinformation, on the other hand, is deliberate misinformation: deliberately obfuscating information so that people don't understand what's going on. It's used quite often in political elections, and it's used a lot for criminal activity. There are a lot of uses for disinformation. Misinformation can be equally problematic, but it's not necessarily designed for harm.

Alisha Christian

Is it a bit more like Chinese whispers?

SPEAKER_00

Well, it's more that people want to believe. There's quite a piece around the internet especially, where a lot of information is shared across a lot of different platforms: social media, or even just browsing websites, where you'll see all those advertising grabs that come up these days, or on home pages like on Edge. And a lot of it is actually incorrect. It's designed to capture you in a clickbait type scenario, with a salacious headline to try and get you to click on it. And that can be a criminal type scenario where it's trying to get you to sign up for something, purchase something you didn't need, or steal your credit card details. There are a whole lot of reasons for doing that. But yeah, there's just a lot of information around.

Alisha Christian

There is definitely information overload, that's for sure. Have you noticed any trend with misinformation spreading on digital platforms across Australia? Obviously, you just touched on it a little bit.

Why False Info Is Rising

SPEAKER_00

Well, the trend is that it's increasing. That's the unfortunate trend: the amount of information going around is quite hectic at the moment. And one of the trends I'm a little bit scared about is the fact that those traditional media sources we used to rely on for journalistic integrity are quite often just re-posting or reiterating information they've seen online somewhere. So it's not even necessarily correct either.

Alisha Christian

Yeah, so they're not doing the fact-checking like they would have years ago, when they had no choice.

SPEAKER_00

Yeah, back in journalism school it was: you have to verify your source. You just don't see them doing it now. That's a scary trend for me.

SPEAKER_03

And then there's the cybersecurity perspective of where they can use disinformation, or misinformation for that matter. Misinformation will just be a happy coincidence, but they both act as a force multiplier for the effect they can have. So people are starting to term it "phishing 2.0". The idea is: if I can get some misinformation out there, or disinformation if they started it, because it's just about the intent, I can get people to believe something so they're more likely to click. For instance, "oh, everyone is doing this": that could be the misinformation going out so that you're more inclined to click or not question something. Or it could be misinformation spread within the organization by well-meaning staff going, hey, there's this security thing, and we have to paste this command in to protect our systems. Then you get the phishing email that tells you to do exactly that, and because you've already heard from a number of employees that that's what's going around, you're way more likely to do it. And then that becomes a huge problem. That's what I mean by a force multiplier. So instead of the threat actor's click rates being less than one percent because people are on guard, now they're getting a good 10% of people clicking through. That's pretty high.

Alisha Christian

So the definition, yeah, exactly.

SPEAKER_03

That's a random percentage. I'm just saying it's high. I shouldn't fact-check that. Yeah, do not fact-check that. Okay, all right. That's a piece of disinformation. Or is it misinformation?

Alisha Christian

Well, I was actually going to ask whether either of you have any real-life examples of misinformation or disinformation that spring to mind.

Misinformation As Phishing Fuel

SPEAKER_03

That is literally one that is going around. It's that kind of pre-work, right? It's softening the target before landing the actual payload, so to speak. You're getting people way more inclined to actually click on something, or do something, or hand over information. Think about it: if you're the threat actor, you design a piece of disinformation, in this case where the CEO has done an interview talking about a merger or acquisition that's coming up. You've heard nothing about it, but here's a credible report, and it's your CEO talking about this merger. Then an email goes out to all staff saying: this is highly confidential, please make sure you've read the instructions in the PDF, urgent. Who's not gonna read it at that point?

Alisha Christian

Yeah, that's true.

SPEAKER_03

So now they open the PDF and it's a piece of malware. You're done, and it's all because you've seen a fake something go around. It gives you that need to know: oh, what's going on? Are we gonna lose our jobs? People just click. And it's human nature; you actually can't blame them at that point. That's the problem: it's so believable it becomes a real issue.

SPEAKER_01

Wow.

SPEAKER_00

And fact-checking is hard too, because one of the real-life examples is the number of people trying to use AI to fact-check, and that in itself is actually a massive problem. Quite often what the AI is generating for you is just what it's found on the internet, on another page where it's been posted. So it could have found the exact same information you came across in the first place. But because it's now an AI search, or an AI pop-up on your Google page, people start to believe that the AI-generated information is correct. And really, one of the things we need to get across to a lot of people is that you cannot just believe AI-generated information at face value. It just isn't necessarily correct.

Alisha Christian

Okay, that is a problem. A lot of people are not educated about that fact.

SPEAKER_03

No, no, even though the warning is up there for almost any chatbot that you use. But I suppose in your Google search where you get the AI summary at the top, that doesn't have that same warning, so that could definitely be more of an issue.

SPEAKER_00

And that AI summary: the number of times I've seen that come up when I've been looking for something I need to find, and I've gone, that's actually wrong. Just straight-up wrong. They've found a site that's providing misinformation, regurgitated it as AI, and then people go, oh, well, that must be correct.

SPEAKER_03

It's kind of similar to when you're looking up something. Let's say it's WhatsApp, and you want to get WhatsApp onto your desktop. You've got it on your phone, but you want to keep answering people without going back to your phone, so you just do a Google search for "WhatsApp desktop". A threat actor has just paid for an ad on Google, so it comes up right at the top. You click on the first thing, and obviously they can make it look like WhatsApp pretty easily, and it's just a QR code. You scan the QR code and you've just handed access to your account over to an attacker. So it's very, very similar: just trusting that initial top search result is where the problem is.

Alisha Christian

So AI is obviously having a big impact on misinformation and disinformation. Are we seeing how cyber criminals are using it? I know you've just mentioned ad words and things like that, but in any other ways, like social engineering?

SPEAKER_03

So, like I said, creating a video. Generally this is gonna be targeted at a larger organization because it's gonna take some effort, right? If I've got to create a video of a CEO saying something specific that looks real... and it's very easy to do, by the way, and doesn't cost a lot of money, but it does take a little bit of effort to put together. If you want it to look good as an interview, you're gonna have to chuck it in Capricard and do a few things. It's not impossible, but it's gonna take some effort. So if the threat actor is gonna go through that effort, they're gonna be targeting probably a larger organization, or at least an organization they know is very cash-rich, so they can extract some sort of money, or where there's a weakness in a process somewhere that lets them get a payment through. Take the example with the CEO. The CEO is talking about some sort of merger or acquisition, and then a follow-up email goes to the CFO, or someone who can handle a transfer, saying: well, you obviously know about the merger, we need X amount transferred to our lawyers' trust account to manage it. If someone just jumps on that and does it, and it has happened, by the way, quite a few times, we're talking millions of dollars. But that's why I said larger organization. A small or medium business in Australia is not going to transfer 25 million. Not in this particular example: 25 million, gone. So I don't think it's a very specific threat to small business at the moment. But the concern Martin and I have is when you look at the pace of technology and how much easier it's becoming to actually do this. That's the problem. Because if I don't have to put a lot of effort in, then I don't have to aim at 25 million.
Maybe 100,000 is enough, and a small or medium business might be able to afford that. Then I can create many of them. If I can automate it with AI and get it out there, again, that's the force multiplier: I don't need an army of people to get this done. I could sit there one night and smash out 12 different examples, and maybe four of them hit, and I've got half a million in my bank account at the end of the day.

Alisha Christian

Yeah, that is quite scary if they're able to do that for sure because someone's definitely going to click.

AI Summaries And Search Ad Traps

SPEAKER_00

Well, the other way they use that same technology is when they're targeting individuals rather than businesses. They'll generate an image of Warren Buffett or someone saying you should invest in this particular company, or this kind of financial scheme. And people go, oh, well, if Warren Buffett says it, or Kochie or someone says it, that's a great idea. Small business owners might have a bit of cash sitting in their account and go, hey, I could actually put my money into that. And what you can't necessarily believe is that it is Warren Buffett talking about it. You've got to verify that that's the case. So you've got to go and search for the legitimate site where that information should come from, rather than just a clickbait ad on a website.

SPEAKER_03

And it's pervasive, it's absolutely everywhere. If you're on social media, like a lot of people are, and you see your feed, whether it's TikTok if you're younger, or just Facebook if you're older like me, it doesn't matter what you're on. They all have algorithms running, and people will pay to have certain things there, and you can promote things pretty easily. But the thing is, with AI it doesn't even have to be you anymore. You can have a full AI influencer, and a lot of people are not recognizing it. You used to be able to recognize it; now it's getting really, really hard to pick out that it's not even a real person doing this. So I just get an AI influencer: the person's not real, but they speak nicely, they look nice, all the rest of the stuff that works and pushes it up the algorithm. You're getting that in your feed, and they're selling you, I don't know, some multivitamin or something. And then they get someone you trust: think of anyone on a podcast that you watch, whether it's Diary of a CEO or any of the top ones. I grab a snippet of that person saying that multivitamin is amazing, and it sells like hot cakes. That's another way of just extracting money for potentially useless stuff. I mean, the multivitamin might be good in this case, I don't know, but you get the idea. And it's very, very quick. Have either of you seen, I don't even know what it's called, that guy that comes up, kind of like an Aboriginal guy that does very much a Steve Irwin thing? He talks about the animals, and he jumps in and there's a snake. Have you seen any of that? No, I haven't seen that.

Alisha Christian

Well, if it's on social media, Martin definitely hasn't seen it.

SPEAKER_03

Absolutely not. That is fully AI. Go and look it up, it's absolutely insane. On some of them you can see it, but I had seen some of it and I just thought, oh, that's interesting, that's cool. That's all I thought. I did not think it was AI.

Alisha Christian

So, what chance do the rest of us have if you didn't know that was AI?

SPEAKER_03

Because it wasn't selling me anything or trying to get me to do anything, I didn't have my "I'm looking at this carefully" mode on. It was just, oh, he's talking about some snake, and that's kind of cool. That's all I thought. Then I saw some news report saying, oh, there's something you should know about this person. And I was like, oh yeah, I've seen it, he's pretty cool, I'll click on that. What is it? Oh, it's AI. I was actually pretty shocked. I'd only seen one or two videos of his, and when I went onto his page, the AI's page, to look at more, some of them looked a little bit worse. But some I didn't pick.

Alisha Christian

And I guess that's just gonna get scarier and scarier, isn't it? Because the technology is gonna get better and better, so it's just gonna be harder and harder for people to actually be able to tell.

SPEAKER_03

It's about awareness, and that's why we're talking about it. You need to go out and look for yourself as well at what's there. I'm doing a bit of a training session with a board of a business to go through some of the AI stuff. And just as a setup, what I did is I contacted the CEO and said, hey, just to land this properly with the board, would you mind if I actually use your image in a video to show them what we're talking about? He jumped on a Teams call with me and I showed him what I would be doing, so he was clear I wasn't gonna make him do anything weird or illegal. He looked at it and went, no, perfect. So he's not gonna tell his board about it; I'm just gonna run that video in the training session so they can see it immediately: this looks like their CEO doing something he wouldn't do. That kind of strategy gets it to land properly: oh, hold on a second, that's something I need to actually be aware of.

Alisha Christian

I mean, that's some pretty impressive training, though, to be able to show people what could potentially happen.

SPEAKER_03

Oh, you know what the best part was? That probably took me about 20 seconds and cost less than four bucks. That's the problem. I literally called him and went, hey, do you mind if I do this? He was like, yeah, cool. And I did it on the call and showed him, live.

Alisha Christian

He would have been pretty impressed.

SPEAKER_03

So, yeah, and I showed him a few other things we could do that were even easier. It's really, really simple to do. There are a lot of tools out there to get it done. And if those tools are available to just anyone, and I can automate it and make it better and put a little bit of effort around it, it becomes very believable.

Alisha Christian

So for this board, for example, or anyone you've spoken to, Martin, what do you tell them to look for? That's a great question.

SPEAKER_03

Over to you, Martin.

Alisha Christian

Do you feel that buff coming?

Deepfakes Targeting Money And Trust

SPEAKER_00

I think what it is, is around: you need to question everything. It sounds exhausting, and it is, but it isn't. It's that scenario we used to learn: if it sounds too good to be true, it probably is. We need to get back to that point. Having good processes in place is another thing. So in that CFO scenario Chris mentioned, with the CEO telling the CFO to pay a bill or something, have processes in place where you check it through a different form of communication: a phone call to their phone rather than a Teams call, or rather than clicking on a link they've sent. Have a different process for confirming it. If a company changes bank accounts, confirm it a different way. You've just got to make sure you have processes in place so you can confirm whether something is true or not. And I think that's where people need to learn: don't just believe everything you see. We used to say you believe half of what you hear, but you believe what you see because you've seen it with your own eyes. Now I think we probably have to get to believing half of what we see and probably a quarter of what we hear. It's having a very large dose of scepticism. I would go into it with the idea that I don't believe it and you can convince me otherwise, rather than I believe it until I'm told otherwise. So in a business sense, process is one of the key things.

SPEAKER_03

It's very much process. So, out-of-band checking. We obviously speak about this a lot internally: okay, how do we advise customers, your family, et cetera? And what it comes down to, like Martin said, is going back to basics. If you can't trust digital, what do you do? It becomes in person, or at least out of band, so maybe a phone call. And then I'd say risk-based as well: if you're asked to transfer 25 million, there are gonna be a lot more checks, right?
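The out-of-band, risk-based approach described here can be sketched as a simple payment-verification policy. This is a minimal illustration only; the thresholds, field names, and check labels are assumptions made up for the example, not any real organization's rules.

```python
from dataclasses import dataclass

@dataclass
class PaymentRequest:
    amount: float        # requested transfer amount, in dollars
    channel: str         # how the request arrived, e.g. "email" or "video_call"
    new_payee: bool      # True if these bank details haven't been used before

def required_checks(req: PaymentRequest) -> list:
    """Return the verification steps required before this payment proceeds.

    The rule of thumb from the conversation: never trust the inbound
    channel, and scale the checks with the risk.
    """
    # Always verify out of band: call back on a number you already know,
    # never one supplied in the request itself.
    checks = ["outbound_phone_call"]
    if req.new_payee:
        # Changed bank details get confirmed through a second channel.
        checks.append("confirm_bank_details_second_channel")
    if req.amount >= 10_000:
        checks.append("second_approver")
    if req.amount >= 1_000_000:
        # At this scale, in-person confirmation is worth the effort.
        checks.append("in_person_or_board_signoff")
    return checks
```

The point of encoding it this way is that the checks come from the request's risk profile, not from how convincing the request looks.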

Alisha Christian

You'd like to think so.

SPEAKER_03

Yeah, so even if you've had a video call with the CEO and you're still like, I don't know, and then the CEO calls you: if it's 25 million, I'm not gonna trust a call that's coming inbound.

Alisha Christian

No, of course not.

SPEAKER_03

Exactly. So I would make that call the other way, and even then, for 25 million I'd probably want to see them in person. Now, that might not be possible, depending on whether it's a global organization, et cetera. And that brings me to another point with small business and working from home, because what that does is set up silos. If we're all working in the same office and it's, oh, Martin's asked me to pay this bill, I just look over: Martin, did you? It's easy. I don't have to trust the email that just dropped in; I walk 10 paces over and go, Martin, do you want me to... and he goes, what are you talking about? It's easy to check. So watch out for siloed activity. The other thing is looking for those usual tactics of urgency and so on; these are standard things, by the way, nothing's really changed there. Like I said earlier, with that example where I get misinformation out that there's a security problem and people are talking about having to paste something in: that gives you that support-of-the-crowd situation, go along with the crowd. You've got to take pause and go, hold on a second, that doesn't seem right. If the crowd wasn't talking about it, would I trust this particular email that just came in? Probably not. So then you'd question it. It is about questioning and double-checking everything. And yes, it's a little exhausting, but we're coming back to that humanness. It's back to basics: talk to the person and find out.

Alisha Christian

Because I think we have got so reliant on online meetings and all the technology, which is fantastic. But I work from home, so I wouldn't be able to just, you know, sing out to Martin.

SPEAKER_03

Absolutely. Well, in that example I gave, with the 25 million being transferred off by the CFO, the CFO did jump on a video call. But it was a video call with the CEO and a bunch of others, I can't remember if they were managers or board members, and they were all AI. They had set up the entire thing. It was only the CFO who was in the dark, so to speak. So there was again that crowd, all agreeing with the CEO. That's hard.

SPEAKER_01

That's very hard.

SPEAKER_03

So it's that sort of thing. And again, obviously that's large, right? It's 25 million. When you're talking about something smaller, it's gonna be a bit simpler to pick up, hopefully. And if it gets worse, then again we go back to basics, back to trust. It's the only thing we can do. You might have heard the term "zero trust". It's a common cybersecurity and IT term, and it's there for a reason: because of all the stuff that's coming in. What it basically means is, you need to authenticate who you are, prove who you are, to use the systems. Now, you would know that as your username and password.

Alisha Christian

Yes.

SPEAKER_03

And then, okay, but that can be hacked. So they have to add multi-factor authentication on top of that, and that's just to strengthen the identity: this is who they are. But then you've got to worry about the device they're coming from, because what if the device is compromised or there's a problem there? So you don't trust the identity until it's proven, and you don't trust the device until it can prove itself to the organizational systems. And it's the same thing here: you don't just trust what you initially see. It's always verifying. Again, I think Martin is perfectly accurate: if it sounds too good to be true, or in this case, if it sounds a bit odd, that's enough.
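The layering being described, identity (password plus MFA) and then device posture, can be sketched as a gate where no single factor is ever enough on its own. This is a toy illustration of the idea; real zero-trust platforms evaluate many more signals, and the names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AccessAttempt:
    password_ok: bool        # identity factor 1: username and password
    mfa_ok: bool             # identity factor 2: multi-factor prompt
    device_compliant: bool   # device posture: managed, patched, not flagged

def grant_access(attempt: AccessAttempt) -> bool:
    # Zero-trust gate: nothing is trusted by default.
    # The identity must prove itself (password AND MFA), and the device
    # must separately prove itself to the organizational systems.
    return attempt.password_ok and attempt.mfa_ok and attempt.device_compliant
```

The design choice mirrors the episode's advice for people as well as systems: a request isn't trusted because one check passed; every layer has to verify.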

Alisha Christian

Do you think there's enough training around these sorts of things? I know you said you're obviously doing training with boards and that sort of thing, but do you think it's filtering down to the staff?

SPEAKER_00

It's even earlier than that. I don't think there's enough training for children and teenagers. The whole concept of banning young people from social media is, quite a lot of it, around the fact that there's so much misinformation around and it's easy for them to be taken advantage of. The point is we have to start very early. It's like financial literacy: you have to start young. Digital literacy has to start young, and that's definitely where it has to start. And beyond that, no, most businesses probably aren't even addressing this internally at all.

SPEAKER_03

Yeah, which is unfortunate. And it's a very large problem, not something us three are going to solve. Not to be disappointing, but you're right, it has to start all the way back at school. And I will note that Queensland education does have cybersecurity in its curriculum.

SPEAKER_01

Oh wow. It goes all the way down?

SPEAKER_03

So into primary schools. Cybersecurity is a smaller item on the curriculum, but it's in version nine of the curriculum. So it is out there, and some schools are taking it to heart. I know, for instance, Helensville primary do have that in place. In fact, I'm probably gonna go and have a chat with their staff, maybe not the kids at this point, but certainly with their staff, around some cybersecurity stuff in February.

Verification Habits For Work And Home

Alisha Christian

I mean, that's a great initiative because, yeah, like you say, it does need to start young.

SPEAKER_03

And it's awareness, it's just awareness at this point, so that they understand. And you're talking about our most vulnerable in the community: kids, definitely. It's like the whole social media ban. I'm not saying that was the greatest way to get it done, but something had to be done.

Alisha Christian

It definitely was a good shift. It probably hasn't completely eradicated the problem, but at least it starts a conversation. Exactly. That's right.

SPEAKER_00

Well, it makes parents aware too. And I think that's a big part of it: parents were not necessarily aware. And for the most part, they're some of the people looking at a news feed on Facebook and going, hey, did you hear about this? And you go, where did you hear about that? Well, it's news. Is it news? News from where?

Alisha Christian

Don't ever take air news to Easter Martin.

SPEAKER_03

Yeah, and at the end of the day, these parents work at companies. So it feeds through.

Alisha Christian

Yeah, exactly. No, that's really interesting. I'm gonna ask the kids now if they've been learning about any of this.

SPEAKER_03

It depends if they've got, for instance, a dedicated Digitech teacher in place as well. But I know it definitely is in the curriculum, so it should be being taught, or getting there. They're not forced to go to version nine straight away; they might decide to aim at getting maths onto the right version first, so it's all got to be handled. Can you imagine handling a timetable at a school?

Alisha Christian

No, I'd rather run a business by myself. I think there are about 3,000 kids at my son's school, so I definitely wouldn't want to be running that timetable, that's for sure. Although they did have, I can't remember his name, a cybersecurity expert who did a talk for the high school parents. Oh yeah? And out of 3,000 students, I think there were about 25 of us that turned up. Oh, really? Yeah, it was actually a really interesting talk, just before the social media ban came in, talking about all the things parents kind of need to know. Obviously it wasn't fully in-depth, but I was kind of disappointed that more parents hadn't come along. I think we humans do get into that trap of, yeah, I know all about that.

SPEAKER_00

I don't need to be told again.

SPEAKER_03

Unfortunately, yeah.

Alisha Christian

I almost didn't go because I'm like, oh, I work for a cybersecurity company, I don't need to go.

SPEAKER_03

And did you learn something? I did, and I thought that's exactly the point.

Alisha Christian

We obviously do cybersecurity, but our main focus is not kids. So I thought, well, it's only down the road, it's only an hour. Go check it out and have a look.

SPEAKER_03

Yeah, so I totally agree. Take any opportunity to learn and see what's available. A couple of years ago I did some talks at Neighbourhood Watch with the police, down in Tweed somewhere I think. That's the same sort of thing, where you're getting an older demographic, and again talking about vulnerable people. So that's very, very useful. And it's a very good point you both make: you think, oh, I've seen it and I know it, or I work in it so I know it, or I've read it before, or you're relatively smart. That's the worst one. Not that you've got access to a brainwashing expert, but if you speak to someone who actually understands the psychology, they'll tell you someone very smart is easier to manipulate, because they think in a very specific way. So if you line these things up with enough social proof, like I was talking about, that crowd backing it, like that poor CFO with a whole bunch of AIs telling them to do the transfer, it's hard, right?

Alisha Christian

Yeah, that was pretty impressive. Is there anything else that you'd like to add, or anything else that people should be looking out for?

SPEAKER_00

Look, I think when I was much younger and getting into management roles, I was told by a very senior executive that you should trust but verify. So that was the thing: trust but verify. If someone tells you something, you should trust it, but just check. I think now we have gone beyond that. You can't trust. So now it's don't trust, and verify. And that's hard.

Alisha Christian

That doesn't come naturally to me.

SPEAKER_00

No, and it probably doesn't come naturally to a lot of people, because if someone tells you something, you just want to believe it. That's that human interaction, you know. You don't want to just turn around and say, well, I don't believe you. Well, I do. I say that to everyone when they say, oh, did you hear about this? No.

Alisha Christian

No, no, anyone who says that to you, though, must know you very well.

SPEAKER_00

And I'm not saying that everyone needs to get to my level of paranoia, but having a healthy dose of paranoia wouldn't hurt. Yeah, exactly.

Alisha Christian

So hopefully we might get back to a bit more face-to-face business where possible as well.

SPEAKER_03

Well, we're seeing that as well. Some of our events coming up this year, we're doing a lot more face-to-face as opposed to just online. And I think that is useful; it allows much more interactive communication and builds trust. Yes, definitely. If you've got a managed service provider or managed security service provider, you do want to be able to trust them, right? You've got to have that face-to-face relationship and build that up with your business.

Alisha Christian

And I think face-to-face also gives you that bit more opportunity to talk about real-life examples in just a normal conversation. Whereas in online meetings or messaging, you don't really have that same natural opportunity to talk about things that might relate to their situation.

SPEAKER_00

The conversation does change over a cup of tea or a coffee. Yeah, it's a little bit different than when it's quite sterile across a screen.

Alisha Christian

Yes, it doesn't have that same natural flow. So I'm all for in person.

Awareness Training And Closing Thoughts

SPEAKER_03

So I think the final words there are probably around that awareness. If your business hasn't had an IT cyber awareness program set up, definitely get one set up, and make sure AI and disinformation and so forth are in there as well. We kicked off our cybersecurity awareness programs with a refresh at the beginning of the year, and then a couple through the year. And we've definitely got AI in there; I think we've had it in since the beginning of last year, so we'll just do a refresh of that as well. So again, don't do a once-and-forget for the year.

Alisha Christian

It's that repetition, isn't it? Yeah, just get it in.

SPEAKER_03

It doesn't have to be horrendous, like beating it into people, because then people just go, I don't actually care anymore. It becomes like mandatory training: click, click, click, click. Yes, I agree.

unknown

It's like no.

SPEAKER_03

So you do need a little bit of, what do they call it, nudge theory. Hey, just a reminder, this could have been bad if you'd clicked on it, that kind of thing.

Alisha Christian

Yeah, yeah. So I still think those simulated phishing emails are definitely a good reminder that we can send. And again, don't flood people with that sort of stuff, because it gets annoying.

SPEAKER_03

Yeah. Just every now and again, a reminder: hey, stay paranoid.

Alisha Christian

Is that gonna be our new tagline? Sure.

SPEAKER_03

Yeah, well, I think it's Martin's tagline.

Alisha Christian

Well, thanks to you both for joining me again. It's always great to have you here, and no doubt I'll be having you back very shortly to share some more tips.

SPEAKER_00

Thank you. Thanks, Alisha.