Tech Insights with Alisha Christian

AI’s New Tricks, Old Scams

Mercury IT Season 2 Episode 3


How criminals are using AI to scale familiar fraud!

Think you can still spot a scam by bad grammar and odd phrasing? That edge is gone.

We sit down to map how AI has supercharged classic fraud—making phishing emails flawless, internal chat messages feel authentic, and even voicemails sound exactly like your CFO. The twist? These same tools can help you work smarter, if you build the right guardrails and culture around them.

We start with the modern account takeover, where compromised supplier inboxes deliver perfect‑looking document links that quietly steal credentials. From there, we break down mimicry and deepfakes: how a single headshot from a team page, plus a few seconds of public audio, can be turned into convincing video, full‑body movement from reference clips, and multilingual voice clones. This isn’t sci‑fi or enterprise‑only anymore—it’s affordable, fast, and good enough to fool busy teams.

We share pragmatic defences you can apply today: limiting staff photos to low resolution, locking down external access in Teams and Slack, and verifying unusual requests through a channel you initiate.

Culture sits at the centre of resilience. Urgency, authority, and secrecy are an attacker’s favourite levers—and they work best where questions are discouraged. We unpack zero trust in plain terms: verify the user, the device, and the context every time. And we explain why “healthy friction” beats speed when money or data is on the line.

We also cover safe AI adoption—using an AI readiness assessment to find blind spots, enforcing clear policies, and pressing vendors on encryption, data isolation, and model‑training practices to prevent accidental data leaks.

If you want concrete, real‑world steps—not fear—this conversation gives you a plan: broaden phishing awareness beyond email, harden default settings, train quarterly with fresh examples, and enforce dual control for sensitive actions. AI is a force multiplier on both sides. Use it to your advantage—without handing attackers the keys.

If this helped, follow the show, share it with your team, and leave a quick review so more people can find it.

New Year, Same AI Surprises

Alisha Christian

Welcome back to Tech Insights, Chris.

SPEAKER_00

Thank you. Thank you.

Alisha Christian

We're in a new year.

SPEAKER_00

Yes. 2026. Pretty insane, huh?

Alisha Christian

That seems to have come around awfully quick.

SPEAKER_00

It did. It did.

Alisha Christian

That's for sure. So I thought I'd get you to come and join me so we could have a chat. Obviously there's been lots happening in the last few months, and there's always plenty to talk about around AI. So it's great that you could join me again, and we'll run through some of the things that have been happening.

Phishing Evolves With AI Realism

SPEAKER_00

Absolutely. Let's do it.

Alisha Christian

Yeah. So one thing I was going to ask you, really... I know we've spoken quite a bit about AI and cybersecurity, but I just wanted to get your take on how much you think the cybersecurity landscape has changed in the last 12 months.

SPEAKER_00

Look, there's a lot of the same. I think the key thing with AI is that it's pushing the realism, number one. And where the realism comes in is believability. It makes the impact of phishing, or invoice scams, or whatever it is, a lot bigger, because they're more believable. Often you'd be able to go, oh, it's a phishing scam, because there's bad spelling, bad grammar, et cetera. Now AI is writing it, and it's writing it perfectly. That makes it very difficult to pick up. So that's definitely one element of it.

Alisha Christian

Yeah, okay. And could you walk us through some real-life examples of how you've seen cybercriminals infiltrating our everyday systems?

Account Takeovers And Believability

SPEAKER_00

The biggest one is still account compromise. In other words, someone has managed to get your username and password, or managed to use your account, what's known as your identity within your business, to access the information that you can access. And there are lots of ways of doing that. The most common is still phishing. It's still that email coming through asking you to download a document. So it's like your accountant has sent you a document: click on this link. And it is your accountant's address, because they've been hacked. So you click on the link thinking, oh yeah, that must be the latest tax thing, I can't remember what it's about, and it goes to a Microsoft page that asks for your username and login. Now, hopefully most people pause there and go, okay, that's a bit different, why is it asking me for these details? Because quite often what's happening is that clicking the document link has taken you to a phishing site, a fake web page asking for your username and password. You put those details in and they go directly to the threat actor, the hacker that wants to get in. You hit OK, and nothing happens. The document doesn't open, or it just times out, and you think, that's a bit odd. You might even email back and say, I tried to click on this and it didn't open, and your accountant replies, oh yeah, sorry, we've been hacked, we're sorting it out, it's all fixed now, no information's been compromised. That's generally what they say. But your account's already compromised. You've already given your username and password to the threat actor. That's the problem.
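For readers who want to make that pause concrete: here's a minimal, hypothetical sketch (Python, with made-up example domains) of the kind of check that catches the fake login page Chris describes. The link looks like a document, but the host you'd be typing your credentials into isn't one of the domains where a Microsoft login should ever live. This is an illustration of the idea, not a tool mentioned in the episode.

```python
from urllib.parse import urlparse

# Domains where staff should ever enter Microsoft credentials.
# (Illustrative allowlist -- adapt to your own tenant and tooling.)
TRUSTED_LOGIN_DOMAINS = {"login.microsoftonline.com", "login.live.com"}

def is_suspicious_login_link(url: str) -> bool:
    """Flag links whose host is not an expected credential-entry domain.

    Phishing pages often sit on look-alike hosts, e.g.
    'login.microsoftonline.com.docs-view.example'. Exact matching matters:
    an endswith() check would let that subdomain trick slip through.
    """
    host = urlparse(url).hostname or ""
    return host not in TRUSTED_LOGIN_DOMAINS

print(is_suspicious_login_link("https://login.microsoftonline.com/common/oauth2"))   # False
print(is_suspicious_login_link("https://login.microsoftonline.com.docs-view.example/open"))  # True
```

The human version of the same check is exactly what the episode recommends: before typing a password, look at the address bar, not at the email that sent you there.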

Alisha Christian

Yeah, you wouldn't be very happy, would you?

SPEAKER_00

Yeah, and the problem is you've trusted that source, because it's coming from your accountant. It's their email, it's written perfectly, it's asking you to look at a document, like they do all through the year. So you go and do it, right?

Alisha Christian

Yeah, a heads-up from the accountant would have been helpful though, wouldn't it?

SPEAKER_00

And quite often, if we do see stuff like that, we would normally reach out to them and go, hey, this is what's happening, are you guys across it? So we try to offer some help there.

Alisha Christian

When I was doing a bit of research around, you know, all things AI, I came across the mimicry technique. What is that exactly?

SPEAKER_00

Look, it's an interesting one. I wouldn't say it's something that a small or medium business in Australia necessarily has to worry about.

Alisha Christian

I hadn't heard it before and I thought, hmm.

SPEAKER_00

Yeah. I mean, it's just to mimic something, right? It's like taking an image of someone and then making that say something, as an example. Let's say a threat actor can go and get an image of a person in authority, say the CFO. You go onto a web page, it's got About Us, Our Team, and an image of the entire team, and I go, oh, there's the CFO, useful. I grab an image of the CFO, load that into my AI program, of which there are hundreds, and I can make that CFO say anything I want. Now, it depends how much I need to do. I could use just a generic male voice, and maybe that's enough. Maybe that won't fool some people. So maybe I do a quick search on the internet and find that CFO giving an interview somewhere. Now I have the voice too, and I only need a few seconds of it. I've already got the image, and now I can mimic the voice as well as the image. Then who's gonna know? At the moment, a lot of people go, ah, I can tell a fake. It's like, can you? Have you seen what's coming out?

Alisha Christian

I think you did that example a few months back. You sort of mimicked yourself, almost.

SPEAKER_00

Yeah, it's easier to mimic myself. I gave it, I think, about two minutes of footage and voice to copy. And now, just based off that two minutes, I can get myself to sit there and say anything I want, in any language I want.

Alisha Christian

Oh, any language? So it'll just flick to another language easily.

SPEAKER_00

So often when I do awareness programs for staff, I'll show them that you can flick it into different languages, because it's very useful. Think about the uses of that. From an AI perspective, say I need to do training in a global organization and deliver it in six different languages. Now it's very easy. Technically I could just record the one version and then change it, or even better, I don't need to record anything. I can give it two minutes and just stand up an avatar that mimics me, my actions, my voice. So I'll be speaking Chinese, and it's still my voice in Chinese. It's not dubbed over; it's my voice in a different language.

Alisha Christian

I mean, it's amazing, it's awesome, it's fantastic.

SPEAKER_00

But the problem is threat actors can also use it.

Alisha Christian

Yeah, it's amazing if it's not used for evil.

SPEAKER_00

Exactly. And that's the same with any technology, I suppose. But that's where we get into a bit of a problem, where people go, oh, I'd be able to notice. And like I say, can you? I think I can across a lot of it, but there's a lot you can get caught out on. I've seen some where I'm like, whoa, that is very good at what it's doing. Some of the latest tech that's been released can take an image of someone, and I can give it reference movement. To give you an example, I take an image of you, because I have one. Is it on our website?

Alisha Christian

Yes, it is. So you'd just grab it.

Defences Against Visual Deepfakes

SPEAKER_00

I just grab it off our website and go, okay, there's Alisha. And then I grab a video of someone tap dancing, you know, like in Mary Poppins, where they're doing that little dance, and I can get you to do exactly that.

Alisha Christian

Well, I have always wanted to know how to tap dance, so that would take no more than two minutes.

SPEAKER_00

And it's very believable, and it does full-body replication. Previously AI struggled with hand gestures and things like that. It doesn't anymore; you don't get six fingers. It's actually very, very good. It's not good all the time, so you'd have to run it a few times, but you'd be able to get a relatively believable copy. I think we were talking earlier about how, if you put your hand in front of your face, AI struggles with that. It doesn't, not anymore.

Alisha Christian

Well, I guess the tools are just getting better and better, aren't they? That fine-tuning keeps going, and who knows what's coming next.

SPEAKER_00

And it's all available, and it's easy to do. It's not costing thousands of dollars. I did a demonstration of this where I took a CEO's image, with their permission; I spoke to them about it first. And what I did to show it was the reference movements: I did the movements myself, and a static photo of their CEO copied my exact movements in real time, so they could see it and go, hold on a second. That's the type of stuff you need to be aware of.

Alisha Christian

You said you can just go to a website, take my photo, marry it up, and so on. Do you think we're gonna see less and less of those 'meet the team' pages, where there's literally no photos? Or maybe we'll see a bit more tech around embedding something into that photo so it doesn't work in AI?

Pretexting And Multi‑Channel Lures

SPEAKER_00

Something like that. I think I might have read something similar to that. I know there are authentication markers they can put in; I know museums are looking into that sort of stuff. A good example would be to just put quite low-res images on the website. Don't put a high-res image on the website; you don't need it. You just want people to see what the person looks like. It makes things a little bit more personable: when you're calling, you know who you're speaking to, or if you're gonna do a video call, you go, oh, that is Alisha, I've seen a photo of her. It's a little bit more comfortable, a soft landing. So just have a lower-res image there, because if you've got a low-res image, getting that to move and do all this stuff is gonna be pretty hard. A lot of the tools, if you try to upload a low-res image, will just say, oh, that image is too low-quality to use. So you're stopping it a little bit by doing that. Not perfect, but it's something.
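The low-res advice is easy to automate when publishing a team page. Here's a small, hypothetical sketch of the arithmetic: cap the total pixel count while preserving aspect ratio (the pixel budget of roughly 400x400 is our assumption, not a figure from the episode). In practice you'd apply these dimensions with an image library before uploading.

```python
def downscale_dimensions(width: int, height: int, max_pixels: int = 160_000) -> tuple:
    """Return (width, height) capped at roughly max_pixels, preserving aspect
    ratio. Enough detail to recognise a face on a team page, too little for
    a tool to animate convincingly."""
    pixels = width * height
    if pixels <= max_pixels:
        return width, height  # already small enough
    scale = (max_pixels / pixels) ** 0.5  # same factor on both axes
    return max(1, int(width * scale)), max(1, int(height * scale))

# A 12-megapixel headshot comes back around 461x346 -- website-sized.
print(downscale_dimensions(4000, 3000))
```

As Chris says, this isn't a complete defence, just friction: many face-animation tools refuse inputs below a certain resolution.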

Alisha Christian

Yeah, I was just curious when you were talking about the website. It'd be interesting to see in the next few years whether that becomes a bit of a trend, not to have images.

SPEAKER_00

It's the same with contact details. A lot of websites used to have the email and mobile number of the team there as well. And for years, Martin and I used to say to customers, can you please take that off?

SPEAKER_02

Yeah, right.

SPEAKER_00

Especially for lawyers and the like, where a threat actor can now target a specific person directly on their email and their mobile. You've got everything. And the more information you have... there's a concept called pretexting. In the phishing email, if I can give you just enough information that it feels like they already know you or have met you, you're gonna be more likely to trust it. So an example would be: I see on LinkedIn that you're doing an event somewhere, and I just follow you for a couple of weeks and get an idea. Then I send you an email saying, oh, I was at this event, I saw you and Chris speaking about this, and I was quite interested in your services. Now you're thinking, oh, it could be an opportunity. And they say, we need an NDA signed because we deal with a lot of top-secret stuff, or whatever they want to say. And if they've got your mobile number, they've emailed you directly with a nice pretext, you've got a connection straight away, and then they send you an SMS: hey, I just wanted to follow up on SMS so you know this is a real opportunity, and I thought I'd send you the link this way in case it was blocked by your email.

Alisha Christian

Well, that's true.

SPEAKER_00

So now you get that doubling up, and it's confirmation bias, unfortunately. It feels real, and then you get another channel of communication, so it feels even more real.

Alisha Christian

And it already feels a bit warm and fuzzy, because they've made out that you've already met.

SPEAKER_00

Or what if the threat actor is watching you, sees who comments, and manages to mimic the email of somebody you know? They send you an email saying, hey, I've got a friend who's in this business and they're quite interested; they'll be following up with an email. Then that email drops in, and then you get the SMS.

Alisha Christian

I mean, you're not gonna question it, exactly.

SPEAKER_00

So that's where you've got to be careful about how much information is already out there that someone can use to manipulate you. And again, this is not new stuff. Con artists have been doing this for a very, very long time. I rewatched Ferris Bueller's Day Off recently.

Alisha Christian

Oh, I love that movie.

AI Readiness And Safe Adoption

SPEAKER_00

Well, do you remember when he was at the restaurant? As the guy turned away, he looked at the guest list, picked a name, turned about and went, oh, I'm so-and-so. And the guy goes, oh, you're, you know, the sausage king of Chicago. And he's like, yes, I am. The guy was obviously not falling for it. But then he gets the girlfriend, or whoever it was, to call and say, I'm looking for so-and-so. And the guy asks, can you describe him? And she describes what he's wearing. So now he's like, oh, I've made a mistake. Same thing, right? It's: I have enough information just to get past your standard defenses. And I think where we're getting to is that awareness, needing to up your defense level, or, to put it plainly, your paranoia level. Obviously, being paranoid about everything is not good for your stress levels. But it definitely comes around to trust. You need to not implicitly trust something, especially if it's new. You're gonna have to verify. And that comes into a cybersecurity strategy called zero trust: everyone has to verify. Verify your ID, verify the device you're using, and so on.
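Zero trust, as Chris describes it, boils down to a rule that's easy to state in code: every access decision checks the user, the device, and the context, every time, with no "inside the network, therefore trusted" shortcut. A minimal, hypothetical sketch (the three factors are the ones named in the episode; the field names are ours):

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_verified: bool     # e.g. MFA passed on this session
    device_compliant: bool  # e.g. managed, patched, known device
    context_expected: bool  # e.g. usual location, hours, network

def allow(request: AccessRequest) -> bool:
    """Zero trust in one line: all three factors must check out on every
    request. Any single failure denies access -- there is no implicit trust."""
    return request.user_verified and request.device_compliant and request.context_expected
```

Real implementations (conditional access policies, device posture checks) are far richer, but the shape is the same: a conjunction of verifications, evaluated per request.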

Alisha Christian

Yeah. And I guess along a similar line is voice cloning, because that's not just the visual, but the actual voice on the phone. Is that something you're hearing more about with businesses?

SPEAKER_00

I've personally not heard of that directly, but I've read about it, so I know it is happening. I just haven't had any direct personal experience of it. And it's kind of where I started this conversation: I don't think a small or medium business is gonna get hit with a full mimicry of their CEO. Not like you've read in the news, where a CFO was duped by a fake CEO and a bunch of fake colleagues on a video call for something like 25 million. That was a global organization. The only concern I have is seeing how fast AI is moving and how easy it's becoming to actually do this. When I set up that demo for the CEO, with the movements and everything, I did it live with him on a Teams call. I got him onto a Teams call and said, let me show you what I can do, and within about 20 seconds we had some sort of video of him. He laughed about it; he was like, okay, yeah, that's cool. And that's what we'll show to the board, so they have an understanding of how easy this is. But if it's getting that easy, is it going to hit our small and medium businesses soon?

Alisha Christian

Well, that's right, exactly.

SPEAKER_00

And that is the concern. Okay, well, we need to be a little bit in front: let people know this is what's coming, this is what it looks like, it is very believable. And get people to start to think, okay, pause, if it's too good to be true or something feels off.

Alisha Christian

Yeah. Any gut feeling.

Privacy, Data Custodianship, Compliance

SPEAKER_00

Any gut feeling where you're like, okay, just pause a second, and then take a moment to use a side channel, a different channel. So if it was a video or an email, you call the person, or walk the ten feet into their office and go and speak to them. Don't be siloed. And there's a lot of work from home now, so in the natural course of things you need to be able to call the person on their mobile. Obviously, if they're calling you, that could be fake.

Alisha Christian

Red flag. Yeah.

SPEAKER_00

So you want to be able to make that call out to a known number that you've already got. Although again, it reminds me of Ferris Bueller, where there was the call-out and they faked that as well. I really do need to rewatch that movie.

Alisha Christian

I did always like it.

SPEAKER_00

So there's all of that sort of stuff. It's about awareness, getting people up to speed. Don't neglect your cybersecurity training; make sure that's covered off, and not just once a year. Make sure you're nudging that along at least every quarter, just reminding people it's there. Get your paranoia levels just a little bit higher, because human nature is to trust.

SPEAKER_02

Yes.

SPEAKER_00

You generally trust, especially if someone's asking for help. That's even worse, because people naturally want to help.

Alisha Christian

Yes, well, that's right.

SPEAKER_00

They offer the information. Yes, that's helping. So that's where we've got to put a little bit of a brake on. Yes, that's a pain, and it's more effort, but that's where we are.

Alisha Christian

So you mentioned that at this stage you don't think small to medium businesses are going to be targeted directly, not with that high-level stuff. Are there any more low-effort, AI-related scams that small to medium businesses need to be looking out for?

Resources And Live Event Announcement

SPEAKER_00

Absolutely. Where you've got to look is where AI is enhancing what's already there. The first thing I'll say is that the biggest way in for a threat actor is still phishing: that fake email where you just hand your username and password over. So that's a big red flag, the email and managing that email. And I would say you need to extend that from email to other messaging channels. If you use Teams in your business, or Slack, those could also technically be faked. In Teams, if it hasn't been locked down or set up correctly, you could have an external person trying to mimic an internal person or a provider, and you would think it's legitimate, and it's not, because there's nothing stopping it. A small or medium business might not have a managed service provider or a managed security service provider that's gone, hey, we should lock this down, we don't want just anybody messaging into your Teams environment, because it's open by default. A lot of things are open by default because that's easier. Stuff just works.
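The underlying check a locked-down chat tenant applies is simple: is the sender actually inside the organisation? A hypothetical sketch of that logic (the domain is made up; real Teams and Slack lockdowns are admin-centre policies, not code you write, but this is the test they enforce):

```python
# Illustrative company domain -- replace with your own.
INTERNAL_DOMAINS = {"example.com.au"}

def is_external_sender(sender_address: str) -> bool:
    """Flag chat messages from outside the organisation, so they can be
    badged, quarantined, or blocked rather than trusted like a colleague.
    An address with no recognisable internal domain is treated as external."""
    domain = sender_address.rsplit("@", 1)[-1].lower()
    return domain not in INTERNAL_DOMAINS

print(is_external_sender("chris@example.com.au"))   # False -- internal
print(is_external_sender("payroll@evil.example"))   # True  -- external
```

The practical takeaway from the episode stands regardless of tooling: review the external access defaults in Teams or Slack, because "open to anyone" is the out-of-the-box state.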

Final Takeaways On Awareness

Alisha Christian

So if you didn't know to actually change that default, it could be a problem. Okay, I know ours would be locked down within an inch of its life, but I guess people wouldn't be expecting to be targeted that way. There's probably not as much talk about phishing within Teams or Slack or that sort of thing.

SPEAKER_00

No, that is relatively new, I would say, which is why I wanted to mention it. And again, it's awareness: be on your toes, be suspicious about what's coming in and out and what's being asked. So, where does AI help the attacker here? You could definitely get a voice call, but it probably won't be a live call, because that's quite hard to do in real time. It's not impossible; it can be done. Even a full video call can be done. But if I just call directly to your voicemail and leave a pre-recorded message, then I can mimic anybody's voice. If I get a snippet of the voice, I can mimic it in ElevenLabs or some other tool, run that voice, and leave a voicemail from your child saying they need to be picked up, or whatever. And that has happened.

Alisha Christian

And I think that is particularly scary, that emotional scam.

SPEAKER_00

And if you translate that to business, that emotion comes around urgency, money, authority, compliance, right?

Alisha Christian

Yeah.

SPEAKER_00

Which is why you'll often see that message come from the CEO or CFO, something like that, so somebody goes, ooh, I need to move on this immediately. And a lot of that has to do with the culture of the business as well. If you're in a culture that's a little bit more old school, where the CEO says something and it's 'jump how high' and nobody questions anything, that could definitely be a problem for cybersecurity, which we have seen before. Versus: take that pause, check, call the CEO, double-check. The problem is if you've got a culture where you call the CEO and they shout at you for not having it done already. That's where the problem comes in. And like I said, we literally had a business, many years ago now, that was like that. It cost them $1.6 million. It went straight to a Chinese bank account.
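The standard guard against exactly this scenario is dual control: above some threshold, no single person (and no single urgent message) can move money. A minimal, hypothetical sketch of the rule (the $10,000 threshold and two-approver requirement are illustrative, not figures from the episode):

```python
def release_payment(amount: float, approvals: set, *,
                    threshold: float = 10_000.0, required: int = 2) -> bool:
    """Healthy friction, in code: small payments need one approver; anything
    over the threshold needs `required` distinct approvers. A lone 'urgent
    CEO request' can therefore never release a large payment by itself."""
    if amount < threshold:
        return len(approvals) >= 1
    return len(approvals) >= required

print(release_payment(1_600_000, {"cfo"}))                       # False -- one approver
print(release_payment(1_600_000, {"cfo", "finance_manager"}))    # True
```

Using a set of approver identities (rather than a count) matters: the same person approving twice still only counts once.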

Alisha Christian

So there's a little note there: have a good relationship with your staff. You need a good relationship with everyone.

SPEAKER_00

Strange that, just being human.

Alisha Christian

I know, right? So, for businesses that have really pushed AI security under the blanket, not wanting to think about how it's going to impact them: I know you're quite passionate about AI and have done a lot of work around it, and you've put together an AI readiness assessment as a starting point for people. Would that be a good place to start?

SPEAKER_00

Yeah, absolutely. Look, the way I see it is, I don't feel like AI is a fad. I've used it for a very long time, before the chatbots like ChatGPT. There were different models, whether it was machine learning or big data, all that sort of stuff, long before the large language models, your ChatGPT, Claude, Grok, Gemini, et cetera, came out. And those have got so much better since they first launched. They can absolutely help your business, as far as I'm concerned. Are they good for everything? No. Can they replace half your staff? No, not a chance. There are certain businesses where the impact will be way bigger, and others where there will be less. So it depends where you need to look, and you understand your business best. If you don't have time to look into all the AI stuff, get a consultant in to help you. So, in an effort to get businesses to start looking at it, and to be safe around it, I did put together an AI readiness assessment, even if it just starts a conversation for a business. It asks a question and you go, oh wait, we haven't done that. We haven't even thought about that.

Alisha Christian

Well, because you don't know what you don't know, do you?

SPEAKER_00

Exactly. So the idea was to give them a quick summary at the end: these are the things you need to look at. And I bundled in a basic policy as well that you could use. Now, bear in mind, a policy that's unenforced in a business is useless. It's kind of like a business going and getting its ISO 27001 certification, but the statement of applicability is so narrow that it effectively says that one laptop over there is secure. And then it's like, we are ISO certified.

SPEAKER_02

Yes.

SPEAKER_00

Are you, though? It's not worth the paper it's written on at that point. So it is important; there's a lot of stuff that's for show versus stuff that you actually implement. With this sort of thing I'm always saying: yes, AI can help your business, and you should definitely look into it, like I've said on previous podcasts. You can probably use it somewhere; it certainly can help. But make sure you have the right policies and the right training. Don't let your staff just go and have at it and hope for the best. That's the wrong approach, because it's very easy to make a mistake. What will happen is there'll be a mistake, data will get out, you'll get hacked, and it won't really be the staff member's fault, because they were never taught that this could be a problem. No, you don't just paste client data into a free AI tool to analyze.

Alisha Christian

Because that's the scary thing, isn't it? Just that.

SPEAKER_00

It is. Data leaks are the problem. Look at medical companies: same sort of thing. You can't just paste in the patient data. Now, as a doctor, do I want AI to give me a second opinion on a scan? Probably. So how do I do that safely? That comes down to policy and the tools you're using. And again, probably not a free tool.

SPEAKER_02

No, right?

SPEAKER_00

Say you've gone and found a company that's using AI. You need to go and read their security policies. What are they doing? Or ask the question; it's relatively simple. My client data gets uploaded for you to analyze: what are you doing to secure it? Are you ring-fencing it? How is it not getting out into the open? Those are all legitimate, great questions that you should be asking.
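Those vendor questions work best written down and tracked. Here's a hypothetical due-diligence checklist built from the themes above (encryption, ring-fencing, model training, plus a common retention follow-up we've added), with a tiny helper to see what a vendor hasn't yet answered in writing:

```python
# Illustrative checklist -- extend with your own compliance requirements.
VENDOR_AI_QUESTIONS = [
    "Is our data encrypted in transit and at rest?",
    "Is our data ring-fenced from other customers (tenant isolation)?",
    "Is uploaded data used to train your models, and can we opt out?",
    "How long is our data retained, and how is deletion handled?",
]

def unanswered(answers: dict) -> list:
    """Return the checklist questions the vendor has not answered yet.
    Blank or whitespace-only answers count as unanswered."""
    return [q for q in VENDOR_AI_QUESTIONS if not answers.get(q, "").strip()]

responses = {VENDOR_AI_QUESTIONS[0]: "Yes, TLS in transit and AES-256 at rest."}
print(unanswered(responses))  # the three questions still outstanding
```

Keeping the answers on file also gives you something to point to if the Information Commissioner ever asks how you vetted the tool.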

Alisha Christian

Yeah, I think it's important for people to get their heads around the value of other people's information.

SPEAKER_00

It's a hard one, because I think a lot of businesses have felt like the data is theirs. And it's not. You are the custodian of that data, and you need to protect it. That's why we have privacy laws.

SPEAKER_02

Yes.

SPEAKER_00

But again, maybe not a lot of businesses have read them. I would absolutely recommend you get across it, because the Information Commissioner and the government are getting pretty serious about that sort of stuff. There have been some quite large fines. There are businesses being dragged into court and told, you did not do this properly; you can't hold that much data and then not look after it. So it's definitely there. I think Australia is doing a great job of actually moving that along properly. There's still a long way to go, but it's getting there.

Alisha Christian

As long as we're heading in the right direction. Yeah, correct.

SPEAKER_00

And I think starting that conversation around AI is important. So go do a bit of research, go have a look. Although these days you say 'do a bit of research' and people jump straight on AI.

SPEAKER_02

Yeah.

SPEAKER_00

Be careful with that. It's kind of like the social media ban, right? Not necessarily implemented the best way, but it started a conversation. And that's probably the most important thing at this point.

Alisha Christian

We can definitely drop the readiness assessment into the show notes. And I think you have a couple of other templates that go with it as well, so people can access them.

SPEAKER_00

Yeah, it's just to give people a bit of a push in the right direction, making sure they're thinking about the right things.

Alisha Christian

For sure. And we have a bit of exciting news, don't we? We've got our first in-person event for the year, and you'll be headlining it.

SPEAKER_00

Yeah, we're gonna be talking AI. And again, the cybersecurity angle, plus a few demos.

Alisha Christian

Yes, a few demos. I'm looking forward to seeing those actually.

SPEAKER_00

So I think the most exciting thing is that it's on Friday the 13th.

Alisha Christian

Yes, yes. Love it. What's it called? Nightmare on Ann Street.

SPEAKER_00

Yeah, it's a good theme.

Alisha Christian

Yeah, the dark side of AI. A bit of fun. So we're hosting that up in Brisbane with the Valley Chamber of Commerce.

SPEAKER_00

So I'm looking forward to it. It's gonna be awesome.

Alisha Christian

Yeah, so we'll be putting out plenty of details about that for anyone that wants to come along, have lunch, and get a bit frightened. Is there anything else, Chris, that you'd like to add about AI and all the things that are likely to change?

SPEAKER_00

Look, I think it's what I've already said. AI is a force multiplier for the stuff we're already seeing, and what you need is education. It's about awareness. Looking into AI is a great idea; just make sure you're doing it properly. Go and ask for advice from security professionals so you have an understanding, because it's not about 'no, don't use it'. It can be, but generally it's: okay, how can I use this safely, or as safely as I can? What risk mitigation strategies can I use so we can get some leverage out of it? It's there, so definitely use it, but that awareness is super important. Don't sweep it under the rug, don't push it out of your mind. Make sure your staff are across it.

unknown

Yeah, yeah.

Alisha Christian

I think that's the most important thing, isn't it? Because staff are gonna use it regardless.

SPEAKER_00

Yeah, they are. They'll already be using it, I can guarantee you.

Alisha Christian

Yeah, so it's just important that everyone understands the boundaries.

SPEAKER_00

Agreed.

Alisha Christian

Yeah, excellent. Well, thanks again for joining me today. And yep, I can't wait for the event, so I'll make sure I put those details up so everyone can get their tickets.

SPEAKER_00

Sounds good. You'll see them there.

Alisha Christian

Awesome. See you soon.

SPEAKER_00

Thank you.