The Security Circle
Security Circle, the very first security podcast, is an IFPOD production for IFPO. IFPO, the International Foundation for Protection Officers, is an international security membership body that supports frontline security professionals with learning and development, and with mental health and wellbeing initiatives.
EP 163 The Risk of Being Human: Why Trust, Psychology, and People Still Matter in Security with Deb Anderson, Gigi Agassini and Kehkashan Dadwani
Episode Summary
In this special episode of The Security Circle Podcast, Yolanda “Yoyo” Hamblen is joined by three outstanding women in security — Gigi Agassini, Kehkashan Dadwani and Deb Anderson — to explore a subject that sits at the heart of modern risk management: The Risk of Being Human.
Originally presented at GSX, this conversation examines how rapid technological change, constant incident pressure, and performance-driven cultures are quietly pushing humanity to the margins of security decision-making. While organisations invest heavily in resilience, automation, and controls, the discussion challenges whether enough attention is being paid to the human factors that ultimately determine success or failure.
The episode explores the role of trust and psychological safety in security environments, highlighting why people are far more likely to report concerns, mistakes, or early warning signs when they feel supported rather than blamed. The panel discusses how fear-based cultures, hero mentalities, and burnout increase risk — even in highly mature security programs.
A strong focus is placed on human-centric leadership. The guests share practical insights on how leaders can design policies, training, and communications that reflect real human behaviour, motivations, and limitations — rather than idealised compliance models. From simple messaging changes to creating space for dissent and dialogue, the conversation shows how small shifts can create meaningful reductions in risk.
The discussion also addresses the growing influence of technology and AI, warning against over-trusting systems while under-valuing human judgement. The panel stresses that technology should enhance human capability, not replace it — and that bias, ethics, and accountability remain human responsibilities.
At its core, this episode delivers a powerful message:
security doesn’t fail because people are human — it fails when organisations forget that they are.
The Risk of Being Human is a compelling conversation about trust, culture, psychology, and resilience — and why the future of security depends as much on people as it does on systems.
Gigi Agassini
https://www.linkedin.com/in/gigi-agassini/
Deb Anderson
https://www.linkedin.com/in/debandersenpspcissp/
Kehkashan Dadwani
https://www.linkedin.com/in/kdadwani/
Security Circle ⭕️ is an IFPOD production for IFPO, the International Foundation for Protection Officers.
If you enjoy the Security Circle Podcast, please like, share, and comment, or even better, leave us a fab review. We can be found on all podcast platforms; be sure to subscribe. The Security Circle drops every Thursday. We love Thursdays. Hi, I'm Yolanda, and welcome to the Security Circle Podcast, produced in association with IFPO, the International Foundation for Protection Officers. This podcast is all about connection: bringing you closer to the greatest minds, boldest thinkers, trailblazers, and change makers across the security industry. Whether you are here to grow your network, spark new ideas, or simply feel more connected to the world of protection and risk, you are in the right place, wherever you are listening from. Thank you for being a part of the Security Circle journey.
Yoyo: I have a very, very special episode for you this evening. It's not often I have three of my closest and dearest lady pals in the security profession joining me all at once. You might think, oh my goodness, what have I let you all in for? But let me introduce them one by one.
Gigi: Thank you so much for having us, Yoyo. My name is Gigi Agassini. I am a principal consultant for J Advisory, and I help organizations connect operations with the business and align everything they want: physical security, cyber, AI, privacy. Thank you. It's an honor to be here.
Yoyo:Welcome.
Kehkashan: Hi, this is Kehkashan Dadwani. Yoyo, thank you for the invitation. I'm so excited for this conversation. I am head of governance at a tech company. I have been in tech for about 10 years, and for the same amount of time I've been in security; now I span both physical security and cybersecurity. So I've had the pleasure, and a very intense experience, of seeing individuals cross different spaces. I've worked within the incident space; currently I'm in the defender space. I'm just super excited to have this conversation, and thank you for having me. Last, but not least...
Deb: Thank you, Yoyo, for having us on. I'm absolutely thrilled to be on my favorite podcast, the Security Circle. My name is Deb Anderson, and I am a security administrator at a marketing execution company that specializes in high-speed printing. I do both physical and cybersecurity, and also safety training. Thanks again.
Yoyo: So listen, I'm gonna explain the reason why we're here. There's a particular thing that a lot of people would have missed if they didn't go to GSX last year: these amazing ladies produced a presentation called The Risk of Being Human. And after having a conversation, particularly with you, Deb, about this presentation and how well it was received, I thought, what a shame for so many people to miss out on that. So guys and girls, this is why I brought these ladies in: to tell you in person, just for you, just for your ears, how it went. But look, we've all got something in common: we all cross over into the tech space, don't we? How cool is that? So, all right, how did the presentation go down, ladies? Why did you decide The Risk of Being Human was the one thing to take to GSX?
Deb: The Risk of Being Human came out of conversations we had about the things that were changing in the last two or three years, in our work environments and in the world as a whole. There was a lot of talk about DEI being something we weren't going to talk about anymore, but we also realized the importance of integrating it into everything we do as part of our culture; even though they don't want us to actually talk about it, it still needs to be a consideration. Between the tech space and the physical space, we identified some gaps, and we also realized that you can't avoid dealing with different personalities, different cultures, different backgrounds, different educational levels. We felt that all of this touched on everything we could think of in the security realm, and that this was an important conversation to have, because if you ignore these things, they could grow and turn out to be issues in the future that cause risk to your organization. And so that's how the theme, The Risk of Being Human, came into play.
Yoyo: Kehkashan, how did the presentation start, and why did you decide to start in that particular place?
Kehkashan: So Deb reached out, and she mentioned exactly that, right? We were talking about how, in the security industry particularly, we have almost a hero complex: we want to be the best, we have to constantly respond to really critical incidents, and we have to respond really fast. Oftentimes that means moving at a speed without necessarily thinking about the human being themselves. I personally wanted to talk quite a bit about psychological safety, because I think when you have a population of individuals looking to you to help them navigate a really complex situation, the only way they're going to come to you when something happens is if there is trust, right? If there's trust, if they understand that you're not going to blame them, but you're going to work with them. So for me, when we started this journey, that's what it was about, and it was really helping people understand not only why humanity was so important, but also giving them really clear instructions, some guidance on what they could implement immediately, so they could see a change in the way they operated within the security industry.
Yoyo:Gigi, did you have any concerns about speaking in front of so many people? Because speaking at GSX is a pretty humbling thing, isn't it, amongst all of your peers?
Gigi: It is, and honestly, we were more focused on delivering a proper message, one that's important to take into consideration. I think that's what concerned us more: delivering that message. We've been speaking so much about operational resilience, but we do not speak about human resilience. Since COVID, I would say organizations have become more transactional; they lost their way, the path of being human-centric. In the presentation we make the point that, as humanity, our story begins not with strength but with cooperation. And to Deb's and Kehkashan's point, we have lost that objective as a society, of working together. Kehkashan just said something about trust, and trust is what we have lost as well. Humans do not trust humans; we would rather trust technology. I will leave that there, because it becomes even more relevant with AI and so on. But we lost that trust in people, and we need to get it back, back to the basics, to start cooperating together if we want to be resilient and successful as a society.
Yoyo: Gigi, I understand that you led the presentation with the team. Why don't you start there and take us through it?
Gigi: Yeah. I will remind our audience, your audience, Yoyo, and us as well, that if we think about hunter-gatherers and go back to that era, they created because they cooperated, and together they built strong civilizations. Together we can build very strong things, technology and so on, that produce advances and more progress, expanding across the world. That cooperation, not only locally but with others, and that movement from one place to another, is the base of being human. So if we go back to that, we need to create those safe spaces for people to feel safe, as Kehkashan pointed out with psychological safety: spaces where they belong and are not judged because of how they look, how they speak, or their religion, but are seen simply as a person, a human, with a different background and different skills.
Yoyo:Why don't you take us through now the presentation in terms of, you know, how it went from Gigi's lead.
Gigi: So the first question is for Kehkashan. In an environment where innovation moves faster than most risk management processes, how can leaders incorporate the human perception of risk, often subjective by nature and emotionally charged, into strategic decision-making frameworks without slowing down progress or neglecting ethical and cultural dimensions?
Kehkashan: I absolutely love this question. I think there are so many dimensions around the human space and the business space, right? Because ultimately the objective of a business is to create revenue: to develop the product, produce the product, and do everything in its power to be successful. But the bread and butter of that organization are the people themselves. So when you start thinking about the security side, the objective of security is to ensure that you're reducing risk and increasing trust, and the best way to do that is through your people. One of the ways I started really digging in and preparing for this discussion was through the data, and I stumbled across this project called Project Aristotle, which Google ran across 180 teams in the company. Google is obviously a massive tech company; it's wildly successful. And what they found, based on their data, was that it's not the speed at which you produce your outcomes, or how good your processes are, or even how exceptional your people are; it's the psychological safety you create among your teams that helps them feel safe taking risks. Because if you want to go high, you can't just keep going with the status quo. You have to be bold and incredibly courageous in the things you do, because you understand that the leaders guiding you and helping you do everything you want will not break you down because there was some sort of failure, and will not completely abandon you when something didn't go the way you had intended. So that psychological safety creates dependability, right? How can I depend on my leadership?
How can I depend on my teammates to ensure the work is happening? Especially in the security industry, your processes are never going to match the threats you're seeing in your environment, because that's how the security industry evolves, right? Something happens and you have to evolve with it. So for that to happen, you have to provide clarity. You have to provide structure within your teams, so that when they take bold decisions, they still have a framework they can follow. And then the final two things to think about are meaning and impact. You can't just say, oh, you have so much safety in the work you're doing; you have to ensure that the work has meaning for the individuals working on specific processes or projects, whether it's an incident or they're an operational team. You have to make sure you give them meaning in their work, and ensure they understand that the work they're doing has an impact on the company's success. Because that's when you know team members are going to be excellent in the work they do, and it has nothing to do with their skill sets, or even their pedigree from an educational perspective. It very much comes down to how well you're able to work as a team.
Gigi: I will say that, after Kehkashan's comment, I think she just showed how important trust is and how we don't have that trust anymore. We need to rely on our leaders, but as a leader, you need to rely on your people too, and that is only going to happen if you build that trust, because it doesn't happen automatically, not in this polarized world we live in. I will say there is an irony, because the cooperation that back then gave us the power to create civilizations, and the stories, like religion, nations, money, and so on, that gave us progress, are now playing out in the opposite direction: the same stories that united us are dividing us. So, to Deb: how can we take a human-centered approach when creating policies, integrating technology, and improving employee engagement, and overcome cognitive bias?
Deb: This question resonates with me so much, because I practice this in my day-to-day work activities. The biggest part of taking a human-centered approach is intentionally designing with people's behaviors, motivations, and limitations in mind. I'll give you a little example. Again, I'm in charge of making policies and making sure everybody's safe and secure and that all the rules and regulations are being followed, and I can share that messaging with you all day long; but if people don't understand it, or don't fully understand what it means to them, they're really not going to participate. One of my favorite quotes is from Nelson Mandela, who said that if you talk to a man in a language he understands, that goes to his head; but if you talk to him in his own language, that goes to his heart. It doesn't necessarily mean we all have to learn a second language; it means talking to people in a way they understand. The example I wanted to share with you: one of my favorite things to talk about in any of our meetings is, don't prop the doors open. That's the number one thing you can do to keep your site secure: keeping those doors locked. So my messaging was: don't prop the doors open. Well, it went in one ear and out the other, and folks were just like, yeah, whatever. So then I thought about it and changed the messaging a little bit: hey, don't prop the doors open, because if you do, it could lead to people coming into our building who don't need to be in there; there could be critters coming in; there could be reduced airflow; it could weaken the door. That helped a little bit, but not a hundred percent. I still had problems with people propping the doors open.
So I took a couple of steps back and thought about it, and I remembered my journalism professor always saying: show, don't tell. I was telling people; I wasn't showing them. It wasn't resonating with them; they didn't have any buy-in. So I changed my messaging to reflect that. Why are they propping these doors open? I put myself in their shoes. Okay, they're propping them open because maybe they have to work after their shift and don't have access anymore, or maybe they're carrying a large box or something and need to be able to get in and out of that door. So I changed my messaging to address each one of the issues that could cause this, and it's made a difference. It's made an impact. It gave them buy-in; it gave them a sense of belonging, that they could participate in helping keep us secure, while we're all being responsible for keeping everybody safe by following this new messaging. And I really feel that if you take the time to talk to people and discover how they learn, what makes sense to them, what's important to them, and structure your messaging or your training around that, you're going to get the buy-in, and people are going to feel more important because you included them in a way that gives them a voice. They're not just a fellow worker; you're seeing them, you're letting them know: I see you, you are important, and you have an impact on our team.
Gigi: I really love what you just mentioned, because it brings me to this: everybody has cognitive bias, and the way we see and perceive risk changes; that perception depends on the person. You don't perceive risk the same way as any of us here in this amazing conversation. So, as we close out these questions, let's remember that humanity's strength has never been technology, but our ability to cooperate, from the fire circles we mentioned at the very beginning. Progress has always come from trust and collaboration; those have been the main topics of this conversation. So the last question, for both of you ladies: have we put too much focus on resilient businesses and not enough on resilient people? And what's one practice leaders should implement tomorrow that would change that, or start to change it?
Kehkashan: Deb, you wanna go first, or you want me to? I'll let you. You know, as leaders, our job is to be translators, right? We have to translate the requirements from our executives to ensure that our security practitioners understand what is important, what the top priority is, how things are happening. This means that not only is it your job to translate business requirements; it is also your job to translate human requirements. The WHO estimates that $1 trillion annually is lost in productivity because of burnout. So as leaders, it is our responsibility to see our people: the individuals who report to us, who may work with us, whether they're our colleagues or peers, or individuals who report to other people, and really see them as human beings. Are they struggling? Understand who they are as humans and ensure that they're taking care of themselves. So something you can do very easily tomorrow is just tell your people to take care of themselves. Hey, there's a priority, there's a fire, there's an incident; definitely you need to take care of that. But in the same vein, you have to express to them why taking care of themselves is just as important as taking care of the company. It's the example of an emergency on a plane: you can't expect to put the mask on the child first and then put on your own mask; you might be passed out by then. You as an employee, you as a human being, have to take care of yourself first so that you can take care of the company. And that is something I think everyone can do. It's very actionable, and it may seem like something small, but your employees will commend you for it, because you show that you actually care beyond just the numbers, beyond just the results of whatever it is you're trying to achieve.
Deb: I like what you said there, Kehkashan, and it got me thinking. Years ago, Gallup had this thing called the Q12 survey, and one of the questions, one of my favorites, was: do you have everything you need to do your job? The first things that come to mind are, oh, I have a computer, I have a desk; the physical things. But with our conversation now, it's more like: do I have the support of management? Do I know what my roles and responsibilities are? Do I know what my limitations are? Do I know how to communicate effectively? One of the things I always like to ask when I start a new position and I have a supervisor is: how do you like to be communicated with? How often do you like to be communicated with? Do you prefer a text, or in person? Once a week? Once a day? What is the synergy that you want? I've found that to be very effective. Just knowing what your guidelines are, and knowing that your supervisors and fellow employees can help support those things in your toolbox, means you can be successful and play an important role in your organization.
Gigi: I really love the answers, and I will say that yes, the message is important, but, as we did at GSX, the other thing we need to be concerned with is that the message is well understood by the person receiving it. And what you just said there, about taking people more into consideration: yes, but without any fear that if I say something, it will be held against me. It shouldn't matter that you express disagreement with something; nobody is going to harm you because of that, you're going to be safe. I think that's an important point for becoming more human-centric, which is everything we've been discussing. So, in this fast-paced, complex world we live in, it's impossible not to mention that AI and technology amplify both our potential and our risk, but the real challenge remains human. And the risk of being human is also our opportunity: to rebuild trust, to foster belonging, and to ensure resilience that starts with people.
Yoyo: So I'd love to ask you a question. Look, we're all humans, right? When we go to work, we're all working with humans. Unless you work at a zoo, obviously; some of us do feel that we work in a zoo every day. However, and I'm gonna ask each of you for your opinion on this: how can we all forget to put humans in the middle, when humans are in the middle and we are humans, yet we seem to have to remind ourselves, and other people, to be human-centric?
Kehkashan: I can give this answer pretty immediately. It's really interesting when you work in the world of alerts and incidents and you're constantly being bombarded by technology, right? It's trying to tell you so many different things, and you have to use the data to be able to do X, Y, Z. You may work with a person right next to you, but your world, your career, is attached to a computer. Oftentimes, when you go into an office, you might even be meeting with people who are across the world, or in a different state or a different city, and you're having these conversations with them over the internet, very much like this; we're having this discussion with each other over the internet. And when you lose that sense of physical presence, of being able to see the person right next to you, they become this unmarked individual; it's almost like they don't matter. And I think the further we get into this world... I've been in social media for 10 years as an industry, and social media itself has desensitized us to humanity. For those of us who grew up during dial-up, or didn't grow up with social media in front of us all the time, I think we have to actively remind ourselves that the people we are working with are human beings. You have to assume good intent. You can't say, oh, this person is such an idiot, why would they even do that, they're out to get me or out to get my team. No: have a conversation. At the root of every problem you're having at work, there's likely a misunderstanding; there's likely some sort of negotiation that could happen where it could be a win-win situation. Security shouldn't be a zero-sum industry; there has to be partnership and negotiation on both sides.
Yoyo: To follow up, Deb: if we don't remember the human in the middle, and we have to remind ourselves all the time to keep humans central to everything, it's just going to be like mutually assured destruction, right? Right,
Deb: right, right. And to Kehkashan's point, I think a lot of us in the security industry, like she said, are connected to our devices, on 24/7, running 110 miles an hour. We're almost like machines, and we have to stop and remember that we are human. We need to step back; we need to have that downtime, that stare-at-the-ceiling time, and to treat everybody else by recognizing them as humans too. I find that having personal conversations, taking a second to get up from my desk to go talk to somebody, you get the emotion; you can read their faces, you can read their body language; you get so much more out of the conversation. If it was just on a computer, in a text, as most of you know, you don't know the emotion behind it, and you could totally take it out of context. But just by taking a second to have that real conversation, to be in person with somebody, it makes a huge difference. And I think that goes back to GSX, where we all get together in person. We may have been having emails or Zoom calls the whole year, but when we're together, it solidifies those connections and the deeper understanding between all of us and what we're doing. I think it makes a huge impact. Don't forget that human connection; be human, and embrace being human, because that's all you have at the end of the day.
Yoyo:Gigi, I'd like to give you an opportunity to try and perhaps give an example of where you've had to remind somebody or be reminded that the human-centric element has been missing. Have you got a story you can tell around where the human element was missing and needed to be brought back to life?
Gigi: Yeah, there are so many examples of that, and I want to give one that is a public situation: a cyber attack, a ransomware attack, that happened at a global company. With them, it showed that culture can eat strategy for breakfast. They decided to hide the incident and keep it within a very small circle, without even making it known to the board, regulators, authorities, and so on. They hid it for more than a year, and when you hide something like that... well, it is impossible to hide a ransomware attack, right? So none of the people who were harmed when the leaked information became public were notified that it had happened. To me, that is a great example that culture is very important, and culture is made of people. So what is the culture you're creating within your organization? That's why being human-centric is so important, because you are not hiring diversity, you are hiring different cultures, and if you do not align your culture, taking into consideration all the cultures that are part of your organization, it's going to end like this: culture is going to eat strategy for breakfast, simple as that. So we need to be aware that diversity is not about statistics, twenty of these, five of those; it's not about that. Diversity is having people with different backgrounds and skill sets. It's like cooking: you don't use the same spices for every single recipe; the spices depend on what you want to cook. So what is the dish you want to prepare in that moment? Oh, my dish is going to be an incident response team, so I need to use these spices to create that recipe, but taking into consideration the culture and values piece, which is very important. And I think organizations are losing that light at the end of the tunnel; it just keeps getting dimmer and dimmer, down into the dark.
Yoyo: So Gigi, we sit on the ASIS Information Security Committee, and I love being on that committee with you; you're amazing. So for anybody who isn't on a committee and would love to be, please get in touch with us, because it's really good to volunteer for our membership body. But we often talk, don't we, about how technology is becoming so invasive. So with that in mind, how do we stop technology becoming so invasive and still retain our humanity?
Gigi: I love that question, seriously, because, yeah, just...
Yoyo: Well, you're on the Security Circle. You're gonna get a few more.
Gigi: Kehkashan just made that comment, right? So let me rephrase it: our risk appetite as individuals becomes different in the digital world than in the physical world, in the personal world than in the digital world where we work. It's completely misleading for the human part, because you need an answer, you need to respond, you're dealing with a crisis and so on. And that's where technology gets invasive, with AI agents and AI: I don't need you, because I can just do this process automatically, and it's going to take your position. So I think that, yes, with technology we need to go back to the basics and use it as a tool. We give that tool to humans so they can be better at what they need to be better at, keeping in mind that the first layer of this is the human factor. Technology is just going to help us be more precise, to make progress, to continue that innovation path, to become more resilient, and to try to have a 360-degree view of risk and everything the business needs to be aware of, but not replacing the human part. You need the human. I mean, every human has cognitive bias, we just mentioned that; how can we expect that technology is not going to have it too?
Yoyo: Well, it's being designed with bias. That's the problem. Exactly. The algorithms are being designed with bias, and I think we have yet to touch the tip of the iceberg in terms of that. I'm not going to mention the name of the tech business, but there is a tech business that now knows its algorithm is favouring angry clickbait. But they can't remove that without removing the enterprise model their business needs to grow, so they have a bit of a dichotomy. How do you as a business change your business model to make something ethically better when it completely contradicts your overall objectives as a board to deliver profits and a sustainable business going forward? Being very neutral, because I think we all sit in the ethics chair: we should not be encouraging any tech with biased algorithms, or algorithms that ultimately promote the worst in mankind, because it's now been proven that that behaviour is systemically divisive and will break up communities. But does anybody want to come back on that?
Kehkashan: I have so many opinions on that. But one thing I keep coming back to: it's our responsibility as security professionals to have integrity, right? You see Deb is a CISSP, I'm a CISSP, and I think, Yoyo, you have a CISSP. There are plenty of professionals who are CPPs. When we get those certifications, we take an oath to be professionals with ethics and with integrity, to be the adults in the room, to help our leaders understand when something, especially in a business model, is moving in a direction that will harm society. The very first thing you should always think about is: do no harm. Ultimately, anything you put into society today is going to have an impact on our overall society tomorrow. I'm not aware of which company this is that has understood its algorithm is pretty much based on rage bait, but if I were in that organization, I would think it is my responsibility as a security leader to ensure that company leadership understands the risks they are taking on to increase revenue today. You're talking about such a short-term increase in revenue against a long-term impact on society and human mental health. And I think that is something we as security professionals just have to take on as individuals.
Yoyo: Hey, KK, I'll leave it there. I'm going to put a lot of pressure on you now, but I see you as a young pro, next gen. I'm expecting you to change the world, lady, fingers crossed. I want to see you as the next CEO of a tech company we haven't even imagined yet. I have every hope for you. So Debs, let's come to you in terms of your views on where you see technology absorbing and literally suffocating the human space: how we think, how we're creative, how we innovate, and how we deliver.
Deb: I think it's happening already, you know? We see so many people just absorbed in AI, in video games and cell phones, and people don't talk face to face anymore. It's happening. But I think it's a tool that can be used for good, and we're losing sight of that. We need to come back a little bit and take the time to realize we can use this not as something taking over our lives, but as a way to accentuate our lives and make things more efficient. I use it a lot: I'll write something and think, eh, I want to figure out a way to make this sound better, and it's been very helpful when you want to draft an email or a note or something like that. But when you let it inundate your life and take it over, and you can't go five steps without looking at your phone, that's a problem. So I think it's a tool that can be used for good, but we really need to be careful with it and be cognizant of how it's impacting and changing people's lives. There could be a lot of good that comes from it.
Yoyo: Gigi, what are we going to be focusing on, do you think, for 2026, certainly from an ASIS information technology perspective, in terms of the adoption of technology, including agentic AI, for example, into business practices? I mean, look, I was in a conversation today where it became known from one professional that another professional had clearly used ChatGPT to create something quite formal. Now, there's nothing wrong with using ChatGPT to create something quite formal, right? As long as you bear in mind the privacy element and the IP element, and make sure you don't reveal anything you shouldn't in terms of privacy and security. But this person had even left the prompt in the text they put into the document; I was nearly falling off my chair. They left the prompt in, i.e. "can you help me and explain blah, blah, blah," because they didn't sense-check what they were delivering. And one, if not more, of the big six accountancy firms has just got caught out using ChatGPT to produce their formal documents, which means they are creaming profits by not doing the hard work, the hard labour. There's no doubt about it: ChatGPT, Copilot, they're all great tools; they're assistive technology. But give some advice, Gigi, on how we can very easily get rumbled if we don't use this responsibly.
Gigi: I just love the example, because there are so many, and there is one that has become a kind of global scandal, because yes, it was one of the big consultancy companies. Anyway, for 2026 I will say that we need to slow down. Technology is not going to stop evolving, and that's okay. But as individuals and organizations we need to slow down the fast pace we're living at, to give ourselves time to understand, to adapt, and to respond accordingly. And by respond I don't just mean when we're in a crisis; I mean: how can we create an internal framework that is going to take care of and better manage our risk related to technology, AI, and so on? We are so dependent on technology right now, and we become more and more so with AI, right? AI, as you pointed out before, is just an amoeba: what it is going to become, and how fast, we don't know yet. My advice would be: there is nothing wrong with using those tools, but be more aware of ethics and privacy. Why are people willing to give away the keys of their home in the digital world when they would never do that in the physical world? And that's exactly what people do, handing over all their personal information and feeding the model, because they don't understand yet, because there hasn't been time to absorb and adapt to what is going on right now: that every single word you put in the prompt is feeding the model. So if there is private or sensitive information in there, it is going to continue training the model. There is no guarantee that your private or sensitive information is going to be safe. That's why at the very beginning we said that between humans there is no trust, but when it's based on technology, we're overconfident. We have that switch internally.
It's like our psyche is playing tricks on us: we feel safe because there's nobody in front of us, and that is not the same as when you are face to face with somebody. So my advice is this: first, take your time, build a framework, don't just buy a product. Analyze your needs first: whether the technology you are thinking of implementing is going to fit those needs well, and how you can create that framework to manage the risk related to that technology, whatever its name is.
Yoyo: I love that, Gigi. Crikey, I mean, you were already very high in my estimation and you've now gone up even higher. You have such a beautiful way of articulating things. KK, you have admitted to being in the social media space for 10 years. You must have seen an awful lot of change. I've also been in the social media space; I worked for a major provider, and I can say there has been a horrendous amount of change, and it's not always been for the good. So I'd like to ask you, and I know this is a big question: if you could wave a magic wand and change one thing tomorrow, what would you choose?
Kehkashan: Okay, this is gonna be controversial.
Yoyo:Oh, no, really.
Kehkashan: If I could wave a magic wand, I would love for people to take accountability for their own actions and their own words. Social media is such a big word right now, right? It's this new way of communicating and engaging with each other. We build communities; there's so much positive influence that comes out of it, because now you have people who previously couldn't talk to each other or learn about different things. I learn about things from social media all the time. Obviously, I am validating my sources and doing my own research before blindly trusting everyone. But I wish that people would hold themselves accountable for the things they write, the things they say, the type of content they put on these platforms. Because ultimately, social media is a tool, right? It is a tool used by individuals, by human beings, by corporations, and the impact it has is based on the individuals using it. The social media companies can put very clear guardrails on a platform to ensure it is used in a very specific way, but there are like 8 billion people on the planet; how do you make sure you hold every single person accountable? My magic wand would be that people would hold themselves accountable and understand the consequences of the words they put on the internet, because especially our younger population is so impressionable, and I will even argue our older population is incredibly impressionable. So being held accountable for your content would be an incredible move for society itself, and I think it would make social media that much better. But alas, we live in the society that we live in, so as organizations we have to be responsible for the community guidelines we put in place, be held accountable to privacy regulations, and ensure that we are adopting very hygienic data privacy practices: that we have really clear policies in place that protect our users and the platform, and that the corporations using these platforms are also using them appropriately.
Yoyo: KK, you, like me, are a steward of centropy. We all know that entropy is the breaking down of order, but centropy is a tendency towards increasing order, and I think we all have that quality, being in the security industry, to a degree. It's complex, and it's the opposite of decay. I always say to people that I'm a little bit Darth Vader, really, in terms of order in the universe, just less death and murder. But let's look realistically at the five generations of warfare; I have an amazing guest coming onto the Security Circle podcast soon to talk very specifically about the five generations of warfare. Gigi, I want to ask you about how warfare has evolved, and you must be thinking, good God, where's Yoyo going with this question? We've had the different stages: World War I, the manoeuvre warfare of World War II, and the generation of the war on terror, which we've all lived through. But in the current generation of warfare, the mind is the occupied territory; the front line is the mind of every citizen. And this is very important when it comes down to how we allow the very negative, dark side of technology and social media to influence that mind. We heard KK earlier talking about taking responsibility and accountability. The problem is that human nature is not always going to take accountability for itself individually. So what do we need to be thinking of, as stewards trying to bring about a bit of order?
Gigi: There is much to unpack on this one, but I will try to keep it simple. First, yes, humanity is going to be challenged by the risk in that space, and by the gap between the generations in the digital world and on social media. It's clear that young people are more willing to fall for those influencers, and anybody can become an influencer on social media; meanwhile, people from my generation and others are more inclined to believe whatever is on social media. So how do you close that gap, and cover everything in the middle? That's very hard. So I will say: don't believe everything you see; always check the source. If you are not going to take responsibility on social media, don't publish anything. Don't harm others; don't use information to harm others. Hiding behind a username does not give you the right to harm people publicly. So if you're not going to take that responsibility, as Kehkashan said, if you're not going to take that ownership and accountability, then close your social media accounts; don't be there. That is very difficult to achieve, I know. So, recapping: don't use information to harm others, as a principle of trying not to keep breaking that trust; don't believe everything you see, always keeping in mind that there are powerful tools for creating misinformation; if you are not going to take ownership and accountability on social media, don't post anything; and don't create polarization, which does not help anyone.
Yoyo: You know, you are right, because I remember during COVID we were still at the stage where you could challenge inappropriate content. You could say: that's false, check your facts, use authorized websites, use factual, accredited media content. But now it's dreadful. I don't even want to be the voice of reason in some social media forums. I don't want to be the one that says, stop being a dick, because all you do is bring all of this kind of stuff back on yourself, and you find yourself thinking, what on earth's the point? It's like a cesspit of toxic, negative rubbish. Debs, I'm going to come to you. You've been in the technology space now for very, very, very, very, very long; no, I'm just kidding, I was 12 when I started. For such a young person, what is the one piece of advice you would give right now to a young professional wanting to make their way into the security tech space as a generalist? To say, look, have your wits about you, this is what you need to go out into this big wide world. What would you advise as a wise old owl and mentor?
Deb: I love this question. I was actually on a panel a week ago with an IT group and was asked the same question, so it's perfect. The recommendation I would give is: be fluid, because you might get into the space learning one thing, but don't just stop there. You have to be open to how technology's changing, but you also have to have a sense of creativity, because not all problems or issues can be solved with technology; sometimes you have to use a little innovation to solve things. So stay fluid, be open to learning new things. Learning is the number one thing: always be a constant learner. We are all lifelong learners, and I can attest to that throughout my entire career. My bachelor's degree is in journalism, and back then I never thought I'd be in this space, ever. Just being curious about things, and wanting to learn new things and evolve, led me to where I am. So, like we mentioned before, many of us have our CISSPs, and one of the big things for maintaining that is you have to meet a certain number of educational CPE requirements throughout the year. I'm glad they have that, because it helps us stay up on what's going on; it opens our minds to new technology and new issues, and it allows us to grow and evolve with the security world and the technology space as we see it change.
Yoyo: But look, whether you are required to or not, self-development is key. Yes. And to be in this space... I mean, you hit the buzzword there, the big C word: curiosity. Like you, I also have a background in journalism, not because I was trained or have academic qualifications, but because I wanted to be a journalist, and then I ended up becoming a police detective. So there are a lot of similarities there, in terms of wanting to get to the truth, wanting to ask questions about things you don't know, and to seek clarification, in fact. So I love the background that we share. Look, I'm going to say also that I'm currently doing a side course on cryptography, which is killing my brain, and I'm also doing a bit of self-development on identity and access management. It's an important thing to know about. So my little takeaway is to encourage everybody: if you don't know something, find a little bite-sized course on it. Just brief yourself. And I encourage all physical security professionals to remember to move into the cyberspace. Gigi's in the cyberspace; Debs is in the data cyberspace. You already have 70% of the qualifications; you already have a risk mindset. Ladies, I'm just going to say, I feel like I have to see you in 2026, and I have to be at GSX in 2026, and I think we need to set up a gig together. I think we already need to start thinking about what we should be presenting, and we should present a force for good in this space. Alright? And I would love to see you and give you a big hug as well, because it's been a long time since I've seen you.
Gigi: Yeah, I just want to add a few things in regard to education and curiosity and the usage of AI, picking up a little on my comment before. There is some research, not complete yet, and not enough information, on how using AI is going to affect the brain: whether it makes your brain lazy and dampens your innovation. Those are muscles you need to keep exercising. So yes, if you don't know something, or you're curious about some topic, it doesn't need to be related to technology or security; it can be anything. Just keep your mind occupied with something that keeps your brain creating ideas and keeps you interested in something. I think it's important for us as humans not always to rely on technology for this. You know, information is like the food you give to your body: you need time to digest it. It's the same with the brain. You cannot continuously feed it a lot of information, when probably most of it is garbage. How are you going to identify that garbage if you don't even give yourself time to digest? One of my favourite writers always said we need to keep the mind on an information diet: keep the micros and the macros very well identified, and keep out the processed stuff, the misinformation, anything that is not helping you in any way, because it's not healthy for your brain. And your brain is part of your health, that holistic thing in you as a human, in your soul and in your mind. If you want that holistic health, then the information you consume matters too, because information is the food of your brain.
Deb: The last words I'd like to share with you are something I mentioned in our presentation too. It's a very simple action, a good way to connect with people, and it's just a smile. You don't have to speak a certain language; you don't need to be having a good day. But sometimes if you just see someone and give them a smile, it's a way to connect with them and say: I see you. You are human, and I acknowledge you. It just brings people together. It's not the solution to a lot of the turmoil we see in the world today, but it's a good start. And, you know, maybe that person needed that smile. You never know; you may have just made their day.
Yoyo: And you are such a consummate professional, Debs, because Gigi and I were making faces at you while you were saying that, and you stayed professional all the way through. Well done. KK has caught a plane, by the way, so she's left early. But listen, ladies, let's just rock the show. I can't wait to work with you again; each of you has been amazing. And I presented with KK last year at GSX; she's a very, very cool colleague to work with. So ladies, all the best for 2026, and thank you so much for joining us on the Security Circle podcast. Thank you.
Gigi: Thank you so much for having us; it was a great, great conversation. Thank you all.