The Security Circle
An IFPOD production for IFPO, the very first security podcast, called Security Circle. IFPO, the International Foundation for Protection Officers, is an international security membership body that supports frontline security professionals with learning and development, mental health, and wellbeing initiatives.
EP 161 Mind Sovereignty: A Former CIA Deputy Director on Trust, Deepfakes, and the New 'Trust' Battlefield
Episode Summary
In this episode of The Security Circle Podcast, Yolanda “Yoyo” Hamblen is joined by Jennifer Ewbank, former Deputy Director of the CIA for Digital Innovation, for a wide-ranging and deeply human conversation about intelligence, trust, and the evolving nature of security.
Drawing on more than three decades in human intelligence and senior leadership at the CIA, Jennifer reflects on her career operating in complex, high-risk environments across the globe — including her experience leading enterprise IT, cybersecurity, AI, open-source intelligence, and secure global communications at a time of unprecedented digital change.
A central theme of the discussion is trust — how it underpins intelligence work, leadership, and decision-making, and how it is now under sustained attack. Jennifer explains why the traditional security perimeter is no longer made up of networks and devices, but of human trust itself.
The conversation explores the accelerating threat of deepfakes and AI-driven manipulation, including how low-cost, high-quality synthetic media is reshaping fraud, influence operations, and executive decision-making. Jennifer introduces a practical framework for navigating this new reality, shifting the question from “Is this real?” to “What do I do before I act?”
Yoyo and Jennifer also delve into mind sovereignty — the ability to think independently in a world increasingly shaped by algorithms, emotional manipulation, and engagement-driven platforms. They discuss how information warfare, societal polarisation, and technological convenience are eroding critical thinking, empathy, and resilience at both individual and organisational levels.
This is a powerful conversation about the fifth generation of warfare, where the frontline is no longer physical or digital, but cognitive — and why protecting human judgement may be the most important security challenge of our time.
About Jennifer
As CIA’s former Deputy Director for Digital Innovation, she led the transformation of one of the most complex and secure digital ecosystems in the world, driving AI, cyber, and data strategies that positioned the United States for long-term advantage in the digital age. That experience now informs her work with boards and executives navigating high-stakes challenges at the intersection of technology, security, and leadership.
Today, she advises boards, executive teams, and founders on the convergence of national security, emerging technology, and enterprise risk. She brings decades of operational leadership together with practical expertise in AI, cyber, and digital transformation to help organizations build resilience, govern AI responsibly, and understand why geopolitical and cyber risks are now board-level imperatives. Her work is forward-looking: preparing leaders for quantum disruption, counter-AI threats, and an era defined by competition between digital freedom and digital authoritarianism.
A growing focus of her work centers on Mind Sovereignty™, or the ability of individuals and institutions to maintain independent judgment, agency, and clarity in a world increasingly shaped by algorithms, synthetic media, and adversarial influence operations. As manipulation becomes more personalized and more subtle, protecting freedom of thought is no longer abstract. It is a practical leadership challenge, with implications for decision-making, trust, and building resilience.
https://www.linkedin.com/in/jennifer-ewbank/
Security Circle ⭕️ is an IFPOD production for IFPO, the International Foundation for Protection Officers.
If you enjoy the Security Circle podcast, please like, share and comment, or even better, leave us a fab review. We can be found on all podcast platforms. Be sure to subscribe; the Security Circle is out every Thursday. We love Thursdays. Hi, I'm Yolanda, and welcome to the Security Circle Podcast, produced in association with IFPO, the International Foundation for Protection Officers. This podcast is all about connection, bringing you closer to the greatest minds, boldest thinkers, trailblazers, and change makers across the security industry. Whether you are here to grow your network, spark new ideas, or simply feel more connected to the world of protection and risk, you are in the right place, wherever you are listening from. Thank you for being a part of the Security Circle journey.
Yoyo:Jennifer Ewbank, welcome to the Security Circle podcast. How you doing?
Jennifer:I am doing great. I've been looking forward to our conversation for some time, Yoyo.
Yoyo:You've been looking forward to it? I've been telling everybody that I have got a former director of the CIA coming onto the Security Circle podcast. Still a little bit of a professional crush here, to be honest with you. But Jennifer, look... deputy, deputy. What did I say? Director. But do you know, that's all right. Well, it's still huge. Thank you for the correction. Look, it's important to set the precedent from the outset. You've held an incredibly important position in intelligence. Tell us a little bit about your career from a helicopter perspective, and why you are here today.
Jennifer:Oh yeah, thank you. So the elevator version is that I spent over three decades in the world of human intelligence and, later, digital tech. My job for decades was to move and operate and live around the world and collect secrets on the plans and intentions of America's adversaries, and to support our allies in doing that. I capped off that long career with a final role as Deputy Director of CIA for Digital Innovation for about four and a half years, from 2019 through the beginning of 2024. That encompassed enterprise IT, global secure communications, cybersecurity, open source intelligence, data, artificial intelligence, and a training and education mission for the CIA as a whole. So thousands and thousands of people globally deployed, billions in budget. And I always say this, and I believe it to be true: it was a period of really unparalleled complexity in the digital landscape. So a lot of challenges, a lot of opportunities, a lot of success, and just amazing people I got to work with along the way. And I think I'm here because, well, because you invited me, first. But second, maybe just to talk about some of those big issues that we're facing across the digital landscape.
Yoyo:Yes, because I already know that you are highly trained to... what was it you said to me just now? You said you've been asked for many years how to answer questions, or how not to answer them.
Jennifer:Oh no, it was that I spent over three decades being asked questions that were impossible to answer. But yes, I always found a way.
Yoyo:You talk about unparalleled complexity, and I think in our conversation today we're going to highlight the fact that that hasn't changed in terms of the digital threat footprint. But look, when I was introduced to you by a mutual acquaintance who has frequented the Security Circle podcast... he's in the alumni.
Jennifer:He's, by the way...
Yoyo:Hmm.
Jennifer:He is a huge fan, by the way.
Yoyo:Yeah, he's awesome. Pretty awesome, and a lot of fun. And I have to say, I didn't realize you were gonna be a woman. I already had a preconception that you were gonna be a guy, and I felt a little bit guilty for thinking that. But I guess that's not something that's strange to you, is it?
Jennifer:It's not, and I hope it's not a disappointment. No, of course, I lived and operated in a world that was largely male dominated. Certainly when I began, that was the case, and that was just the nature of the business. It was a job that drew people who were, I don't know, adventurous, bold, independent, resilient, and who were able to just pick up at a moment's notice and move halfway across the world. And that came out of a history where that job was largely filled by men, not exclusively; there were a few exceptions along the way. But when I joined, there were very few women as undercover operations officers. Of course, that's changed over the years, and there are many, many more women in that role now. So yeah, it's been great.
Yoyo:And I know you've put a really positive spin on that, but there must have been some times where you thought being your gender was both an advantage and a disadvantage, or maybe a perception that you were being judged differently because of your gender. Give us an example, a case either way.
Jennifer:I think that's absolutely true, and certainly early in my career that was the case. That was both because, culturally, that's where our country was in its journey, put it that way, and because, just practically, it was a job that did not draw many women. And by the way, I was a woman doing work as an American official, certainly informed by American culture and history and values, but none of that really matters when you're in somebody else's country trying to do the job. So there were moments of, I would say, pretty intense frustration, particularly as a very young woman trying to do a job that's impossible some days and certainly very challenging to do any day. And I didn't fit the model, if you will. There was a typical model as we were trained, and it had a whole lot of extroversion and gregariousness, and that wasn't me. In addition to being a woman, which was maybe a little bit of a factor, I'm a painful introvert by background, and I didn't realize at the time that many of our most senior intelligence officers and operational leaders, oddly enough, are introverts. But there were moments of intense frustration where I felt that I was treated differently because I didn't fit the standard mold, and candidly, there were moments when I really struggled to do the job in the way that I had been trained and expected to do it. And, if you don't mind the deep personal reflections here, there was a moment of real personal crisis, almost, as the French would say, where I had to dig deep, and I had to decide: okay, I am going to be successful at this job. There is no choice. I just need to figure it out. I had to inventory my skills, my background, my competitive advantage, because it's a competitive job like many, and look at my colleagues who were successful and figure out, okay, what was the lane I could occupy that nobody else could, where I could bring special skills that would make me successful? And I had to accept a fundamental fact, which was difficult, because everyone wants to be seen and appreciated and valued in their roles, right? And yet, in the operational world, when your job is to move unseen, unsuspected, undetected, being underestimated was, in a way, a superpower. It took me a little while; I had to compartmentalize that, because on the one hand, of course, I'm human and I wanted to be accepted and appreciated and seen. But on the other hand I thought, okay, wait a second here. I've got this set of unique skills and experiences that nobody else has. I can figure out how to use that to get from point A to B, to be successful, to drive an outcome. And by the way, if I can do that largely unseen and unsuspected overseas, in an environment where people will always underestimate me... amen, brother, that is on you. It was a superpower in that sense. So I recognized that being underestimated in that field, and I won't say that's across the board everywhere, allows you to observe what others miss, build trust, and move through spaces without being noticed.
Rather than this being just all about me, I think for anyone else who ever feels they're in that situation temporarily, where you're in a role or a career where women are not traditionally represented: okay, serious discrimination aside, fight that in every way you possibly can. But if you just have those moments, I think you use that period of being underestimated to prepare, to observe, to analyze, to figure out your competitive advantage, to build your alliances, and then you decide when and how you are going to be seen. And if you look at it as an active process of building something during that brief period, I think that overcomes maybe the psychological piece of it, and it certainly helps you come out the other end much better prepared for success.
Yoyo:Wow. I mean, I really resonate with you, because earlier in my career I used to feel like I had to use my elbows to keep my head above the water, to be seen and to be effective and to be good. And sometimes that used to aggravate people. As I've grown older, I've realized... I described myself to a colleague the other day saying that I feel like I'm a ball boy at Wimbledon, you know, that I'm there observing where the ball is all the time, because I don't wanna get hit by it, but I'm watching where the ball's going all the time. And there's a huge difference. And I started to find in my late forties, I know, I can't believe I'm that old, that I found the people who never bragged about their intelligence very attractive. You know, we all know those people who wanna talk super fast, brag about stuff, this is what I've done, and they think that's a sign of intelligence. But actually I found the pure intelligence in the people who knew stuff but didn't have to say they knew stuff. And then I started thinking, how can I adopt more of that humility? And now I sit and I think, I don't feel like I need to brag about what I know and what I do. People find out eventually, you know, if it's relevant. I always feel that what is for you doesn't go past you. And it's an evolution of dealing with the change, isn't it?
Jennifer:It really is. Almost inevitably, and whether it's gender or some other distinguishing factor, people are gonna have this period where they're uncertain, maybe they're uncomfortable, they're trying to make their name, trying to build a reputation, become successful, advance in their careers. And it's in human nature to feel like you just want people to know that you are intelligent, that you are productive, that you are successful. And it's just a process of discovery, and of setting aside the ego a bit and accepting that that can be shown in lots of different ways, and frankly more effectively. The last thing I'll say on the topic is that, in my case, and I do believe others find themselves in similar situations, when I was in that early role, pursuing a career that was unusual for a woman, and unusual for me based on my personality and my background, I tried to be what I had been trained to be. It's gonna sound strange, but I tried to be somebody else. And the reality is that success is always going to depend in part on your ability to build relationships and trust with other people. And people are perceptive; even if they can't put their finger on it, if you are not authentic, they will feel it. And you can't have trust without being authentic. So it took me a little time to realize that, wait a second here, I just need to find my own way to do the job, leveraging my own background and skills, acknowledging that I'm different. And once people can see sincerity and authenticity, okay, and then productivity and results, then things start coming together.
Yoyo:I asked Nick Gicinto in his interview what it was like having to lie to everybody all the time. And in fact I've asked a couple of people; there's another guy, Tom Kora, who was also very influential down in South America, working in the intelligence space. And he said something very similar, that they sometimes avoided social functions because they didn't want to just bat off the normal questions of, what do you do then, you know? And then I learned that here in the UK, somebody quite flippantly said that the intelligence folk in the UK, the security services, had basically said that they worked in IT if anyone asked them what they did, because as soon as normal folk hear IT, they're like, oh, okay, I dunno where to go with that one. Would you like another drink? Yeah, would you like another drink? And then, when I was on holiday a few years ago, I met a lovely woman, and there was just something about her. She was a solo traveler, I was a solo traveler, and I said, oh, what do you do then? And she said, oh, I work in IT. I said, oh, really? Me too. Whereabouts in IT do you work? And she said, oh, software, software production. And I said, oh, really? What kind? Like SaaS solutioning, you know, that kind of thing? And she just looked at me and said, oh my God, I used to work for GCHQ. Stop it! It was just really funny. Oh, that's pretty funny. She's retired. But I said, I didn't intentionally call your bluff, I'm just a curious person, but you've used that line that they always use when they don't wanna be asked any further questions. Also, on a funny scale, it works with cyber. If you say you work in cybersecurity, people go, okay.
Jennifer:Oh, except in the circles I move in, people are very excited and they wanna know everything. You know, that is true. But to Nick's comment, and your other guest's, in reality there were very few moments when one is forced to just flat out lie, and the lies aren't about, I don't know, matters of the soul or heart. In my case, it was the occasional necessity to have a different name, and that says very little about me. That's not about me, the person. Me, the person, was mostly authentic and sincere, and even when I had to skirt the question of, okay, well, what's your role in the embassy or in whatever company, I usually found a way to say it that was technically not a lie. It just was a helpful framing of the role. Things like, well, I follow political developments, or I follow bilateral relationships, or I follow multilateral fora, or I follow international terrorism, these kinds of things. Absolutely true, absolutely true. Now, for which organization? Different issue, but that rarely came up.
Yoyo:Do you know that once I went out with the girls and I met a guy, and when he asked me what I did for a living... this is when I was in the British police force, and I didn't really want to tell strangers that, so I told him I was a lawyer. And when he said, what type of lawyer are you? I said, criminal law. I knew I could talk about criminal law competently without having my alias rumbled. And he said, oh, so which side are you? Do you get them off, or do you get them... you know? And then that conversation just kind of escalated from there. But is it true, though, that in order to be a good liar, you have to tell part of the truth? And why is that?
Jennifer:Well, for me it was really psychology and human nature, right? You want to be authentic and genuine as much as you possibly can. As I was saying, people can sense when you're not; even if they can't put their finger on it, they won't trust you. And the intelligence business is really all about two things: fundamentally, it's about trust, certainly in the human intelligence aspect. Trust between an officer and a source, trust between the officer and headquarters thousands of miles away, trust between CIA headquarters and our policymakers. It's all about trust and risk, and you have to manage those two things. If you're weak on either one, then you're not going to be successful. So I think, whatever the state of the relationship between an officer and the source, there has to be trust, and that has to come from something authentic inside. It can't all be artifice; it can't all be just made up. Maybe there's a rare situation when it can be, but there has to be some humanity, some legitimate thing inside of you that they see, that they can tap into, in order to connect.
Yoyo:Yeah, I can see that. Look, I'm gonna ask you a few techie questions now. I wanna go into deep fakery to begin with, because this is about deep fakery and the information battle space, which I'm sure you've seen from a varying range of perspectives. We're entering an era now where what we're seeing is no longer necessarily to be believed. How do you think deepfakes and AI-driven persuasion will reshape trust in society?
Jennifer:I think that's a fantastic question, because it's an issue that has really come into its own this year. We've talked about deepfakes for years, and we've seen some pretty rudimentary examples, but this past year in particular, the confluence of technological advancements and distribution channels has come together to make deepfakes far more convincing and far more readily available. And so we've seen this huge uptick. I had the opportunity to give a speech in Riyadh, Saudi Arabia, a couple of weeks ago on the dark web marketplace for deepfakes. And the reason I mention this is because the availability, the sheer tidal wave of deepfakes right now, is really fueled by this rapid decline in cost and the explosion in distribution channels, largely in corners of the dark web that you and I and others are never going to see. There are actual marketplaces, just as you may remember there were for cyber tools years ago, where you've got customer service and customer reviews and packages for sale and reputations, and it's all there with pricing. All you need is a crypto wallet and an internet connection. So, a long way of saying that what I suggested in this speech was that there was a time, not long ago, and it's still relatively true, when in cybersecurity our perimeters were our endpoint devices, our firewalls, our network edges, our printers, our computers, our phones, et cetera. That's largely true today. But I do think that our perimeter is increasingly trust, and trust is under attack in all sorts of different ways. Certainly on the device side, you can see that playing out with very sophisticated cyber capabilities that are increasingly powered by, or made more sophisticated by, the use of AI. But with deepfakes, that is absolutely the case. You can no longer trust what you see or hear, and this is the big issue, because the technology is great, these things are widely available, and they're not that expensive anymore. There's a famous case that your listeners may well know about from a year ago, and you can imagine that the technology has advanced pretty dramatically since then. In this famous Hong Kong case, somebody joined a video conference involving senior officers of a financial company and transferred $25 million. This was over a year ago, and it was incredibly sophisticated, because you had near real-time animation of multiple executives, animated convincingly, and it felt real. So imagine, with the advances we've seen this year, what that looks like. On the dark web marketplace, you can buy these video conference packages for as low as a few hundred dollars a minute, which is not gonna be very good quality, up to a few tens of thousands of dollars a minute, which is gonna be pretty great. So we're no longer in a place where it's trust but verify. What I offered the crowd there, and I won't go into details, is a framework that I think is really helpful, one that turns us away from the question of "is this real?" to the question of "what do I do before I act?" And I think that's a key shift, because, okay, is it real? That's helpful. That's helpful. But here's a fun fact.
A company I work with, Reality Defender, who do deepfake detection at enterprise scale in real time, conducted a deepfake-spotting challenge a while back, not that long ago. They invited thousands of cybersecurity experts to participate, and they did. Now, the ability to spot the deepfake was only slightly better than tossing a coin. Think about that: here we had expertise and awareness, because they knew what was going on, and it did not equal certainty. That's why I think we have to put in place the right framework, one that moves us away from "is this real?" to "what do I do before I act?" So, three layers. Everything has to have a little name, so I called it the 3D stack: detect, disrupt, and discredit. Detect is where you use the technology to help you. It's not perfect, because the models can also be manipulated and attacked; they are under attack. So what works today, you can't assume is gonna work perfectly tomorrow. But they're very helpful, so use them, and feed those results to the people who make decisions in your organization. The second layer, disrupt, I think is equally important today. That is where you build in a bit, not a lot, a bit of friction at those key high-risk moments: moving large sums of money, releasing sensitive executive communications, or even, I would suggest, elevating privileges on your systems. I think of it almost as multifactor authentication for decision-making, using analog means where you can to just verify, because you can't believe what you see and hear. We're in this crazy moment where the technology is rapidly advancing and almost ubiquitous, and yet people and institutions still believe what they see and hear. Those two things are colliding right now. So it's detect, disrupt, and then the last one, not to go on at such length, is discredit. And this is where I think a lot of organizations might fall down a bit, because they're not ready. The last survey I saw, only 14% of organizations in the US felt prepared to deal with a deepfake attack of some sort, and fewer than one in five actually had a playbook for it. So, as I suggested in the speech, that's an open door for criminals. The discredit part is really important, because you have to be able to show what is true, and the thing where I think people might misstep a little bit is that it has to be fast. It cannot be, I'm gonna go back through the legal department and the communications department, and in four or five days we're gonna have a statement that we put out. It can't be that. You have to be prepared, you have to have some kind of process in place, and you have to have stakeholders who understand how it works. Taiwan has, I think, one of the most effective models. They are one of, if not the most heavily targeted place on earth for deepfakes and disinformation, and they have their 2-2-2 framework: each identified falsehood of any significance, not every single thing, but each one of any significance, they will counter within two hours, with no more than two images that show what is true and no more than 200 words. That speed and clarity, and the trust they've developed over time, are really, really valuable.
And I think companies and large organizations can learn something from that experience to prepare for the inevitability of deepfake attacks. It is not "if" anymore; it really is just "when".
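To make the "disrupt" layer concrete, here is a minimal sketch, in Python, of what "multifactor authentication for decision-making" could look like: high-risk requests (large transfers, sensitive executive communications, privilege elevation) are held until out-of-band checks are recorded, no matter how convincing the voice or video seems. All names, thresholds, and required checks below (ActionRequest, requires_friction, the 10,000-dollar threshold, the callback and second-approver checks) are hypothetical illustrations, not anything prescribed in the episode.

```python
# Minimal sketch of the "disrupt" layer: add a little friction at high-risk
# moments, regardless of how convincing the request looks or sounds.
# All action names, thresholds, and required checks are hypothetical.
from dataclasses import dataclass, field

HIGH_RISK_ACTIONS = {"wire_transfer", "release_executive_comms", "elevate_privileges"}

@dataclass
class ActionRequest:
    action: str                 # e.g. "wire_transfer"
    amount_usd: float = 0.0     # only meaningful for transfers
    requested_by: str = ""      # identity claimed on the call, video, or email
    verifications: set = field(default_factory=set)  # out-of-band checks recorded so far

def requires_friction(req: ActionRequest, transfer_threshold: float = 10_000) -> bool:
    """Decide whether this request must pass out-of-band verification."""
    if req.action == "wire_transfer":
        return req.amount_usd >= transfer_threshold
    return req.action in HIGH_RISK_ACTIONS

def approve(req: ActionRequest, required=frozenset({"callback", "second_approver"})) -> bool:
    """Approve only once every required analog / out-of-band check is recorded."""
    if not requires_friction(req):
        return True
    missing = required - req.verifications
    if missing:
        print(f"HOLD: {req.action} from '{req.requested_by}' pending: {sorted(missing)}")
        return False
    return True

# Usage: a convincing video call asks for a large transfer.
req = ActionRequest("wire_transfer", amount_usd=25_000_000, requested_by="CFO (video call)")
print(approve(req))                                # False: held for callback and second approver
req.verifications.update({"callback", "second_approver"})
print(approve(req))                                # True: only after out-of-band confirmation
```

The point is the shape of the control: the gate never tries to decide whether the request is real; it only enforces a pause and an independent channel before anyone acts.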
Yoyo:Wow. And I'm not surprised. Look, you said AI tech has advanced in a year. I mean, it has, because I regularly see on my Instagram feed two talking doggies. One is a golden retriever, and the other one's some type of boxer variety, and they're always talking about their owners into a podcast mic, and I find them absolutely fantastic.
Jennifer:Oh yeah, I would watch that all day long.
Yoyo:Yeah. In fact, the other day they were saying... it's the retriever that leads the conversation, I think he's the lead host, and he's got a great voice, and he says, do you know why we get so sad when our owners leave the house? And the other one's going, yeah, dude, I get really sad when my owner leaves the house. He goes, well, you get sad because you never know whether they're gonna come back again. It's just awesome how the creator has combined deep animal empathy, and how we love our pets, with this AI technology, which is clearly deepfakey. It's like a real dichotomy. But look, realistically, what becomes the new source of truth?
Jennifer:Yeah. Yeah.
Yoyo:I see genuine stuff and I see people just going fake, fake, fake, and I'm thinking, ah, you know, that's not great. Not everybody has the ability to be a critical thinker, and that's a bit of a disadvantage, certainly when it comes around to election time and listening to politicians. But when so much can be fabricated, what can we do to advise people to look for their own source of truth?
Jennifer:Yeah, that's a great question. And I think we're in this messy transition phase toward, let's say, more universal approaches to that, because this is still early days, and this entire situation is only gonna become more complicated. So, certainly where available, we have to lean into provenance and verification tools. Provenance, not vibes, right? And there are tools out there: watermarking, cryptographic ways of doing that. Certainly the law enforcement field is facing this every single day, like, how do you secure chains of custody? So there are technical approaches that I think are coming together. It's a little bit slow, a lagging capability, but I think turning to trusted outlets, some kind of independent fact checking, complemented by detection tools that tell you whether something is AI generated or not (they're not perfect, because they're easy to fool), that's one framework. The second is trusted human networks. But the one thing that's been on my mind a lot lately, and I've posted a bit about it on LinkedIn, whether we're talking about deepfakes or disinformation, is just training ourselves. I'll back up. The first piece is being aware of how social media feeds work. They're optimized for engagement, not for enlightenment, and that's just a fact. And engagement, as many studies show, is greatest when there is some kind of emotional content. There was even one study, and I'm gonna forget the reference, forgive me, it's well known, that demonstrated that falsehoods travel faster and farther than truth, in part because they are more likely to have that emotional hook. They're more likely to trigger the human desire to show that I'm in the know, that I know something novel, and that I can spread this. So I think awareness is really important; there's a lot of education that needs to be done in that regard. I've also advocated for a kind of digital literacy training for youth in America, combined with a renewed focus on civics education, both of which are needed and underrepresented in schools. So awareness is the first piece. The second piece is just training ourselves to watch for those emotional triggers. Does this, whatever it is, spark emotion? And it's not just anger, outrage, disgust; those are heavily relied upon in disinformation and deepfakes to pull people in, if you will. But it's also excessive pride, even joy. It's emotion generally. So train ourselves to recognize it, and then pause for just one moment before we think, okay, I need to share this with half the world. There's another study, and again I'm gonna forget the reference, so forgive me, that demonstrated the effect of a brief, I think they called it a micro-prompt. So I have encountered this amazing piece of information that everyone in the world needs to know, and just before I could repost it or send it on to hundreds of people, a micro-prompt simply asked: does this help or harm? And are you confident that it's real? It was just that small moment, where all you had to do was click and move on.
But by asking that question and planting it, it actually cut dramatically the further dissemination of what was ultimately disinformation or deepfakes. So that is something we can train ourselves to do, and we do have to accomplish this, however, almost in opposition to the way the platforms are optimized. I don't blame them per se, it is business. That said, we as consumers of that business just need to be a bit smarter about how it plays upon our human emotions and our psychology.
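As a concrete illustration of "provenance, not vibes", below is a minimal sketch of verifying a media file against a signed manifest before trusting or resharing it. Real provenance schemes (watermarking, content credentials, law-enforcement chains of custody) use public-key signatures and far richer metadata; this simplified sketch uses a shared-key HMAC purely to show the shape of the check, and the file name, key, and source string are hypothetical.

```python
# Minimal sketch of "provenance, not vibes": check a media file against a
# signed manifest before trusting or resharing it. Real content-credential
# schemes use public-key signatures and richer metadata; this simplified
# version uses a shared-key HMAC, and the file name, key, and source
# string are hypothetical.
import hashlib, hmac, json

def sha256_of(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def make_manifest(path: str, signing_key: bytes, source: str) -> dict:
    """Publisher side: hash the content and sign the manifest."""
    body = {"source": source, "sha256": sha256_of(path)}
    payload = json.dumps(body, sort_keys=True).encode()
    return {**body, "signature": hmac.new(signing_key, payload, hashlib.sha256).hexdigest()}

def verify(path: str, manifest: dict, signing_key: bytes) -> bool:
    """Consumer side: re-hash the file, then check both the hash and the signature."""
    body = {"source": manifest["source"], "sha256": manifest["sha256"]}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"]) and sha256_of(path) == manifest["sha256"]

# Usage (hypothetical file and key):
# key = b"shared-secret"
# m = make_manifest("press_statement.mp4", key, source="Acme Corp newsroom")
# print(verify("press_statement.mp4", m, key))   # True only if the file is untampered
```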
Yoyo:I am quite picky about where I get my data from, certainly on the internet. But something really interesting came up, and I'm not gonna be able to articulate this beautifully, because I read it a little while ago and I've drunk some red wine since. But ultimately it was about Meta realizing, and it's publicly known, that the algorithms designed to keep folks' attention are the algorithms that contribute to business growth. They've understood the correlation: they can't have business growth without the algorithms that keep people's attention, right? And they know this, and they've decided to disregard it. Mark Zuckerberg was quoted as saying something along the lines of, don't bring this to me again, it's not gonna change our business objectives in the way that we want it to.
Jennifer:There was a book about a decade ago that actually outlined how to design digital products with this emotional hook, to draw people in and keep them. And it was written not from a place of irony or satire, but really: if you are a software designer, how do you do this? Today, of course, we look at it very differently, because we've started to see the negative impacts. Social media has done a lot of wonderful things. It's connected a lot of people and helped us share information. That's all wonderful, but it's also had a number of very harmful effects on society. And we're coming slowly, I think, to the realization that we need a different sort of relationship with digital tech generally, and with online platforms specifically, if we want to continue receiving the benefits of this amazing technology while protecting our psychological state. I talk a lot these days, or think a lot, about something I'm calling mind sovereignty. This is perhaps not the best way to think of it, but it's the ability to think for yourself, think independently, reach your own conclusions, in a world that's designed to think for you. And that's really what is happening when you have adversaries around the world trying to weaponize our information ecosystem against us, in order to drive fissures in society, in order to tap into all the stuff that's out there. Okay, that's one side. The other side, however, is what we welcome every day with open arms in our engagements with commercial tech: algorithms that are optimized, again, for engagement, not enlightenment. And all sorts of things flow from that. So I think it's intuitive that we tend to find ourselves increasingly in echo chambers, not completely, but more or less, fueled by the algorithmic amplification of things we believe, and we tend to cluster with people who share our worldview. And there was another really interesting paper, put out recently by a psychologist, that talked about that information environment. Staying in that environment for extended periods of time, you start to project onto the other, the people who disagree with you, that they feel hostile towards you, and you actually lose interest in learning what that other side believes. And with it comes the collapse of something he calls civic empathy, which intuitively made a lot of sense to me, because you do see this lack of empathy, certainly in the US, across divides, whatever those divides might be. And we need that back. Maybe this one isn't rooted in science, but it's rooted in decades of, I don't know, studying human behavior: empathy exists where you can actually connect across differences. So that's something that's under attack, both intentionally by adversaries and unintentionally through the way we engage with digital media.
Yoyo:So I'm gonna go off script a little bit. There are five generations of warfare that I've discussed quite openly, and I've got a specialist coming in to talk about this at some point. In fact, we were hoping to talk today about the five generations, but you got stuck with me instead, and do you know, that's by no means a bad thing. The first one we know is Napoleonic. The second one we know is World War I. The third one we know is World War II. The fourth one is the generation of terror. And we are now in the fifth one, the fifth generation of warfare, where the frontline is the mind of every citizen: mind-occupied territory. I certainly have talked about it, and I've promoted talking about mind sovereignty on the podcast. This is gonna be a key thing for the next year. And the reason I wanna put this into perspective is because I think businesses, when they submitted their technology roadmaps for 2025 five years ago, not one of them would have had AI in them. Four years ago, three years ago, two years ago, possibly even just one year ago. So what's coming up in terms of how mind sovereignty affects risk overall? We're not just talking about the risk to the individual, the risk to a family, the risk to a community, the risk to a country's sovereignty, because we can see how certain presidents become undermined by losing sovereignty through campaigns that are externally driven. We know that happened to the UK with Russia and Brexit. But what about the business, the corporation, where the individuals are losing their mind sovereignty? It's a big subject and it's a big question. Where do you wanna start with that?
Jennifer:Oh, goodness.
Yoyo:I know... how to tackle that?
Jennifer:So, maybe this will be an unexpected way to frame it. It is a really tough challenge for an organization, and I think of it in two layers. First, because an organization, whatever your organization is, a government agency or department, a large business, a small business, you can't control what everyone in your organization does or thinks, and you shouldn't try. It's not someone else's job to tell me what to think, or even how to think. But what I do think is that in organizations where trust and transparency are strong, you can have conversations about the culture that will maintain success. It will maintain, let's say, a rich information ecosystem that is more likely than not to be reliable, and what flows from that is an individual responsibility to be a smart consumer of that information. If you have an organization where, again, you have trust and transparency and good communication, you can have conversations about the challenges we face as individuals today, and you can promote ideas like how best to engage with information online so that you can spot falsehoods or, at a minimum, stop the spread of them. I've already mentioned some, let's say, techniques, or a mental model one might want to use. The one I think I left out was intentionally diversifying one's information diet, and a lot of people aren't willing to do that. I view it as a bit of a game. I gamify a lot of things, so my job is to completely confound the recommendation algorithms so that they have no idea what to feed me next. And what I find in that process is that you do locate individuals or platforms with which you may not agree politically or culturally or whatever it is, but you can actually appreciate that they are thoughtful. I think we need to get back to that as sort of a default, if you will. So again, organizationally it's trust and transparency, but what flows from that is a collective responsibility for each individual to contribute to the organization's success in this crazy environment by taking personal responsibility for how they engage with digital technology, particularly in the information environment.
Yoyo:Yeah. I just don't see it happening, though, in a sense, because I think when we look at human nature... and I tend to, and sometimes it's a weakness and a strength, but I tend to look at everything from a helicopter view. I see society systemically always choosing the path of least resistance, in everything from a physical security example, where everyone cuts across the pathway and you've got this muddy trail in the grass, which is a wonderful example, down to people becoming lazy and unhealthy because it's an easier option than working hard and keeping healthy. There are so many parts of human psychology involved in this, and I don't know that the human race is designed to be more resilient to the challenges we have coming.
Jennifer:You're absolutely right. And I hate to think of it this way, but dramatic change, where one accepts inconvenience and friction, is generally precipitated by some dramatic, catastrophic event. Think about it, and I'm not saying this will come, but prior to 9/11 there was very little acceptance of inconvenience and friction at airports. Following 9/11, we all embraced the need to have multiple layers of controls and security. So, is there going to be some sort of a 9/11 in the information space that gets us to more broadly accept that we have to be more discerning and actually apply critical thinking? I hope we don't get to that. I hope we can promote critical thinking and responsible behavior online without a catastrophic event. But if there is one, then people will want to change.
Yoyo:So I've got a big question for you now. I mean, you're in the Security Circle, only big questions here, Jennifer, and I think you're up for it. But this is nicely tying into whether you think the biggest threats to national security are external, or whether they are internal vulnerabilities within our own technological and societal structures. The baseline equivalent to that would be: am I more scared of the house burglar, or of the fact that I haven't put enough burglary prevention on my home?
Jennifer:It's a terrific framing of the issue, Yoyo. And I have no other response except to say that it is both, but the success of those external actors is in part amplified by the internal issues that we have. So the fact that we have a polarized society, that our digital ecosystem, our information ecosystem, rewards engagement and emotion and division over enlightenment and community, the fact that we have some institutions that have become more fragile in the face of stresses in society. We have all of that, and I think a lot of those issues are, as I alluded to earlier, at least contributed to by the way our information ecosystem is designed. So we've got that, but for external actors, hostile adversaries in particular, much of their success depends on, or takes advantage of, the fact that we have these internal vulnerabilities in both our technological and our societal structures. So it is both. And fundamentally, an adversary's job is a whole lot easier when society is brittle, and it's brittle.
Yoyo:Unfortunately. So now I've got one last question for you.
Jennifer:Oh, goodness.
Yoyo:I'm gonna answer it myself first, by saying that if I...
Jennifer:That'll give me some ideas.
Yoyo:I know, right? But can you beat this one? If I could redesign one aspect of the digital world to strengthen human resilience, I would use what I call the RoboCop principle: that human life should be protected at all costs. And I know this is very difficult, because in the military that objective, that primary directive, is completely disregarded, which does tell me that society in general has a disregard for human life. I believe in the security-by-default principle. If only we were really thinking in that way, then AI wouldn't be an existential threat to human existence, even though we know it has its limitations at the moment. We can see that it has limitations, but we can't see five years down the line with AI, because it's literally evolving every six months. What would you do to redesign one aspect of the digital world to strengthen human resilience?
Jennifer:It's a good question. I'm very cautious about how we implant our ethical frameworks in AI, because one only has to look back to accept the fact that we are not a being that has reached peak perfection, and we do not have the answer to everything. Things that humans believed a hundred years ago appall us today, and things that we believe today will likely appall humans a century from now. So hard-wiring our ethics and values is a really treacherous thing right now, if we're thinking about the future of humanity and existence writ large. I think the one thing I would redesign, if I could, maybe hits a little closer to today's situation. I would love, and this is not possible yet, but maybe there's somebody out there who could figure out how to do it while balancing privacy and technology, for each person to have a kind of transparent dashboard of their attention. There are little bits and pieces of that you see on devices: time spent on platforms and that sort of thing. But something that goes deeper, that shares with you how much of your attention is spent on synthetic content, on emotional content designed to trigger a habit loop, and maybe a map of the information ecosystem that you visit, with clear representation of what else is out there beyond your terrain. So, a graphical representation, in a sense, of echo chambers, algorithmic amplification, and where you choose to focus your attention and time, with maybe, as a bonus, an optional little button you could push to prompt you to expand your information diet, to create weaker echo chambers, and to show more of what else is happening out there and other viewpoints. I think that would do a lot to rebuild empathy, going back to one of my previous comments. It also would do a lot to prompt more original thinking. My own experience, and I think it's borne out by psychological studies, is that new insights really do come more often through the collision of ideas than through the amplification of what you already believe. So helping people make a choice to engage with the information world that way would, I think, be super helpful.
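The attention dashboard described here does not exist as a product, but a rough sketch of the bookkeeping behind it might look like the following: aggregate minutes of attention by outlet, estimate the share flagged as synthetic or emotionally triggering, and nudge when one outlet dominates. The categories, the sample data, and the 0.6 "echo chamber" threshold are all hypothetical.

```python
# Rough sketch of an "attention dashboard": aggregate where attention goes,
# estimate how much of it is synthetic or emotionally triggering, and nudge
# toward a more diverse information diet. Categories, sample data, and the
# 0.6 "echo chamber" threshold are hypothetical.
from collections import defaultdict
from dataclasses import dataclass
from typing import List

@dataclass
class AttentionEvent:
    outlet: str               # where the attention was spent
    minutes: float
    synthetic: bool           # flagged as AI-generated or synthetic media
    emotional_trigger: bool   # flagged as engineered for outrage, pride, etc.

def dashboard(events: List[AttentionEvent]) -> dict:
    total = sum(e.minutes for e in events) or 1.0
    by_outlet = defaultdict(float)
    for e in events:
        by_outlet[e.outlet] += e.minutes
    top_share = max(by_outlet.values(), default=0.0) / total
    return {
        "total_minutes": total,
        "synthetic_share": sum(e.minutes for e in events if e.synthetic) / total,
        "emotional_share": sum(e.minutes for e in events if e.emotional_trigger) / total,
        "minutes_by_outlet": dict(by_outlet),
        "diversify_nudge": top_share > 0.6,   # one outlet dominating suggests an echo chamber
    }

# Usage with made-up numbers:
events = [
    AttentionEvent("FeedApp", 90, synthetic=True, emotional_trigger=True),
    AttentionEvent("LongformOutlet", 20, synthetic=False, emotional_trigger=False),
]
print(dashboard(events))
```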
Yoyo:Maybe it begins with the youth.
Jennifer:Well, you know, I have a lot of optimism, and at least from my limited sample set, I think we have a generation that's rightly skeptical of technology, but not really cynical, if that makes sense. And we see that there are more and more conversations about what rational guardrails look like, for AI, for digital identities, for authenticity. We're in that messy early stage, as I suggested earlier, but we're having those conversations. The only thing I would say, in order to maintain my optimism, is that people should recognize that ambivalence to all of this is the enemy at the moment. The path of least resistance is just to kind of go along, and, as I've suggested in a couple of talks I've given, we hand over the curation of our reality to algorithms for convenience, entertainment, and dopamine hits of digital validation. So I think ambivalence is the enemy in all of this. Independent thought is no longer a given. You have to fight for it.
Yoyo:What a great lesson. Jennifer, we have touched the tip of the iceberg in terms of what you can offer. Let's definitely find a way to work together, and also I would love to have you back to talk some more. I would love both. So thank you so much for joining us on the Security Circle podcast.
Jennifer:Thank you, Yoyo. It's such a pleasure.