AI or Not
Welcome to "AI or Not," the podcast where digital transformation meets real-world wisdom, hosted by Pamela Isom. With over 25 years of guiding the top echelons of corporate, public and private sectors through the ever-evolving digital landscape, Pamela, CEO and Founder of IsAdvice & Consulting LLC, is your expert navigator in the exploration of artificial intelligence, innovation, cyber, data, and ethical decision-making. This show demystifies the complexities of AI, digital disruption, and emerging technologies, focusing on their impact on business strategies, governance, product innovations, and societal well-being. Whether you're a professional seeking to leverage AI for sustainable growth, a leader aiming to navigate the digital terrain ethically, or an innovator looking to make a meaningful impact, "AI or Not" offers a unique blend of insights, experiences, and discussions that illuminate the path forward in the digital age. Join us as we delve into the world where technology meets humanity, with Pamela Isom leading the conversation.
E053 - AI or Not - Dr. David Bray and Pamela Isom
Crises don’t wait for perfect plans. We sit down with Dr. David Bray—previously IT Chief for the Bioterrorism Preparedness and Response Program, a Senior National Intelligence Service Executive, FCC CIO, Executive Director for two bipartisan Commissions on tech and geopolitics, and long-time public servant—to explore how leaders make clear, ethical decisions when information is incomplete, politics are hot, and systems are already under stress. From modeling anthrax response and spotting SARS via garlic prices and hospital parking lots to steering the FCC through a bot-fueled flood of 23 million public comments, David unpacks the tactics that keep organizations learning while they act.
We dig into gray zone conflict and why free societies must incubate technology with intention. Hidden chips, untrusted hardware, and disinformation aren’t sci-fi threats; they’re daily realities that call for independent verification, resilient supply chains, and cross-functional playbooks. David argues the real AI risk isn’t a rogue superintelligence but the erosion of the shared commons—those social spaces and norms that let diverse people reason together. He shows how to counter that drift: break silos, invite dissent backed by data, and create structures where rival perspectives do their tug-of-war out loud.
This conversation offers a practical decision toolkit for the AI era. Build a personal board of advisors across humans and machines. Ask for sources, climb to the balcony for a systems view, measure decision elasticity, and plan pivots before you need them. We also talk about the difference between managing and leading, how to process loss without losing momentum, and why moral courage—stating what’s true when it’s inconvenient—still matters. Guided by Rawls’s veil of ignorance and a bias for service over spectacle, David leaves us with a clear charge: protect the commons, choose curiosity over certainty, and be the change where you stand.
If this resonated, subscribe, share with a colleague who leads through uncertainty, and leave a review telling us your favorite takeaway. What’s your pivot plan when clarity lags?
[00:00] Pamela Isom: This podcast is for informational purposes only.
[00:26] Personal views and opinions expressed by our podcast guests are their own and are not legal,
[00:34] health, tax, or professional advice, nor official statements by their organizations.
[00:42] Guest views may not be those of the host.
[00:50] Hello and welcome to AI or Not, the podcast where business leaders from around the globe share wisdom and insights that are needed right now to address issues and guide success in your artificial intelligence and your digital transformation journey.
[01:05] I am Pamela Isom. I am your podcast host and we have a really special guest with us today, Dr. David Bray.
[01:14] He's a former CIO with FCC.
[01:17] He's a leader, a partner.
[01:19] He says something about guiding success amid turbulence in tech, data, space, biotech, AI and people.
[01:27] So we're going to be talking about that today.
[01:29] David, welcome to AI Or Not.
[01:32] Dr. David Bray: Great to be here with you, Pamela, and I look forward to our conversation today.
[01:35] Pamela Isom: All righty then. So let's jump right in. Will you tell me more about yourself,
[01:41] your career journey?
[01:43] And I'm really curious about if there were moments where you chose the harder or less traditional path.
[01:54] Dr. David Bray: Well, thank you for that. And I guess the short version I sometimes tell people is I fell on my head at an early age and it made all the difference. I guess the slightly longer version was I was crazy enough to start working for the US government when I was 15.
[02:08] People automatically assumed that I must have done some cyber hacking event and gotten in trouble. But no, actually it was through science fairs. My father was a Methodist minister, my mom was a schoolteacher, and my teenage rebellion was to try and figure out who my dad worked for.
[02:22] So I created computer sims of different natural phenomena. And this was the early 90s. And so one of them was ozone layer deterioration.
[02:30] And with that science fair project, it turns out the US government uses science fairs to recruit. Initially it was in 1993 through the Department of Energy. They offered me a job because they said, hey, you can do computer simulations.
[02:41] We've got a job for you. And they had me modeling electron and neutron beams that were being thrown against the walls to see what would happen.
[02:49] And next thing I know, in 1995,
[02:52] I actually got an offer to work on some classified efforts. I was only 17 at the time, with what turned out to be the Ballistic Missile Defense Organization and the Air Force.
[03:01] Went off to college at Emory University in Atlanta. Swore I was going to do something other than computer science, did evolutionary biology, but worked my way back to computer science because I couldn't get away from it.
[03:10] Built a model of HIV/AIDS in South Africa and then found myself actually on the ground in South Africa. So that may be an example of picking the harder path, because in 1998 apartheid had ended, which was a good thing.
[03:24] And at the same time, they had gone from having 5,000 cases of HIV/AIDS to 2.4 million cases, out of 40 million people.
[03:31] And so I was trying to call attention to it. I was being told, write about politics, write about football. And I was like, no, you've got an epidemic here. And I ended up quitting the paper and instead teaching about HIV/AIDS in the local school system.
[03:43] Came back, told my professor at the time in college that I wanted to take a year off.
[03:48] Worked with Microsoft, basically doing Fortune 500 IT integrations, early days of the Internet and intranet efforts,
[03:55] and volunteering with Habitat for Humanity as a crew leader.
[03:58] Came back to finish my degree,
[03:59] got approached by another program from the government called the Bioterrorism Preparedness Response Program.
[04:05] We were only 30 people at the time, this was 2000.
[04:08] Congress said we didn't need to exist, we were a Cold War relic. And we said there's reasons. And I was an early adopter of agile development as an approach. So iterative development versus waterfall.
[04:19] But I was getting in trouble because they said, you're not spelling out all your requirements up front. And I said, yes.
[04:25] You're not following the five-year enterprise architecture. I said, yes.
[04:27] I actually sent a memo in June of 2001 saying we do not have a deal with bad actors or Mother Nature not to strike until we have our IT systems online.
[04:36] And it was actually scheduled weeks in advance for me to give a briefing to the CIA and the FBI as to what we would do technology-wise, should a bioterrorism event happen.
[04:44] Unfortunately, that briefing was scheduled for September 11, 2001 at 9 o'clock in the morning.
[04:48] At 8:46, as we know, sadly, tragically, the world changed. We had the 9/11 events.
[04:52] We responded to the events of 9/11. Stood down from high alert October 1st.
[04:56] I ended up briefing the CIA on October 3rd.
[04:59] Then the first case of anthrax showed up in Florida on October 4th, followed by the Capitol Hill threat letters and the New York threat letters and everything else from October 6th onwards.
[05:07] And had we not done agile development,
[05:09] we would have had to handle 3 million environmental samples and 300,000 clinical samples by fax as opposed to electronically. So I often tell people, how do you know who's your heretic?
[05:19] And how do you know who is actually trying to do their best to get the organization moving forward? But that was the start of my career, and it's been eventful ever since.
[05:26] But I think I always pick the harder roads,
[05:29] partly because they're the roads that you need to travel.
[05:32] Pamela Isom: That's just fascinating. I knew some of that, but I didn't know all of it. That is just fascinating. So tell me about this anthrax.
[05:38] Dr. David Bray: This was. We were only about 30, 40 people. We had a thing called a laboratory response network for bioterrorism. This was 2000, 2001,
[05:46] in which several people. I'm going to give a shout out to a pioneer at the time, Richard Kellogg and others had basically developed interoperable lab protocols because there were different lab sites.
[05:57] Some of them were Department of Defense labs, some of them were FDA or CDC labs,
[06:02] some of them were USDA labs. And we hadn't ever agreed to a protocol. If someone brought a sample that they thought was bioterrorism in nature, how do you test it?
[06:11] What does a presumptive positive mean? What does a confirmatory positive mean?
[06:15] And so we had standardized that, and what that actually meant, in the testing protocols, but then also on the epidemiology side. I can remember, even before the events of 9/11,
[06:24] asking epidemiologists, so what are we going to do in terms of data collection if, say, an anthrax event happens? And what you realize is,
[06:32] and I love medical doctors,
[06:34] they don't really have a plan until they see the patient.
[06:36] And so they're like, we'll deal with it when we see it. And it's like, it's really hard to build an IT system if you don't know at least some playbook.
[06:45] And so I had to balance fluidity and flexibility with the fact that, in reality, a lot of this is the unknown. You don't know if it's going to come from a human source or an animal source, or if it's going to be intentional or accidental.
[06:58] Later, in 2003, we actually had to deal with that.
[07:01] We actually knew about it about five and a half months before the original coronavirus, SARS, was on the world stage. And there are two ways I can share that we knew about it.
[07:10] One was, and this was five and a half months before March of 2003, so it was late 2002,
[07:15] the price of garlic had gone up tenfold. And in Southeast Asia, garlic's seen as a cure all. So if you see a price spike in garlic, that can tell you maybe something's happening where people want a cure.
[07:26] But then two,
[07:27] not tracking individual people, but just seeing that there were reflections of more cars in parking lots around hospitals in certain parts of China,
[07:34] and fewer cars in parking lots around factories, was enough to make us say, is something going on here? We asked the State Department to ask the Chinese government. And for China,
[07:42] public health and national security are the same. So of course they said no. It was about a week before St. Patrick's Day in 2003. I was ready to take home leave at about 6 p.m., and I got told no instead.
[07:54] Vietnam had raised a flag, and the rest is history. And so that was called atypical febrile illness. It was later called severe acute respiratory syndrome, or SARS. Fortunately, it did not become a pandemic like Covid did.
[08:05] But if you fast-forward to COVID-19,
[08:08] the same patterns happened. Price spike in garlic. The only difference was with the satellite footage: it was now available through commercial satellites versus government satellites.
[08:18] Pamela Isom: Ah, that is so. Well, I'm very thankful for you and the work that you've done. Congratulations on your success.
[08:24] Dr. David Bray: Oh, I don't know if it's success,
[08:25] but I appreciate that. I would just say, people often ask me what I recommend if they want to be of service. And I say, go to a place where your talents can be of use, and I guarantee you events will happen where hopefully your talents can be of use.
[08:39] Pamela Isom: Okay, so I think that was a good example of the harder and the less traditional path. Some of it voluntary, some of it not. But that's the reality of this world that we live in, and we have to be prepared and embrace it.
[08:51] So I appreciate it.
[08:52] You often talk about incubating new technology efforts.
[08:57] We've had some discussions about that. What does that mean in practice? And why is this mission so important to you right now?
[09:05] Dr. David Bray: So after about four or five years of responding to both human caused bioterrorism and also natural caused public health events, my boss at the time said you can either wait 20 years,
[09:19] get a rank of Admiral General, get a medical degree, or get a PhD. And I decided to get a PhD. I thought that was easiest.
[09:25] And I ended up going to a business school focusing on how to improve organizational response to disruption with both technology and people, because I feel like you have to do both.
[09:35] And after doing that, I raised my hand. I did postdocs at MIT and Harvard, and did what everyone does after a PhD and postdocs: I raised my hand to go to Afghanistan. Because if you remember, they kept Secretary of Defense Gates in place from President Bush 43 to President Obama as the SecDef at the time.
[09:51] And so I was a nonpartisan who was reporting to both the Office of the Secretary of Defense for Policy as well as the Executive office of the President, which meant I was going to upset somebody.
[10:00] And about 30 days in, I gave my first sort of briefing back and I said, I'm not sure why we're still here in the sense that we said we're leaving in one or two years from Afghanistan.
[10:11] And on the other hand, we say we're here to bring democracy to Afghanistan, which I would submit is a 40 or 50 year endeavor,
[10:17] partly because literacy is only 20%, and that's if you're a male.
[10:20] We've got to build up education, we've got to build up literacy if we really want to do democracy. But I would submit that probably what we want to do is have uniformed troops leave Afghanistan and instead, because it's not really a nation at the moment, have special forces go to the 13 different tribes offering them aid on an annual basis per the Pashtun code,
[10:39] as long as they promise not to harm us, the west, our tribe. And if you don't want to do that,
[10:44] just invite the United Nations to play a peacekeeping role, with possibly India and/or China making up the bulk of the forces.
[10:50] Well, unfortunately, that didn't happen, for reasons. But again, we picked a harder road, not because it's easy or it's going to get us glory. We just do it because you need to.
[10:59] And when I came back,
[11:01] I would say in the last 15 years,
[11:04] unfortunately, I've seen the US become more like Afghanistan than Afghanistan become more like the US. And what do I mean by that?
[11:11] As a result of the Internet,
[11:13] we've become more divided and tribal. As a result of the Internet,
[11:16] small groups of people or even individuals can cause massive amounts of harm.
[11:21] And so this has led to what several, including myself, call gray zone conflict,
[11:27] where it's things that are not exactly tossing out civil norms, tossing out ways that we work together.
[11:34] And I often feel like I need to help incubate tech efforts as a way to show how free societies can do things better.
[11:46] Because I think if you don't incubate the tech effort right,
[11:49] that same technology can be used against us.
In fact, a lot of it is. And I often say that technology itself is amoral. It's how we choose to use it. This glass in my hand right here,
[12:00] you know, it can help me drink water; when someone's thirsty, it can provide needed liquid. But it can also, unfortunately, be broken and turned into a weapon.
[12:08] And we're not saying we shouldn't have glasses in the world, or that we should somehow put restrictive regulations on them.
[12:14] But it really is not enough to have the tech.
[12:17] It's to help businesses and help communities and ultimately societies think through how do you use the tech in a way that's uplifting to people while also being sustainable from a business model and sustainable from a society model at the same time.
[12:31] Pamela Isom: Is that what you call the gray zone conflict?
[12:34] Dr. David Bray: Yes. So gray zone is more war by another name, or competition by another name. So as an example, it could be.
[12:42] Unfortunately,
[12:44] there have been events in the last five years where,
[12:48] for example, modems from a large country of 1.3 billion people have been found in ports, modems have been found in batteries for solar inverters and things like that.
[12:58] I work with a company that has the ability to do independent verification that the hardware really is what the hardware claims is and nothing else.
[13:06] And sure enough, we have found in some very sensitive environments, including underwater nuclear environments operated by the US Government,
[13:13] routers that were US-labeled but had additional chips on them, not detectable by other means, that were trying to call home.
[13:22] And so it's cyber means it's sometimes what people would say, disinformation or just poor quality information. It's creating wedge issues in society.
[13:30] And so it's a question of, well, what's a response when that happens? Because it's not really war,
[13:35] but it's messing with your people, messing with how your society operates, in a way that's making it harder to hold folks together. And I often say, given that your podcast is AI or Not,
[13:47] to me, the biggest risk with AI is not that it's going to take over the world, because I give that a very low probability, at least in the next 10 to 15 years.
[13:56] I don't think those are its aspirations. And if it does, guess what? Electromagnetic pulse is really good at taking out electronics.
[14:03] But the bigger risk is that AI will erode the commons that holds us together as societies.
[14:11] And that means the things that we still all share.
[14:14] And that's been a journey that's been happening even before AI showed up. Unfortunately the Internet, while it's really good at helping us connect,
[14:20] also allows us to find ourselves in niche tribes and communities that think like we do. And as a result, we convince ourselves that we have a monopoly on truth,
[14:32] when in fact the reality is there's a lot to be learned from folks different from yourself, including people you disagree with. Just by hearing them out, you might actually discover there are things worth learning together.
[14:43] Pamela Isom: It's interesting, because I was talking to someone not too long ago, and
[14:47] we were talking about silos, back from my industry days before government.
[14:53] Of course, I'm back in industry now, but back then, as I was coming up through the ranks,
[14:58] the innovation came for me when I would do things to try to break down those silos.
So,
[15:06] and it's a natural thing,
[15:08] as you pointed out here,
[15:10] to use the Internet to get to the information that you need for your purpose and to solve your problem.
[15:18] It's an opportunity for innovation to look at how we can make it more cross-cutting and more beneficial to everyone versus just some. Just take time to think about that.
[15:35] And that's what we were talking about before: what are some of those secrets for innovation? And it's like, look for opportunities like that.
[15:44] And so you said that like some of the challenges that you saw with incubating a new technology. So I, I heard that and what you were saying. So I couldn't help but, but oh
[15:53] Dr. David Bray: No, I think you're spot on, and I think it's so needed, because oftentimes I talk to people and they say, I don't have time to hear from folks that are challenging my views.
[16:06] And I'm like, but you do. You have to make time, because otherwise you run the risk. Especially if you are a company, or if you're in government, in any circumstance.
[16:16] If you only surround yourself with,
[16:18] whether it be digital devices and AIs or people that are telling you things that already reinforce what you already know,
[16:26] then you're not growing as a person, you're not learning. And I guarantee you, and this is what I saw when I did bioterrorism preparedness, what made things fail, sadly, was when you had people that were not asking, what am I missing?
[16:40] What is incomplete in here? How has the world changed? I thought this was the case. It may no longer be the case. How do I still know what I hold to be true is correct?
And there it was actually a risk, because
[16:52] our small team, when the anthrax events happened,
[16:56] was part of a 15,000-person organization, so we were sort of lost in the noise. But we said, if you discover that this is coming from the postal system, because remember, initially with the threats in Florida it was just a sick person, there was no threat letter found initially,
[17:09] then you need to start doing prophylaxis. Only then did they find the threat letters. But by that time,
[17:14] the people in charge beyond our group had called in their close friends, but they hadn't practiced, unfortunately. You had law enforcement going on camera calling anthrax a virus, when we know it's a bacterium, and we're like, no.
[17:26] And then you also had public health people forgetting that you have to preserve chain of custody of evidence.
[17:31] And so by not bringing together these different perspectives and recognizing it's both a crime scene and a public health emergency event at the same time,
unfortunately, bad things happened. And so I am fiercely motivated:
[17:43] even if I don't agree with what I'm hearing, I want to hear it,
[17:48] because if anything, it'll help inform what I might be missing.
[17:51] Pamela Isom: Don't you think that's a characteristic of strong leadership? Which leads me to my next discussion point. So leadership in this era, right? This AI, this rapid pace, this turbulent era,
[18:05] from your perspective,
[18:06] considering government,
[18:09] industry,
[18:10] global institutions, and just your experiences at large,
[18:14] what are some of those defining elements of great leaders?
[18:20] Dr. David Bray: So I think the first thing I try to remind people is recognize that leadership is when you step outside of expectations and manage the friction associated with it.
[18:31] And usually when you step outside of expectations, you're going to irk at least one or more groups of people.
[18:37] And so often when you're in the moment of leading, you're not getting accolades. If you're getting accolades, you're doing a great job managing, not leading.
[18:46] And that's okay. We all need to have some management.
[18:49] If we don't manage expectations to a degree, then we're quickly out of a job.
[18:53] Whether it's what our boss wants, our peers want, our reports want, the Congress wants, our shareholders want, whatever.
[19:00] But leading is when you say, look, I hear your expectations, but I'm going to step outside of them because I think this is actually a better path. This will take us to a better place.
[19:09] And you've got to manage the friction, because people are going to say, no, get back in your box, or, this is not what I wanted you to do. This is not why
[19:17] you were hired.
[19:18] And you have to have a sense as to how much change you can introduce.
[19:24] I often say the word leadership also traces to the root word leith, which means to be sent unto death.
[19:29] You're really helping them pay respects to the death of an idea or a death of a way of working,
[19:35] or even a death of an identity. I mean, you and I were talking a little bit earlier about what happened in 2025. I think great leaders are identified by how they help people process loss.
[19:44] It might not be that I can help you avoid a loss,
[19:47] but I can help you process it.
[19:49] And so to. To your point, yes, I also think great leaders are willing,
[19:54] and I think best about how President Lincoln surrounded himself with a cabinet of rivals because he actually wanted to hear the different perspectives, and he was willing to hold all the different perspectives in his head and still lead in that type of environment.
[20:10] Pamela Isom: Okay. So I heard that you have to know how to manage friction,
[20:16] which I agree.
[20:17] You said something interesting about managing.
There's a difference between managing and leading, and you really clarified that, which I believe is good, because there are individuals and listeners I speak with who are looking to get into executive leadership,
[20:38] and they're trying to figure out what can I do to differentiate myself.
[20:42] And so I provide coaching in that space.
[20:45] And one area that I'm going to share with them is what you brought out, which is: you could manage well, but that's not necessarily leading, because that's like trying to get the checkboxes straight.
[20:59] Right. So you want to meet the checkboxes. But leading is so much more than that.
[21:04] I think that that's really good insights, and I think that it exemplifies your trajectory. Right? It exemplifies your experiences.
[21:10] And I tend to agree with that.
[21:13] I can remember being told to get back. Get back.
[21:16] Dr. David Bray: Yes, exactly. That's when you know you're leading. When someone's saying that, you know you're leading.
[21:21] Pamela Isom: This is my area.
[21:22] And so the thing that I would add to what you said is influence, because now you've got to know how to influence and still get results in the middle of all of that.
[21:33] So. And that's you, right? That's kind of what you do. So you have the influence in order to still get done what you want to get done.
[21:40] So that's very important.
[21:42] So I think those are great pillars. Right, of leadership.
[21:46] And so considering all of that, is there something within there that you see as hidden strengths that maybe we don't utilize enough or that is underestimated that we can really exemplify more of?
[22:02] Dr. David Bray: So,
[22:03] interestingly,
[22:05] I've been actually reflecting on this, given the holidays and the start of the new year.
[22:10] So one of the things that motivated me to go to business school to get my PhD and then later motivated me to go to Afghanistan was I felt like we were structuring how we tackled problems wrong.
[22:24] And I wanted to prove that. And so at the time, my theory was, if you make people feel more like they're actually part of a community, even if they're coming from different government agencies or different countries, even if you remove the labels we associate with information,
[22:43] for example, you know, we know, sadly, unfortunately, if you label something as classified, people are going to treat it as if it's somehow more important than if it was obtained through open source or unclassified means, even if the unclassified information is actually better.
[22:55] And so that's a false label that. That can make people count something as being more useful, when in fact it might not be.
[23:02] And so I went ahead and showed that. But then to put this in a more practical context, after I got back from Afghanistan,
[23:08] became a senior National Intelligence Service executive, and along the way got asked by the principal Deputy Director of National Intelligence would I like to lead a bipartisan congressional commission that was going to review the research and development efforts of the U.S.
[23:20] intelligence community. All the efforts,
[23:22] and it was six Democrats, six Republicans, including two senators from both sides of the aisle, two representatives from both sides of the aisle,
[23:30] and I said yes. And as I was leaving her office, she also revealed that there had been three senior executives in three months who had not worked out. I was number four.
[23:39] So that also embodies what we were just talking about: leadership is sometimes being sent unto death. I was being sent unto death.
[23:46] Pamela Isom: I bet you just welcomed that, right?
[23:48] Dr. David Bray: Oh, I was like, bring it. Yes, exactly. And the good news is, a year and a half later, no one else got killed,
either metaphorically or literally. But the way I structured it was, I intentionally got the Republican co-chair and the Democratic co-chair together in a room at the same time first.
[24:06] Because I know if I met with them individually, guess what?
[24:09] You know, one co-chair is going to tell me to go this way, the other co-chair is going to tell me to go that way. And guess what? I'm never going to make them happy.
[24:15] But if I have them in the room at the same time and I say, so how do we want to move forward then? That way, they're doing the tug of war with each other.
[24:22] I'm just waiting for instruction.
[24:24] I'm more likely to survive than if I met with them individually.
[24:28] And then the next thing I did, after I got them to be on the same page, was to get all 12 commissioners in the room from both sides of the aisle, and I let them sort it out in parallel.
[24:37] Though it wasn't enough to be managing up. I had to go back. I had a group of more than 200 assignees, which meant they were assigned from different parts of the intelligence community.
[24:48] Their performance reviews were still being done by someone else, not me. They were still being paid by someone else, not me. And yet somehow I had to motivate them to do work.
[24:55] And so I got them together. And in that circumstance, what I chose to do was to ask questions, not to question them, but just to say, so what do we want to do?
[25:04] So I didn't come in with answers. I didn't come in and say, clearly things aren't working out. This is now number four. I said, so what do we want to do together?
[25:11] And I let them have an hour in which they explored the space and I could only ask questions.
[25:16] And then afterwards I said, okay, for those who were present, we didn't have all 200 people present. Some were by video. But for those present, let's go get some lunch.
[25:24] So we broke bread, and that allowed us to share some things. And then when we came back in the afternoon, I said, okay, so what are we going to do now?
[25:30] So I let them first sort of do exploration of what went wrong.
[25:36] But then after we had broken bread, it was shifting from, okay, you can't just be a problem holder, you can't just be a problem admirer.
[25:43] We now need to be problem solvers.
[25:45] And so this was that two-pronged approach: can I get the upwards group to work out their differences as to where we want to go as a commission?
[25:53] And then at the same time, can I get the staff, whose performance reviews I didn't even do and whom I didn't even pay, because they were assignees,
[26:02] to at least get on the same page as to how we want to help the commission help itself?
[26:06] And the good news is, a year and a half later, our bipartisan commission report was praised by both sides of the aisle. And in the decade since, about 75% of our recommendations have been implemented and not rolled back.
[26:16] Knock on wood.
[26:17] Pamela Isom: That's awesome.
[26:18] Dr. David Bray: Yeah.
[26:20] But I mean, again, the difference was how I structured it. And this was something I only learned from the school of hard knocks.
[26:27] If I had made the mistake of meeting with each of the co-chairs individually, which might have been well intended,
[26:33] then I guarantee you I would not have survived,
[26:36] because they each would have thought that I was going to meet their expectations, and they would not have recognized that their expectations might differ from their co-chair's expectations.
[26:44] Pamela Isom: That is good. So that's like meeting the moment.
[26:47] Dr. David Bray: Oftentimes you have to go to the balcony, a metaphorical balcony, and say, what's really happening here?
[26:53] And my sense was, what was really happening here was aside from Congress passing the law saying that there needed to be a commission, which, by the way, the law had been passed more than 10 years ago and the money had only been appropriated to the commission 10 years later,
[27:08] but there had been no getting everyone on the same page saying, okay, so where do we want to go? And it was a balance between both sides of the political aisle, because that was the congressional leadership for the commission,
[27:19] but then also across the community for those who had been assigned to help make the commission happen. Where do we want to go as well? So often I say the best thing a leader can do is whatever the moment is, even if things are burning down around you, even if earthquakes are happening,
[27:32] say,
[27:33] okay,
[27:34] so where do we want to go? And the best leaders,
[27:37] you may have a plan and you may be able to ask people questions that maybe will help get them to where you're thinking and maybe you're going to learn along the way,
[27:44] but you're in the moment where someone else might have a better idea and you're willing to say, yes, I'm going to invest in that person and see where they can go.
[27:52] Pamela Isom: And that would be what would help determine whether leaders succeed or not, right?
[27:59] Dr. David Bray: 100%, yes. Because I think, again, that's what the best leaders do. And Sue Gordon, who was Principal Deputy Director of National Intelligence after Stephanie O'Sullivan and is an amazing person as well, says you've got to back your team.
[28:10] At the end of the day, you are just one person for these really difficult, thorny issues. It's usually 200, if not 2000 people or more that you've gotta back them as they move forward and get things done.
[28:21] Pamela Isom: That's awesome.
[28:23] I like that.
[28:24] So let's talk about leading before clarity arrives. Because you have mentioned a few times, both before we started the podcast and during this episode, situations where you've had to work while systems were already under stress or failing.
[28:42] So.
[28:43] And some of your historical examples fall within that category.
[28:48] So what did those experiences teach you when it comes to leading before the clarity arrives?
[28:55] Dr. David Bray: So the good news is I had experiences early on that taught me to appreciate that you often have to make decisions with incomplete information or, as you say, without clarity. And you also have to make decisions that in the moment may be unpopular.
[29:11] It gets back to what you were saying: I wasn't calling it agile development back in February of 2001, but had we not built things the way we did, we would have had to handle things by fax when 9/11 actually showed up.
[29:19] A more recent example: after I did several years with the intelligence community, I went to the Federal Communications Commission, which, interestingly enough, had had nine CIOs in eight years.
[29:29] Always a great sign for CIO number 10 that things are just great.
[29:32] But also,
[29:34] this was not public knowledge at the time, but I can talk about it now: they had two advanced persistent threats that had infected their IT systems. And for those who don't know, advanced persistent threats are cyber threats that we think were probably carried out by a sophisticated nation-state actor.
[29:48] Yay.
[29:48] So as I told my wife, I can't even trust the IT systems I'm inheriting and the average tenure for the role I'm going into has been less than six months.
[29:56] Should be fun.
[29:57] The good news is using the same tactics of coming in with questions, shifting people from being problem holders to being problem solvers,
[30:05] and really just investing in people, saying, show me what you can do in two weeks to fix this problem. If you make headway, then I'm going to give you more runway for another month, another two months.
[30:13] You know, it's sort of like running an internal venture capital activity, an internal VC inside a government organization, believe it or not. And it was successful, because you empower people and you motivate people with a mission.
[30:25] In less than two and a half years we moved everything we had to either public cloud or private hosting,
[30:30] saving millions of dollars for the taxpayer, but also hopefully getting rid of the APTs, because we moved to public cloud infrastructure.
[30:38] And it was interesting, because my initial theory when I went in was to talk to the Department of Defense to see if they would host at least a firewall.
[30:46] After about 30 days of dealing with lawyers, it was clear that was never going to happen, because of Title 5 versus Title 10. So I had to adjust. But that's part of being a leader too: adjusting your ideas and your plans.
[30:57] But by 2017,
[30:59] the good news was that those problems had been solved.
[31:02] There was a high-profile event where we got several million public comments associated with a high-profile issue. And public commenting, for those who aren't familiar, dates back to 1946, where anytime a regulatory agency is making a decision,
[31:16] they have to allow either 60 days or 120 days during which novel legal issues can be raised. So it's not opinions; it's not "I think you should do X or Y."
[31:23] It's just novel legal issues that they have to address before they make their decision.
[31:28] Unfortunately, in the last 20 to 30 years, it's been sold as if it's a vote. It's not. As if it's an opinion poll. It's not. It's nothing like that. But we had seen in the past that less than 1% of government agencies get more than 10,000 comments.
[31:44] Back in 2014, on a similar issue, we had gotten in total about 2.3 million comments over 120 days, when most government agencies get less than 10,000.
[31:57] And 1.3 million of those comments came in the last week.
[32:00] Now, do I think they're all human?
[32:02] Probably not.
[32:03] But I had asked to use captcha to block bots.
[32:06] And the legal counsel said, well, if someone can't see and can't hear, they won't be able to file. That's a felony.
[32:11] No captcha for you. Like, okay. I said, could I use invisible means to detect bots? And they said, nope, because that looks like government surveillance can't do that either. And then finally I said, could I block the same Internet protocol address filing 100 comments a minute?
[32:24] And they said no, because one of those 100 comments might be real from that IP address.
[32:27] So you just have to drink from the firehose.
[32:30] So 2017,
[32:32] three years earlier, we had gotten 2.3 million comments over 120 days.
[32:36] This time, we got 2.3 million comments in the first week.
[32:40] And again, less than 1% of government agencies get more than 10,000 comments over the entire time period.
[32:45] Again, I don't need too much to say something's up.
[32:49] We were seeing 7,000, 8,000, even upwards of 9,000 comments a minute at 4:00 a.m., 5:00 a.m. U.S. Eastern Time.
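[Editor's note: the "patterns of life" signal David describes, thousands of comments arriving per minute in the small hours, can be illustrated with a simple per-minute rate check. This is a hypothetical sketch in Python, not the FCC's actual tooling; the threshold and timestamps are invented for illustration.]

```python
from collections import Counter
from datetime import datetime

def flag_anomalous_minutes(timestamps, per_minute_threshold=1000):
    """Bucket comment filings by minute and flag any minute whose
    volume exceeds the threshold: a crude 'patterns of life' check."""
    buckets = Counter(ts.replace(second=0, microsecond=0) for ts in timestamps)
    return {minute: count for minute, count in buckets.items()
            if count > per_minute_threshold}

# Hypothetical example: 1,500 filings landing within a single 4:00 a.m. minute.
filings = [datetime(2017, 5, 8, 4, 0, i % 60) for i in range(1500)]
suspicious = flag_anomalous_minutes(filings)
print(suspicious)  # one flagged minute with a count of 1500
```

Real screening would need far more nuance, since legitimate surges do happen near comment deadlines, which is part of why the call here was a human judgment about evidence rather than an automated filter.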
[32:58] And again, I can't block bots.
[33:00] So the good news is we had moved to cloud. We spun up more than 3,000 times our cloud capacity.
[33:06] But the chairman's office came in and said, is this a denial of service?
[33:10] And I first surveyed my team and they said, that's above our pay grade. We don't know.
[33:16] And I said, well, if we look at this objectively, it is not a denial of service at the network layer. Nothing's wrong with the network, anything like that.
[33:23] But whatever this is, these manufactured comments, even though I can't say they're manufactured,
[33:28] are tying up the application at the application layer to prevent people from filing. And the good news is again, we are able to increase our cloud capacity, but it is essentially at the application layer, not the network layer.
[33:39] And I actually had a friend,
[33:40] Vint Cerf, who helped co-invent the Internet and actually create TCP/IP, whom I ran this by. And he said, of course it is. Definitely, it's a denial of service at the application layer.
[33:49] Well,
[33:50] I did not know at the time that actors inside our own nation may possibly have been involved, because the moment I said this, both sides of the aisle said, where's your evidence?
[34:00] And I said,
[34:01] patterns of life. I'm getting 7,000, 8,000 comments a minute at 4 a.m., 5 a.m. And then their next question was, why didn't you report it to law enforcement? I'm like,
[34:08] no laws were broken. This is just a nuisance. But they haven't breached the system. This is not a cyber event.
[34:15] Back in the 80s, they used to flood us with postcards and mimeographs. And back in the 90s, they flooded us with faxes. So this is just the equivalent thereof.
[34:23] So anyway,
[34:24] for four years, people said, well, you know, the FCC made this up. It was manufactured. Even though in the end we got 23 million comments. So we got 10 times what we got three years earlier.
[34:33] Finally, in 2021, after I had kept my head about me, they came back with a conclusion. The New York Attorney General concluded that of the 23 million comments we got,
[34:41] more than 18 million were manufactured.
[34:44] 9 million from one side of the aisle, 9 million from the other side of the aisle.
[34:48] Which is probably why nobody really wanted to hear my version of events.
[34:53] But in that moment,
[34:55] it was a choice between sticking to what I knew to be true or capitulating.
[35:03] And I often say leaders are ultimately assessed by the best decisions they make with the best information they have at the time, plus their moral courage.
[35:13] And so I was making the best decision I could at the time, which was, this does not look like it's anything legitimate,
[35:19] this sheer volume. I mean, I know people are passionate, but that passionate at 4:00 a.m. U.S. Eastern Time?
[35:24] And then two, the moral courage of believing
[35:27] I should always be fiercely nonpartisan. You don't want me to bring my politics to work, and I have them, but I don't bring them to work. They're personal, for when I go vote.
[35:35] But at work, I'm here to do the job of the Constitution and the job of the U.S. people.
[35:40] And so I kept my head about me, and fortunately, in the end, it all came through.
[35:44] But I also had to face the fact that, in some respects, it might never have come out. I was fortunate that in 2021 it was revealed.
[35:50] But I might also have to live with the fact that I knew the truth, even if nobody else did.
[35:54] Pamela Isom: That's a good leader.
[35:55] When you're able to recognize that something is not right and you're able to communicate that, and not just go with the flow.
[36:09] That's being a risk-taker,
[36:10] but that's what we need.
[36:12] Right. So I'm sure that today you feel good about the decisions you made at the time, and that's a good thing. I tell leaders,
[36:23] I think you want to have a moral compass and you want to be able to live with that years later.
[36:30] Right. And so, I think that was a really, really good example, and also a good example of how your government experiences really help you in what you're doing today.
[36:45] We have so much that we have to think about when we're in government. Right? There's so much that you have to do. Taxpayers are paying for the things you're trying to do
[36:52] for citizens. You're trying to protect our national security.
[36:56] I mean, you carry a lot and you don't have room to be partisan. You have room to solve problems and you have room to strategize on how to be innovative and come up with solutions that will address the challenges that the American people are facing.
[37:13] So that's good. And I'm so glad that you brought some of the issues to light because actually we didn't know that that was going on. So I think that will be insightful for others.
[37:22] Dr. David Bray: Well, and I think, you know, the way I look at it,
[37:25] this is a story as old as time. I mean,
[37:27] you can look back, and the reality is it will continue. And the other thing I recognized is, if this is the only thing I'm being asked to give for my country,
[37:36] I mean,
[37:38] this is a small thing. I'm not being asked to give up a leg. I'm not being asked to give up my life.
[37:42] This does not feel good in the moment. This feels like I want to shout out to the news, here's what's really going on. But that would be the absolute wrong thing to do, because I'd be giving oxygen to something. Just move on, keep your head about you, keep calm and press on.
[37:53] But I was like, if this is the only thing I have to give for my country, then it's a small thing and this is what it really means to serve.
[37:58] I mean, I really do believe people need to recognize what it means when people sign up to serve, and I think a lot of people do. And this is where I wish we had more ways that people could either serve part-time or volunteer, do some active service for the country.
[38:11] Because I think the more people that serve,
[38:14] they'll realize a lot of people aren't doing it for anything other than just they really want to see the US and our people thrive. And a lot of that sometimes means working long hours or sacrificing.
[38:27] You know, you could easily go into other roles that would pay more. But you've chosen this because it's a calling.
[38:33] Pamela Isom: It is misunderstood at times. But those that do serve know what it means, 100%.
[38:40] Dr. David Bray: Exactly.
[38:42] Pamela Isom: So if I think about national security,
[38:46] geopolitics, and being people-centric, again, what we were talking about does bring AI into the mix. Because I could hear AI helping solve some of the challenges that you were mentioning earlier.
[38:57] But if we think about how the leaders build awareness and make sound decisions amid AI driven disruption,
[39:04] what does that look like to you?
[39:07] So
[39:09] Dr. David Bray: I recommend to anyone right now,
[39:11] wherever you're at, whether it's a company, government agency, whether you're the CEO or you're a line employee,
[39:19] first cultivate a personal board of advisors.
[39:24] And that can be,
[39:25] it should at least include some humans, but increasingly you can actually use AI too, interestingly enough, if you use it right, to say, what am I missing? You know, tell me what's missing from here.
[39:33] So it's actually a combination of both human and tech nodes, which, oddly enough, is what I said back in 2005 when I did my dissertation: it's going to be humans and tech nodes making decisions together.
[39:43] So have your personal board of advisors. Two:
[39:47] when you have to make decisions,
[39:50] ask both your humans and actually ask the AI too:
[39:54] bring data.
[39:55] So don't just tell me what you think I should do, or what you think I'm doing wrong.
[39:59] You know, if you use AI right, it can actually give you sources. And click on those links, follow those links, and make sure it's not hallucinating what it's reading.
[40:07] And the same thing goes for people. When I was at the FCC and had to make decisions, I was like,
[40:12] I want you to disagree with me. Just bring data,
[40:15] you know,
[40:19] Then the third thing I say is, go to the balcony yourself and say, given what I'm hearing from people,
[40:25] given what I'm hearing from, in this case an AI, given the data I'm seeing, what's my theory? What do I think is really going on here? Because I often say the nice thing about the balcony is when you're on the dance floor, you only see some of the movements.
[40:36] But when you go all the way up, you can see: maybe they're dancing this way because something like this is happening over here, and something like that is happening over there.
[40:42] So come up with your balcony view.
[40:44] And then the last two things I recommend, the fourth and fifth. The fourth is decisional elasticity,
[40:50] which I frame as:
[40:51] Ask yourself, how wrong do I have to be?
[40:54] How wrong do I have to be with my data or my conclusions to change my mind?
[40:59] And that's a good test, because if it's really
[41:05] 50-50, then maybe I still have to take a stance, but I don't want to be as assertive. Whereas if, even when I'm off by an order of magnitude or two, I'm still pretty certain, then that gives you something.
[41:17] And then lastly,
[41:19] ask,
[41:20] what are my pivot options if later I have to revisit my decision?
[41:24] So don't work yourself into a corner, because I guarantee you more information is going to come in. This gets back to what you were saying.
[41:31] I was a speaker at a Wall Street Journal event with the general counsels of Fortune 500 companies, and they had realized that, as lawyers, they're increasingly going to have to make decisions with incomplete information, which you and I know
[41:47] that's been the reality for the last 20 years. I celebrate that they're realizing it.
[41:51] But the reality is no leader's ever going to have perfect information,
[41:55] let alone all the information.
[41:57] So have pivot options if new information comes in and you have to update your decision.
[42:01] Pamela Isom: And that goes back to clarity. Right. So leading before clarity arrives, because things are going to change. Things are going to resolve themselves and things are going to evolve. And so, yeah, that's exactly what I meant by that.
[42:15] I am at the place where I think we can wrap it up here. This has been exciting.
[42:24] I would first like to ask you, before you share your words of wisdom or a call to action, because that's what I usually ask for:
[42:30] I want to give you space so that if there's anything that you wanted to share with me today
[42:38] that you haven't had an opportunity to, please do so.
[42:41] So I'll give you space to do that, and then share with me and the listeners your words of wisdom or a call to action. And I'm particularly interested in navigating AI, cyber risk, and global instability.
[42:56] What do we need to be doing? What's your words of wisdom and call to action as you think about that? You know, tell me if there's anything else that you wanted to share first.
[43:06] Dr. David Bray: Sure.
[43:07] So I think the one thing I would say, and this is my own personal philosophy, people may find it resonates with them, but if they don't, that's okay too. There was an individual by the name of John Rawls.
[43:16] He was a mid-20th-century philosopher, and he had what he called the veil of ignorance, which is, pretend we don't know who we're going to be in life yet.
[43:26] We don't know if we're going to be born into an affluent family or not. We don't know if we're going to be born to a family that's got good health or not.
[43:33] But whatever it is,
[43:34] what would we consider to be virtuous, just and right in society? Because his argument was, the moment you're born, wherever you're born in life, that colors your perspective of what's virtuous, just and right.
[43:46] And I have found that to be the case when I worked on bipartisan issues. And I still think we can work on bipartisan issues, even if it's really hard right now.
[43:53] And to me, what that really translates into is try to walk a mile in another person's shoes.
[43:59] And I know that's hard, especially in an era in which a lot of people right now feel like they've had things happen to them. They've been hit, metaphorically or physically, whatever it is.
[44:08] And all I just recommend is you don't have to agree with the other person, but just try to see things from their perspective and don't automatically take the easy answer, which is they're wrong,
[44:19] because that's doing a disservice to yourself and the other person and everything like that. And so I just raise that because I think this does get to the era in which we're in.
[44:28] In which we're trying to figure out AI, we're trying to figure out cyber.
[44:31] I often tell people,
[44:32] you know, these are just tools. The machine is not conscious, as far as I know. It's just fancy multidimensional mathematics. It is not conscious or anything like that, but it can be used to help you, through story and narrative,
[44:45] see things from other perspectives.
[44:47] Now, again, recognizing the limits of generative AI, it will also hallucinate to you with confidence.
[44:53] So be careful in that.
[44:55] But you know,
[44:57] using it as a way to expand your thinking beyond what you think you know is incredibly valuable. I have an 8-year-old right now, and the thing I celebrate about him is that he's always asking me why.
[45:11] I love that.
[45:12] Because the reality is, being curious, especially in a world that's changing like the one we're in right now, is vital.
[45:19] So. And then words of wisdom or the call to action,
[45:22] be the change.
[45:23] If you think anyone else is going to come and be the change, then we might be waiting a long time.
[45:27] I think the reality is we have this myth of lone individuals being the driver of something successful happening. We like those stories, we like to go see the movie and it's, you know, it's Luke Skywalker.
[45:38] or whoever, who does everything and saves everything. And the reality is we don't see all the people around them who made it possible for them to do that.
[45:46] And so I think we have to overcome this epidemic of what I call learned helplessness, where we feel like issues are too big, too daunting,
[45:54] or maybe we've just resigned ourselves to being a victim. When I'm like, you can always choose in any moment, whatever happens to you, to be a survivor, a thriver or a victim.
[46:03] And I realize it's hard. I mean, we've talked a little bit about how, when you're punched in the face, it's hard to get back up. But you gotta get back up.
[46:11] And so I think we will get through this.
[46:15] Whatever this period of technological change is, whatever this period of geopolitical change is, whatever this period of questioning in society is. And the way we will do so is when enough people say,
[46:25] you know, whether it's just having a conversation with someone who I've not spoken to before,
[46:29] learning something I've not learned before, or just taking the time and reminding ourselves that, at the end of the day, we are still fundamentally human.
[46:36] I celebrate that we're now 8.2 billion people on the planet.
[46:39] The world is spinning. We are on it. We can be part of the change if we want to.
[46:43] Pamela Isom: That's awesome. Yeah, no, that's really good advice. That's a great way to wrap it up here. I really want to thank you for being here. I'm not going to try to repeat anything that you said.
[46:53] It's a great way to close this one out.