At The Boundary
“At the Boundary” is going to feature global and national strategy insights that we think our fans will want to know about. That could mean live interviews, engagements with distinguished thought leaders, conference highlights, and more. It will pull in a broad array of government, industry, and academic partners, ensuring we don’t produce a dull uniformity of ideas. It will also be a platform to showcase all the great things going on with GNSI, our partners, and USF.
Can American Freedom Survive the Age of AI Surveillance?
How can the U.S. government balance public safety and privacy in a world with AI?
In this episode of GNSI’s At the Boundary, host Jim Cardoso is joined by GNSI Senior Research Fellow Jeff Rogg to discuss the latest GNSI Decision Brief “Intelligence, Technology, and the Future of the American Republic.” Having written extensively on the history of U.S. intelligence in his book “The Spy and the State: The History of American Intelligence,” Rogg sets his sights in the Decision Brief on the future of intelligence, privacy, and democratic trust in an era of constant data collection.
During the conversation, Rogg emphasizes the historical relationship between technology and intelligence, highlighting the Cold War's impact on the US intelligence community's structure. He warns of the dangers of artificial intelligence (AI) and ubiquitous technical surveillance (UTS), which threaten privacy and national security.
Listen in on the conversation to get a perspective on balancing surveillance and security, as well as the ways the public can stay informed on this technology.
Episode Links
Jeff Rogg's GNSI Decision Brief, "Intelligence, Technology and the Future of the American Republic."
GNSI on X
GNSI on LinkedIn
GNSI on YouTube
At the Boundary, from the Global and National Security Institute at the University of South Florida, features global and national security issues we've found to be insightful, intriguing, fascinating, maybe controversial, but overall just worth talking about.
A "boundary" is a place, either literal or figurative, where two forces exist in close proximity to each other. Sometimes that boundary is in a state of harmony. More often than not, that boundary has a bit of chaos baked in. The Global and National Security Institute will live on the boundary of security policy and technology and that's where this podcast will focus.
The mission of GNSI is to provide actionable solutions to 21st-century security challenges for decision-makers at the local, state, national and global levels. We hope you enjoy At the Boundary.
Look for our other publications and products on our website publications page.
EP 118 - 16 February (Rogg)_Edit-1
Fri, Feb 13, 2026 3:09PM • 43:30
SUMMARY KEYWORDS
Intelligence technology, American Republic, GNSI decision brief, cyber beacon conference, international security program, career events, US intelligence community, artificial intelligence, ubiquitous technical surveillance, national security, public-private cooperation, Section 702, FISA, privacy concerns, future warfare.
SPEAKERS
Glenn Beckmann, Jim Cardoso, Jeff Rogg
Glenn Beckmann 00:12
Hi everybody. Welcome to another episode of At the Boundary, the podcast from the Global and National Security Institute at the University of South Florida. I'm Glenn Beckmann, your guest host today for At the Boundary.
Glenn Beckmann 00:30
Today on the podcast, we're excited to bring back to the studio one of the country's leading voices on U.S. intelligence, Dr. Jeff Rogg. Jeff is a senior research fellow here at GNSI, and he's going to help us dig a little deeper into his latest GNSI Decision Brief, "Intelligence, Technology and the Future of the American Republic," which leads off with this bold statement: technology has always shaped intelligence, but never has the relationship between the two been as consequential as it is today. Jeff's going to join us to explain what he means in just a moment. First, a couple of important notes. Applications have closed for the GNSI DC Experience this summer, but there's still time to apply for another terrific study abroad program, the International Security and Intelligence program in the UK. GNSI will be sending students to ISI for four weeks in July. ISI is part of the world-renowned Cambridge Security Initiative. This opportunity is open to all USF students and will cover airfare, lodging and tuition costs, local travel in the UK, as well as some meals. You'll find all this information on our website. The deadline to apply is this Friday, February 20, so if you haven't done it yet, get on your horses. We also want to let you know that the Cyber Beacon conference on March 11 has been canceled. Cyber Beacon is the flagship conference for the College of Information and Cyberspace at the National Defense University. We were really excited to host Cyber Beacon here at USF, but unusual circumstances have caused CIC and NDU to cancel the event. We're hopeful that we'll be able to work with them on future Cyber Beacons, as it is one of the biggest cybersecurity conferences in the nation. Finally, we wanted to tell you about a couple of career events we've recently hosted for USF students.
We were pleased to welcome retired Ambassador Phil Kosnett to campus recently. The ambassador is a non-resident Distinguished Fellow for GNSI, and he's also one of the first fellows to participate in the mentorship program we've created for members of the Future Strategist Program. Kosnett connected with his mentee while he was here, along with a large group of other students, to discuss his career story and their potential career pathways. Last week, GNSI welcomed Megan Booker of the Defense Intelligence Agency to campus for another student event. She also participated in a conversation on Iran with GNSI Research Fellow Dr. Arman Mahmoudian. You know, educating future practitioners (that's you, students) is one of our primary tenets here at GNSI, and these events are an important part of that, along with larger career events like the one we're planning in April with the USF Office of National Scholars and the USF Center for Career and Professional Development. That career fair is part of the 2026 International Security Experience in April. You can find more information on ISE and the career fair on our website. Okay, let's bring into the studio now our guest for today, Dr. Jeff Rogg. You may remember Jeff was a guest on the podcast last year, when we talked to him about his book, "The Spy and the State: The History of American Intelligence." More recently, he's written our latest GNSI Decision Brief, titled "Intelligence, Technology and the Future of the American Republic." He sat down with Jim Cardoso, and the two of them discussed why this time, right now, is a historical inflection point for U.S. intelligence.
Jim Cardoso 04:09
Jeff Rogg, welcome back to At the Boundary. Glad to have you back. (Thanks. Good to be with you again, Jim.) So I want to focus today on the present and the future. We've done the past before, but let's get into the past just for a second to set the stage. The US intelligence community, or USIC, has always valued technology and innovation, and in fact, the Cold War competition only exacerbated that race for the best intelligence tech. However, and you wrote about this, it also led to a functional breakdown of the USIC and the creation of what you term, a neat term, I think, the intelligence information technology complex. Talk to us a little bit more about that. (Sure.)
Jeff Rogg 04:46
So I'd say it started before the Cold War, with a functional breakdown in the intelligence community. The American fascination with intelligence and technology has always been there. We're a tech-oriented people, and basically, any time we've innovated a new technology, the intelligence community, or whatever intelligence organizations existed at the time, really quickly figured out how to use it for intelligence purposes. Look, the telegraph: as soon as it was in use, during the Civil War, the North and the South were wiretapping to read each other's messages. So that's always been the case. But the problem is, our technology is also linked, and we'll talk about, I'm sure, the commercial features of it: it's linked to commerce, it's linked to contractors, there's money involved. And so if you're an intelligence organization, and there's this new technology, and it gives you some pretty cool capabilities, you kind of want to keep it to yourself, especially when knowledge is power. Now imagine before you actually have an intelligence organization dedicated to that. Instead, you have a capability first, and you have all these other organizations, and they look at this bright, shiny object: this is new, this is cool. They're probably going to compete over it. And so a great example that I start with, even before the Cold War, is the evolution of signals intelligence. Why is the NSA, first of all, a single organization? And secondly, why is it under the Department of Defense, or the Department of War today? Well, it's competition over what signals intelligence offered.
Jim Cardoso 06:15
Yeah, it was interesting when you talked about that. It seems like a lot of the tech intelligence tended to conglomerate under the Department of Defense, whereas the overall US intelligence community was brought together under the Central Intelligence Agency. And it seems like there was almost a separation between the two as the technology evolved.
Jeff Rogg 06:35
Yeah, that's the fascinating feature of it, too, especially with the CIA. Another example is overhead reconnaissance, or satellites. The CIA has often been involved in technology, but then, like I said, they have to share the technology they're evolving. I just alluded to overhead reconnaissance: the CIA and the US Air Force were working on it together. For instance, the U-2 spy plane, the CIA co-develops it with the Air Force, and same thing with satellites. The issue is that the US government and its departments and organizations don't really share that well. And so
Jim Cardoso 07:10
I know my surprise, my
Jeff Rogg 07:14
surprise face, you know, after spending many, many years also in the US military. And so that's the issue: they don't share that well. At the time, it's like, oh, this is great, we're developing this together. But then when it comes to who's actually going to control it, who's going to use it, who stands to benefit in terms of personnel and resources, that's where the competition comes in. I was sort of talking around it: when the National Reconnaissance Office, the NRO, is developed as another intelligence organization responsible for satellites and overhead reconnaissance, there's a question of leadership, and you see divided leadership between the CIA and the Department of Defense. But really, what eventually evolves is that you have to decide who actually owns it. Well, the Department of Defense. Same thing with the National Security Agency, the NSA. And the other issue, you had asked why it is that the Department of Defense (we're still saying Department of Defense, but it's the Department of War today) tended to gain control over these tech organizations. Well, the Department of Defense, or Department of War, has a lot of money. They definitely have more money than, say, even the CIA, as the National Intelligence Program is allocated alongside the Military Intelligence Program. And during the Cold War, at least, they were the main department of government responsible for ultimately confronting the Soviet Union. So at the time, it kind of makes sense, if you think about it: who do we really want to have control of these capabilities and these intelligence organizations? But my point today, looking back on this, is that even if something was done for good reasons at the time, conditions change. And what's fascinating to me is, for a country with a tradition where we're nervous about the power of the military...
I mean, our Founding Fathers were, and we're definitely worried about the role of intelligence in American daily life, surveillance capabilities. Why would you give such high-tech and such extraordinary intelligence capabilities to the Department of Defense? You're further empowering what is by far the strongest institution in the United States government, and that's our military institution.
Jim Cardoso 09:26
Yeah, you know, you talked about the competitiveness, and, looking at the present and the future, the US is going to need continued support from the private sector in this technology for intelligence purposes. You started talking about that, but let's pull that thread a little bit more. What are some of the challenges that arise from that necessity?
Jeff Rogg 09:43
Well, just as I mentioned, as soon as we evolve a new technology, we think of a way to use it to spy on people. Private companies have been involved in technology and intelligence innovation for our entire history as well. In fact, I just mentioned signals intelligence. Communications intelligence, the ability to access records: you needed private companies, because way back when, they were telecommunications companies, or telegraph companies, then telephone companies, and you need access to their databases and their records. So you can do one of two things. You can try to get them without the companies knowing, and that's, well, it's definitely malpractice, a pretty shady way to go about it, or you can work with them. And so what Americans often didn't realize is that a lot of our, we'll just call them spy programs, were done with the cooperation, the complicity, of private corporations. But here's the catch. When you find out that the intelligence community or intelligence organizations are doing something wrong, and then you find out that private corporations were along for the ride, what you see is these private companies start to pull away, because they don't want to get dragged down with the intelligence community. And that actually hurts public-private sector cooperation, which hurts our intelligence operations and capabilities, and it's often in reaction to what the American people like or don't like. So you can see we're building a pretty complicated and unsteady national security system, where you need public-private sector cooperation, but you have people who are looking at it and always a little bit nervous about what that means.
Jim Cardoso 11:22
Yeah, well, let's move forward from there. Now the next big thing, obviously, is artificial intelligence. We could probably spend two hours talking about that alone. But look, it's radically changing the consumption and usage of information across the globe, and intelligence is not immune from this revolution either, in the ability to collect and analyze information to help guide national security decisions. AI can supercharge the speed of analysis beyond existing human capabilities. It can reduce that timeliness gap between disinformation and the truth. You know the old saying ascribed to Mark Twain, probably not accurately: a lie can travel halfway around the world while the truth is putting on its shoes. How do you view AI development, and what are your concerns?
Jeff Rogg 12:11
Oh, this is great. Because on the one hand, we have the people who look at AI, Elon Musk, for instance, or Alex Karp at Palantir, and they're saying, this is it, this is going to so radically change human history. On the other hand, you have people who are constantly telling us to pump the brakes. You know, they show you a machine and it can't perform basic tasks. Or, for the students who are listening, don't use AI, yes, I said it, to cheat on papers, because the footnotes are usually wrong; it cites books and pages that don't even exist.
Jim Cardoso 12:47
So use other methodologies to cheat on papers, is that what you're saying? Do it the old-fashioned way?
Jeff Rogg 12:51
It's okay. The answer is, don't cheat on papers. But the point is that there are vulnerabilities with AI, and it's still under development. I'm somewhere in between, because one thing that being a historian tells you is that change is often slow or unpredictable. But the other thing that being a historian tells you is that the people who don't think in terms of drastic change or worst-case scenario, or at least plan for it, are the people who ultimately get caught flat-footed. And so that's what I think about with AI. We have to think about the worst-case scenario: what are we looking at? What could it possibly do? And, yeah, you're talking about such radical transformations that maybe there's not a lot you can plan for. Best-case scenario, what happens if it's a lot slower than we think? Well, what happens if we're over-investing in it, and it's actually going to make us less competitive in some ways? So from the intel standpoint, you mentioned things like analysis. Well, here's something we already know, and I thought this was fascinating: for the first time ever, and I think it was either sociologists or child psychologists, this particular generation, I think they're talking about Gen Z, probably not millennials, is demonstrating slower cognitive development than the previous generation. In other words, in the past hundred years, since researchers started measuring the development of the human brain by generation, you've seen upward development, probably because there's more access to information. But now we're so tech-dependent that human brains aren't functioning the same way. So what if AI becomes this incredible, seemingly advanced crutch for analysts, and analysts don't do their job as well? We're dependent on AI. But what if AI doesn't live up to the hype?
And instead, our analysts, by the way, have now sort of decompensated in their ability to conduct analysis. (Yeah, exactly.) Your face, it's like, oh man. So in other words, maybe we're not there yet. But the thing is, and this is what I always go back to with AI, who knows? And if it does end up, on the other side, radically transforming intelligence, well, then what is the role for the human in doing analysis? Or what is the role for the human, especially for intelligence purposes, in the US intelligence community? If machines, rather than people, are briefing the President, if machines are making decisions, we talk about keeping the human in the loop, but if AI is coming up with information that's radically changing human decision-making, how much of the human is left in the loop? (Yeah.)
Jim Cardoso 15:28
And doesn't it seem that the required speed of analysis, to get that information out, is accelerating so quickly? I mean, human analysis still has its place, no question about that. But a human can only analyze and put together, say, an intelligence brief in a certain amount of time. It just takes time, whereas with AI, theoretically, you do it very quickly. And is that competition to get accurate information out there faster and faster going to drive us to having to turn to AI?
Jeff Rogg 16:04
Well, and you've seen it yourself. If you think about, for instance, military decision-making, now you get information so quickly it can be overwhelming. And so again, cost-benefit: we have to think about what we might want AI to do, and what might actually end up being scary is, how quickly do you want to accelerate, let's just say in military terms, the kill chain? There's a great example: the Israelis used artificial intelligence; their intelligence analysts were using it to develop targeting packages during the most recent war. And what they said is, whereas before they were not able to identify targets fast enough, they didn't have enough analysts, it took a long time, now, with AI, they're identifying targets quicker than they can actually prosecute them. And so this is where AI becomes like a self-licking ice cream cone, because now not only is AI accelerating and reaching beyond human capacity, it's perhaps even reaching beyond our capacity to exploit it. And this is where you need things like drones and UAVs and machines talking to machines rather than people. We want to keep people in the loop, but what if humans are the vulnerability, the weak element there? We can't process information fast enough; humans get tired, machines don't. You can see that what I'm building is a fairly dystopian and somewhat scary picture of intelligence and military conflict, and that's where a lot of people are headed when they look at future warfare.
Jim Cardoso 17:35
Yeah. Well, let's continue the terrifying motif here while revisiting history momentarily. The tension between secrecy and transparency in intelligence is nothing new. Going back again, reading your book, an excellent book, by the way, that everybody should get, it goes back to the founding of the Republic, really. But this tension could be supercharged with artificial intelligence and the tools available for intelligence gathering. (That's right.) So what are the things to watch for with that inevitable, hard to say, inevitable expansion? I'm trying to speak as fast as Jeff does, and it's not easy, everybody. The inevitable expansion, especially of a term you coined in your article, ubiquitous technical surveillance, which is not a term that a lot of people have heard, but it's out there. So I'd like you to talk a little
Jeff Rogg 18:28
bit about it. So, I didn't actually coin that; we'd have to look into it. The intelligence information industrial complex, though, I think I might have come up with that while I was writing the Decision Brief. But ubiquitous technical surveillance is something that, if you're listening and you don't know this term, you should. In intelligence and military circles, it's not just trendy; it's becoming a term you need to know because of how difficult it makes performing intelligence operations, at least in the future. What we're seeing is that it's going to become more difficult than in the past. And what I'm talking about here is human intelligence, so espionage, traditional spying. Ubiquitous technical surveillance is basically everything around us: our phones as we sit here recording, computers, televisions, Alexa. Everything is sucking up information. Where's that information going? Who can get it? How can you use it? In other words, we live surrounded by the technology that permits surveillance. And when that happens, and people are able to access it and know how to analyze it, all of a sudden it becomes a lot harder to hide. And so for human espionage, the ability of the United States to get, we'd call them spies, but really case officers, CIA officers, for instance, into a country is going to be much harder in the future if that country has things like biometrics and gait analysis. This is where I sort of alluded to it with AI. Yeah, I watch sci-fi like the rest of us, Terminator, when we talk about AI. But in one of the Mission: Impossible movies, for Tom Cruise to break into this highly secure vault, the sticking point was gait analysis, how he walked. If he walked the way he normally walks, you're dead; you have to walk like the person we're trying to imitate.
So imagine trying to be a human spy. Forget about your pupils, your eyes, your facial structure, your fingerprints; just from how you walk, we can identify that you are not the person you claim to be, or that you are an American intelligence officer, if you're, say, China. So that's what we're looking at with UTS. Now, the reason I said people beyond the CIA or the intelligence community should be familiar with this term is that, listener, you're surrounded by this too. We talked about secrecy and transparency, but it's also about privacy and national security. Your privacy is diminishing, if not gone. Where do you think you can be private now, as an individual? If you're somewhere in public, or just walking around on the street: CCTV cameras, someone else's phone, your own phone, a car that you're in, electric cars, cars that drive themselves. Everything's collecting information about you. Now, I don't want to alarm anyone, and you probably have much more to worry about if you are a CIA officer, but that's only because we live in a country that hasn't turned that technology against us yet for surveillance purposes. If you're in China, I'd be much more worried. But the thing that worries me as an American is, what if we decide we need to use what UTS allows because of, say, spies here? In other words, how do we establish a boundary between how we use this technology abroad, how other countries use it, and what we're willing to permit at home? And so that's one of the reasons why UTS scares me, just as an American trying to go about his daily life without being spied on.
Jim Cardoso 22:07
Yeah, the civil liberties piece is a huge part of that. Like you talk about, I was talking to somebody who, almost in the same sentence, said, my phone takes all this information from me, and it really bothers me; I wish that wasn't the case. And later in the same conversation, they said, well, okay, I'm going to drive home now, and they pulled up their Google Maps GPS to drive home so they could avoid traffic. (That's right.) And it's like, where do you think they get the traffic reports from? It has to come from you, with your phone allowing that to happen. So people have sort of accepted, I won't say surrendered, maybe that's too strong a word, but they've accepted the fact that there is a level of ubiquitous technical surveillance in their lives, in exchange for capabilities, for conveniences of life, that they didn't have before.
Jeff Rogg 22:59
Convenience, that's the word. So it's surveillance by convenience. There are a couple of books out, and people are talking about surveillance capitalism; in other words, the capitalist system is kind of geared toward surveillance. But to me, it's less about capitalism as a system and more about convenience. People buy and access and use technology because it makes life easier, but it also makes you more susceptible to surveillance. (Interesting.) So I want to bring a couple of things together, because it's fascinating how the conversation has gone so far. When you look at public polling, the American people, and this reflects that old tradition I mentioned, we love technology, are still optimistic about technology. Okay, good start. However, when it comes to artificial intelligence, all of a sudden, by public polling, Americans become less optimistic and more concerned, and they're more concerned about how AI will be used. Now let's take this one step further. When it comes to how private companies and the US government collect information, Americans are concerned about both. But here's the catch: they feel powerless to do anything about it. So when you combine those pieces, what you see is that Americans are worried about AI, they're worried about government surveillance, they're worried about corporate or private sector surveillance, and they feel like there's nothing they can do about any of it. That leads to a much scarier picture, because we live in a democracy, where we think that it's a government of the people and we should be able to vote and influence how our country evolves, how surveillance evolves, what our civil liberties are or will be. And looking at the future, my concern is that you don't evolve intelligence capabilities in order to not use them, or at least the intelligence community doesn't. How do we constrain our own intelligence community?
How do we constrain private companies from using surveillance technology in un-American ways?
Jim Cardoso 24:55
And it's the responsibility of policymakers and decision-makers to do that. It's funny, because we're sitting in a law library, actually, and we were talking about just how incredibly complex the laws on the books are, how sometimes it's challenging for a policymaker to make good policy because of that complexity. So that kind of leads into this: okay, we rely on the decision-makers and policymakers we elect in a representative republic to make policies that protect our civil liberties, but in a more and more technological world, it's more and more challenging for them to do that.
Jeff Rogg 25:30
That's right. And a great example of this exact process, going on right now, and it's been going on for several years, is the debate over something that, again, like UTS, people might not know about. It's called Section 702 of FISA, the Foreign Intelligence Surveillance Act. This particular section is a legacy of a program generally called the President's Surveillance Program, the PSP, which was started immediately after 9/11 at the NSA. Why? To keep us safe. Everyone was scared immediately after 9/11, the next attack right around the corner, and the NSA rapidly started using and implementing intelligence capabilities that it knew it had, that a lot of legislators probably did not know we had, and that our Founding Fathers, when they wrote the Constitution, had no clue would be possible. And so the issue with intelligence and technology is that it often gets ahead of its constitutional skis. That makes sense: on the one hand, when the Constitution was written, who would have contemplated the technology we have? But on the other hand, when you develop intelligence capabilities, you want to keep them secret, because if you tell everyone, hey, American people, we have this capability, but we want to make sure it's okay with you before we use it, well, now it's useless, because the enemy knows about it too. And so one of the trends you often see in American intelligence history is that the laws legislators write and the case law the Supreme Court decides usually don't keep up with intelligence capabilities and technology. So the good news, not necessarily bad news, about Section 702 is that Congress has been debating it, and there are legislators adamantly for it and those adamantly against it. The issue is, when you talk to national security and intelligence officers, people involved in national security, they will absolutely tell you that this provision is essential.
It allows us to identify threats, and that's great. The problem, though, is when you look at how it's been used, there have been a couple of abuses, overreach by intelligence organizations, the NSA and FBI, for instance. And so one of my responses, I suppose, to the intelligence community is: if something is important for national security and I want you to have that capability, then don't abuse it, so we don't have to get to that point. But again, there's a tendency when you're in the intelligence community to collect everything as much as possible, and, oh well, the law is not exactly clear, let's hand it over to the lawyers. And then the lawyers kind of get you to yes, or say, well, it's murky, or, well, we don't really know, let's lean toward yes, because national security is at stake. So again, I'm trying to present the good and bad of Section 702. It's being debated. I don't think many Americans know it's being debated. I don't think most of us know what it is, what it does, or when it's been successfully used. And that's the challenge and the danger. Like you said, we're in a law library, and law is an essential area of government that so few people know about.
Jim Cardoso 28:30
But it is essential, that's right. So staying with UTS, in the decision brief you said the US is particularly vulnerable to UTS. Why do you say that? And is there a way to reduce those vulnerabilities?
Jeff Rogg 28:44
Sure. Well, it's vulnerable in two ways. One, we're an open society, and so we do have laws that restrict surveillance, that restrict government surveillance. I mean, that's also the Fourth Amendment, which is about unreasonable search and seizure. Warrantless wiretapping: I mentioned the President's Surveillance Program, and that was one of its controversial features, warrantless wiretapping, along with the collection of what's called metadata, which is data about data. So not necessarily a person's name on a cell phone, but their cell phone number. And once you put enough information together, you can identify the person and figure out where they are. So, metadata. With UTS, we're already susceptible because we're an open society, but we're also vulnerable because of how other countries are willing to use it. This is the double-edged sword. We won't use it in certain ways that other countries will, and that means how they use it at home on their own citizens, which we already said makes it harder to operate if you're a case officer. But how will they wield it against Americans? In other words, compromising individuals. How do they recruit their own spies? Well, if they have a lot of information about you that you don't want made public, all of a sudden you become more compromised personally. And then in terms of bulk, and this is something that's kind of scary, I mentioned UTS, ubiquitous technical surveillance, but we probably also need to think about not just technical surveillance but information more generally. And one of the areas I find most mysterious, and probably the scariest, is this: why is a country like China collecting so much medical information on American citizens, our medical information, your DNA? Why does another country need that? Why does anyone need your DNA? If you go to your doctor, it might be good; you identify a cancer marker.
If your DNA goes to a foreign country like China, well, all of a sudden they have the most private, the most secret, the most important information about you that you could possibly protect, your DNA, and that, when you pull it apart, shows your vulnerabilities. So UTS is this ability to just vacuum up everything, but I'm sitting here also asking: what else, outside of just technical things like cell phones, is another country collecting en masse, and how are they going to use it? And here's the key word: for targeting. Targeting individual Americans, targeting us collectively, targeting groups of Americans. These are the big intelligence mysteries for the 21st century.
Jim Cardoso 31:16
And, you know, they may not know why they're collecting it, but they can, right? So just go ahead and collect it, and potentially we'll figure out later why it makes sense to do that. But it goes back to, all of a sudden, you could get to a point where you have the collection capability, but the action that goes with it has not been figured out yet. But that'll come. We'll get there at some point: oh, great, we've got all this information, now what can we do with it?
Jeff Rogg 31:42
Well, and that's the complementary thing about intelligence and the military that we've kind of been talking around and talking about, which is that it's not long after human beings get new technology and figure out how to spy on each other that they figure out how to kill each other with it too, or target each other with it. So when you put it together, it's a fairly scary view of the world that we're looking at. It's just collecting information on everyone. And like I said, even if you wanted to, you can't hide from it. Even if you wanted to be the most private individual in the world, you'd basically need to be a hermit with no technology, and you couldn't go to a doctor, because they're going to take your DNA, and then it's going to go to a lab, and it doesn't just sit there in a vial, it enters a database, and then that data is only as safe as the database is secure. And so who's vacuuming this information up? This is where, again, as a historian, you like to try to tell everyone, well, things have been worse at other points in human history, and they have, and look at how technology is improving our lives, and it has. But then you look at human nature, and look at how we use things to hurt each other, and that's where, all of a sudden, I think we're looking at a very different world in the future than human beings have really known in the past.
Jim Cardoso 32:55
Yeah, and it's unclear. I mean, there's a lot of, you know, lack of clarity out there about what that could be, as we talked about with the collection capabilities and then the activities that could follow. It's still catching up. Starting to wrap things up here: all the GNSI decision briefs end with four to five decision points. These are questions that a policymaker or decision maker should ask themselves when they're forming policy, or when they're working with their advisors, their staff, whoever, to help them form a policy. I don't usually do this, but you had some good ones in there, which I assume you put in yourself, so I'll ask you to take a stab at one of the decision points you put out there. How should the US balance government transparency and the realities of an open society with the need to protect vital national secrets and activities?
Jeff Rogg 33:46
Oh, well, you know, our Founding Fathers didn't even figure that one out, because they hid things from the American people. And this is, like, George Washington cannot tell a lie. He told a lot of lies, and he deceived a lot. Or if he did tell lies, then at least he didn't exactly...
Jim Cardoso 34:05
But it was for a good cause. Come on, he's the father of our country.
Jeff Rogg 34:09
That's really the tension. The issue is we entrust our government with things like intelligence capabilities on the promise that it's not misusing them. Now, who's to judge what misusing it is? Well, a lot of times, public scandal and outcry. You know, hey, you misused this. And the government looks at us and says, but you asked me to do it at the time, because you were scared. After 9/11, when you look at public polling, we were asking for the programs that were implemented. Torture a terrorist? Sure, as long as it keeps me safe. I'm actually paraphrasing what one member of Congress essentially said, and that member of Congress later, after hearing about the rendition and enhanced interrogation program, was outraged. So the American public is kind of fickle about the programs we'll support at different points in time, because generally we're only as, oh, I don't know if moral is the right word, we're only as good as our condition. This goes back to Thucydides and the idea of measuring a man by his condition. In other words, you're only as happy as your current position makes you. So the key is, and I sort of alluded to this with FISA Section 702: if the American people are going to entrust you with secrets and with secret capabilities and high technology, then don't abuse it. And again, that's the key: don't abuse it. Well, how do you know what abuse is? In some cases, it's sort of like what they call the New York Times test in intelligence, which is: don't do anything that you wouldn't want seen on the front page of The New York Times, that you couldn't defend on the front page of The New York Times. But as I just said, public temperament isn't the best judge of that, because what people read in The New York Times they'll accept one day and not another day. Ultimately, what it comes down to is that we're never going to get this balance exactly right. That's what the history tells me.
We'll never get it exactly right. You do the best you can. Again, it's a balance; it's not perfect. But my warning shot, my parting shot in this, is that how we've balanced that in the past, our ability to balance it, which hasn't been very good, is about to get a whole lot worse, because the technology and its potential, the worst-case scenario I envision, is just that bad. Ultimately, UTS, surveillance capabilities, and how other countries will use them, even if we don't want to ourselves, are radically going to transform the American way of life, a way of life in which we've known a degree of privacy, or at least we've thought we had a degree of privacy. Our susceptibility to, our need for, technology and private corporations, and giving them information. I mean, we just give up all kinds of information to our phones, to Amazon. We want that convenience. We're looking at a future where all the convenience that we've enjoyed for so many decades now through new technology and information technology is going to be used against us one way or another. Whether it's private companies, our own government, foreign governments, terrorist groups, non-state actors, cartels, you name it, it's going to be used against us. I'm not sure we're going to be able to balance it out the same way we have in the past. We're sitting here, and I went to law school, you know, I studied the Constitution and constitutional law. I love the Constitution and what it's allowed us to do so far. I just don't know if the Constitution is going to be a flexible enough document for the future of intelligence and technology.
Jim Cardoso 37:34
Well, sometimes we don't know that something was, and I hesitate to use the words good or bad, or right or wrong, but sometimes you need the historical lens to view it more dispassionately, to look back at something like what you even talked about yourself. We lived through the 9/11 times; I was in the military when 9/11 happened, and there was a certain mood that gripped the country at that time. Now you look back in 2025 and go, I mean, there was a lot of good that came out of that, but there were some practices that, looking back, really give you pause. You can even look at not that long ago, with COVID. With COVID we instituted some things where, again, there are many different arguments, but there are some things you look back on and go, wow, that may have been a bit extreme. Or, like I said, it depends on your point of view. So you wonder if, with a lot of these things you talk about, these things that are happening now, or even the things you're prognosticating will happen in the future, we won't really know, and we won't know that we're crossing that Rubicon into a bad place, until further down the road, when we look back, and what is our state at that point, if we're even afforded the luxury of looking back? You know, I didn't set out to be dystopian today, but when you talk about intelligence and tech and civil liberties, it's something you've got to think about, right?
Jeff Rogg 38:53
That's right. And, you know, again, just for the purpose of the listeners, I am painting a worst-case scenario, because of what we do here, what you and I have done in the past. We'd really be irresponsible if, as national security-oriented researchers, practitioners, professionals, or officials, we didn't present the worst-case scenario. That should be your starting point, because as long as you start with the worst-case scenario, then technically, if you think about it, things can only get better, or at worst they stay at the worst-case scenario. But that's our responsibility. So I wanted to present the worst-case scenario because, again, of that keyword we already talked about: convenience. People are just rushing headlong into technology because of all the conveniences it offers. And you have the optimists who say, this is good, don't worry, it's going to get better. And then you have the pessimists who say we're heading towards, you choose the sci-fi movie, Terminator. I don't know, but I'm probably somewhere in between. But I had to present the worst case.
Jim Cardoso 39:52
And that's where it will probably be, somewhere in between. Where along that spectrum, it's kind of hard to say, but it'll probably be somewhere in between, I would guess, if someone were to ask me to venture a guess. Any final thoughts before we end the podcast today?
Jeff Rogg 40:07
I think this particular conversation, mentioning things like Section 702 of FISA and what legislators are doing or not doing. Again, the American people, by polling, are saying they're concerned about this. They're worried about AI. They're worried about the government stealing or taking their information. They're worried about private companies stealing or taking their information. And then here's the key piece: they feel powerless to stop it. That's the biggest problem I have. We live in a democracy in which our fellow citizens, and we ourselves, feel powerless to actually influence what our government does to protect our privacy and our information. That's probably Constitution 101-type stuff, elemental to the democracy and the Republic that our founders created. So, more than anything else, that public awareness, and figuring out how to get the American people more involved in their intelligence system, is probably my personal mission as an intelligence academic, and, I would argue based on our conversation today, one of the more important things that we can do as a republic in the 21st century.
Jim Cardoso 41:14
A good way to end it. Jeff Rogg, thanks for your time today. Really appreciate your insights. Thanks, Jim.
Glenn Beckmann 41:22
Special thanks to our guest today, Dr. Jeff Rogg, and to Jim Cardoso, Senior Director here at GNSI, for conducting that interview with him. Rogg is an accomplished author and scholar on the US intelligence industry, and he also published our latest GNSI Decision Brief, "Intelligence, Technology and the Future of the American Republic," which you can find on our website. Jeff's a great storyteller, and his books and this brief are well worth your time. We highly recommend you check them out. Next week on At the Boundary, we're going to kick off our latest research initiative, the path to durable peace in Ukraine. Our own strategy and research manager, Dr. Tad Schnaufer, will collaborate with Dr. Golfo Alexopoulos on that project. Golfo is a Senior Faculty Fellow for GNSI and also the director of the USF Institute for Russian, European and Eurasian Studies. They'll both be in the studio next week, and we're looking forward to hearing them kick off this new research initiative. If you don't want to miss that episode, or any other episode, be sure to subscribe to the podcast on your favorite podcast platform. We know you have virtually unlimited choices when it comes to choosing which podcast you're going to spend some time with, and we're grateful to you for setting aside some time for us at At the Boundary. Thanks for listening today. If you like the podcast, please subscribe and let your friends and colleagues know, and tell them it's well worth it for them to subscribe as well. You can follow GNSI on our LinkedIn and X accounts at USF_GNSI. We're also on YouTube, and you should check out our website as well at usf.edu/gnsi.
Glenn Beckmann 43:06
That's going to wrap up this episode of At the Boundary. Each new episode will feature global and national security issues we've found to be worthy of attention and discussion. I'm Glenn Beckmann. Thanks for listening today. We'll see you next week at the boundary.
Podcasts we love
Check out these other fine podcasts recommended by us, not an algorithm.
Fault Lines
National Security Institute
Horns of a Dilemma
Texas National Security Review
War on the Rocks
War on the Rocks
Why Should We Care About the Indo-Pacific?
Ray Powell & Jim Carouso