Incongruent
Podcast edited by Stephen King, award-winning academic, researcher and communications professional.
Chester | Dubai | The World.
Correspondence email: steve@kk-stud.io.
Securing AI, Securing Us: Cordell Robinson
A former Navy intelligence professional turned engineer and lawyer, Cordell Robinson joins us to map a sane path through AI-era cybersecurity. His core message is blunt and hopeful at once: people and process still win. He explains how Brownstone Consulting trains a “cybersecurity army,” turning policies into habits and inventories into living maps of what actually runs inside a business. From there, we tackle the messy reality of AI agents, meeting note takers, and shadow integrations that bypass controls because they’re convenient. The fix is practical: isolate, scan, verify vendors, and only then connect.
We go deeper on executive accountability. Many leaders stopped learning tech years ago, yet they are still the ones called to the carpet when things break. Cordell lays out a workable literacy plan: sit with your technical teams and learn the flow of data, the purpose of each tool, and the governance that limits risk. He doesn’t demand a CS degree. He asks for just enough fluency to ask sharp questions and to stop approving integrations on trust alone. That shift, he argues, is the difference between fragile productivity hacks and secure innovation.
The stakes are rising. Cordell warns that human haste, not AI itself, drives the biggest threats, forecasting potential hits to critical infrastructure by 2030 if we fail to educate widely. We explore ethics and bias in code, sector-specific governance, and why constitutional-level updates should enshrine privacy, anti-bias requirements, and transparent AI controls. On a brighter note, Cordell shares his Shaping Futures Foundation, where children in Tanzania learn STEM and life skills to compete globally. It’s the long game: build habits, build literacy, and build leaders who can handle the tools they deploy.
If you enjoy thoughtful, practical takes on AI security, subscribe, share with a friend, and leave a review. Tell us: what AI tool are you still unsure about adopting?
And welcome to another episode of The Incongruent.
Stephen King:My name is Stephen King, and I am joined by Lovell, a former student of Professor Steve.
Stephen King:Right. So, Lovell, who did we speak to today?
Lovell Menezes:So today we spoke to Cordell Robinson. Cordell is in the cybersecurity space; he's a veteran who served in the Navy, and he's doing a lot of interesting things in cybersecurity. We had an incredible conversation with him, and he goes in depth on all these topics.
Stephen King:Yeah, we even challenged him to be the new president of the United States, or at least part of the governing body, and to write new constitutional amendments. We talked about how governments need to change the way they run themselves very rapidly because of AI and its security implications. We're really grateful to everyone: if you like our content, please do like the podcast, comment, share it, tell your friends, and leave us a nice review. But in the meantime, hold on to your hats, because this is the scariest podcast we're having this season: it's on security issues related to AI. Are you ready, Lovell? Okay, yes, you are. He's nodding his head on mute. So here we go.
Lovell Menezes:Good morning, good afternoon, good evening. Today on Incongruent, we're joined by an extraordinary figure in the cybersecurity space, Cordell Robinson, CEO of Brownstone Consulting, with a rare blend of military discipline, legal training, and deep technical expertise. He's built one of the most innovative veteran- and minority-owned cybersecurity firms in the United States. His career began in the Department of Defense and later in federal civilian agencies like the National Weather Service, under the Department of Commerce, where he helped redefine cybersecurity compliance and governance. What truly sets him apart is his ability to translate complex regulations into clear, repeatable systems that executives and technical teams alike can understand. Outside his professional work, he leads the Shaping Futures Foundation and is now driving automation and AI adoption to make cybersecurity accessible, not just to experts, but to decision makers shaping innovation. So, Cordell, you have one of the most multidisciplinary profiles we've seen: military, technical, legal. Could you walk us through your life journey and how you got to where you are today?
Cordell Robinson:Sure, definitely. It's been very interesting and a lot of fun. When I graduated high school, I went to undergrad and got my bachelor's in computer science and electrical engineering. I wasn't ready for the world yet, so I joined the United States Navy and went into naval intelligence. I spent some time in Diego Garcia, which is in the middle of the Indian Ocean, then went to flight school and ended up in Rota, Spain, where I flew ops over the Adriatic Sea during the Bosnia-Herzegovina conflict. After that I got out and moved to Washington, D.C., the nation's capital here in the United States, and went to Georgetown Law. While I was in law school, I was working as a software engineer for the Department of the Army, so I learned a lot there. When I graduated, I decided I didn't want to be a software engineer anymore; it was cool, but I wanted to do something that was going to be needed for many, many years and pay very well. So I got assigned some cybersecurity duties there. Once I moved over to the Weather Service, and then to Commerce proper, I was able to use my legal training on top of my technical training. So it's been a great journey.
Lovell Menezes:Wow. It's very interesting how you made so many transitions, starting off in the Navy. And today, to my knowledge, Brownstone Consulting is described as a veteran- and minority-owned cybersecurity services firm. What does that identity mean in practice, and what problems do you solve with that identity?
Cordell Robinson:So, what does that identity mean? It shows that veterans, former military, are multifaceted, because we've traveled the world and experienced so many different people and cultures; we come together from all over the country to serve our country together, and you get to know so many different people. There's great training in the military, and great exposure and experience. So being a veteran business owner is extremely important. A lot of organizations really like to see veterans running businesses after we've served our country, because we have such broad knowledge, not just academic knowledge, but knowledge of the world and how to communicate with all different types of people on so many different levels. And to me, being African American, Jamaican African American, I'm considered a minority here in the United States. Sometimes there can be challenges, but I think a lot of it comes down to the grit you have and what you put into it. So I think it's great to have that designation, especially the veteran designation, and to have served my country, so that I can bring the values of my culture and my military experience to the corporate world.
Lovell Menezes:Yeah, I think it's absolutely great to share your identity with all of us. It's wonderful, the story you've shared with us so far. But could you go further into Brownstone Consulting, and how you tie all of this together as a firm in terms of the services you provide?
Cordell Robinson:Sure. So when I was working at the Department of Commerce, I helped a company win a substantial amount of money. Another colleague of mine and I were working on some projects together, helping win business together, and we thought, once we're done, we should start our own firm and see what happens. So we did our homework and started our own firm back in 2010. It was a rough start, because we had to learn a lot and navigate through many things, but we did know how to build business. That was the thing: we knew how to bring in the revenue, and that really helped us move along. What I wanted the core value of Brownstone to be is your go-to cybersecurity services firm for compliance on every level. So we always stay on top of the latest and greatest technologies, and we're always taking training. Even as an executive, I take not just executive leadership training but also different types of technology training; I'm taking AI classes every single month to understand AI in and out, especially AI security training and AI governance. It's really important for Brownstone that all of my employees are properly trained and understand the latest and greatest, so that we can give our clients the highest quality and best services possible. I want Brownstone's global reputation to be one of those companies you can go to that will really help educate your organization, whether it's a company, academia, or government, and bring some insights to you. So I built an ecosystem. I've trained, basically, a cybersecurity army.

I hosted classes at my home every single weekend, for hours and hours, for around four or five years, right before COVID and then during COVID, when we took advantage of that time period and really ramped up the training. I really drilled into the trainings, then brought those people onto my company and gave them positions. My company has a reputation that when I provide people to different clients and contracts, whether government or commercial, we're considered best of breed. We always get the highest praises and accolades, because I make sure everyone takes pride in their work, everyone is very knowledgeable, and we do our due diligence. We make sure that when we walk away from a client, they not only have a more secure environment, they're also well educated. That's very important for Brownstone.
Stephen King:Now, as a small business owner myself, I'm having to wear the hat of IT security person on top of everything else. It's not something I ever wanted to do, but I'm forced to because of compliance with certain contracts with suppliers and clients. And the sheer simplicity of some of these security steps: for example, I have to keep an asset list now, which I've never done before. I have to have a policy in case there's a risk or a breakdown, where before I'd say, well, I'll deal with it when I deal with it. Those are the simple things I deal with as a small business owner, but you deal with much bigger problems. You mentioned AI; as far as I can see on my feed, these AI agents are a massive security risk. Is that something you're discovering? What are you tackling at the larger end? What's the equivalent of a simple asset list for someone who's got all these agents and AI tools, and people bringing their own tools into the office? How are you handling that? What are you talking to people about?
Cordell Robinson:So the first thing we do is really twofold. The priority is making sure people are properly trained, because the biggest threat to AI is humans. But then asset management is extremely important. I've found that in most organizations, whether small or large, the asset management is not there: either they don't have it, or it's very, very inaccurate. So I say, if you're going to bring a new technology in, have your asset management list, and have a process so that as soon as you bring something in, you add it. Don't wait, because then you'll forget. Then you scan your environment, and you find all these things. I've been to huge organizations, scanned their environment, and they say, I didn't know I had this on my network. And I say, well, then you wouldn't know that somebody is sitting in your environment collecting your data. Inventory is one of the biggest issues I've seen at some very important places. Do a touch inventory maybe once a year or every two years, where you literally touch every single piece of software and hardware you have and make sure you have your line items. And if you haven't implemented that process yet, then from now on, every single time someone brings something in, add it to the list. Don't allow people to just bring things into your environment: put measures in place so they can't install software. Block installs, only let certain people install software, and even they have to fill out a form that goes through an approval process. Even if you're the only approver, at least you know. Then you can check it against your asset list and say, okay, they want to bring this in; sure, they can.

I'm going to add it, or whoever you assign will add it, right now. Or no. And then you check to make sure: you put measures in so they can't do it, but if they get around it, and you scan and see it on there, then you make sure you handle things accordingly.
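Cordell's inventory advice boils down to a simple reconciliation: whatever a scan actually finds in the environment must match what the asset list declares, in both directions. A minimal sketch in Python, with tool names and data shapes invented purely for illustration (he does not prescribe any specific format):

```python
# Hypothetical reconciliation of a declared asset inventory against scan results.
# The entries here are illustrative placeholders, not real tooling output.

declared_inventory = {"postgres", "nginx", "slack"}           # what the asset list says is running
scan_results = {"postgres", "nginx", "slack", "unknownd"}     # what a host/network scan actually found

# Anything running that was never inventoried is the "I didn't know I had this
# on my network" case; anything inventoried but absent may be stale records.
unapproved = scan_results - declared_inventory
missing = declared_inventory - scan_results

for name in sorted(unapproved):
    print(f"ALERT: '{name}' is running but was never added to the asset list")
for name in sorted(missing):
    print(f"NOTE: '{name}' is inventoried but was not found in the scan")
```

The point is not the tooling but the habit: the reconciliation only works if additions are recorded at install time, through the approval process he describes, rather than backfilled after a scan.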
Stephen King:So we can do that for humans, right? If a human brings in a piece of software and downloads it, there can be accountability. But these new agents, we don't even know what they do. I assume that when I ask ChatGPT or Claude, sorry, I'm turning them into people there, to do something, like the other day I asked for a scatter chart and it started programming Python and so on. How does a company take the benefit of these agents when the agent is going to operate autonomously and might not have these guardrails? Is that a problem, or is that something a good IT manager will solve?
Cordell Robinson:It's a problem. So I tell people, don't rush into putting these things into your production environment, where they're going to touch everything. See if you have a system or a laptop that is not connected to your environment, use the agents outside your environment, make sure it's secure, and do whatever you need to do to make things efficient for the work you're doing. Scan it, make sure it's clean and secure and you don't have any bad agents, and then bring it into your environment. I know it's an extra step, but it's a more secure way, because so many companies just say, oh, I'm going to use chat. No, don't just do that, because you don't know that data, you don't know where that data is going, and you don't know who has put agents on that data to look into your environment. Using something because it's easy, it's simple, and it makes things work faster for you doesn't mean it's going to help you when it comes to the security of your environment. So I say do your due diligence first. Me, I literally have a laptop that I use AI on that's connected to a separate network. I do everything there, then I do a scan, and then I bring it into my network.
Stephen King:And what are you actually looking for? Sorry, what do you look for, or what do you find? I have heard that these note takers and calendar tools can be tricked into sending messages out. You attach one, I won't name a brand, to your calendar, and then someone can send a message to the note taker telling it to send them the minutes of the meeting, or something like that. That's something I've heard. What do you usually find? What are the common problems when you're digging deep into this?
Cordell Robinson:So that's one of the problems, and it's a big one, because the note takers are so great. They're really good: you don't have to have someone taking minutes or notes, the note taker does it, and at the end of the call it's all organized for you. It's such a wonderful tool, and everyone wants to use it, right? But like you said, people can attach things to it and get into your environment. So I say do an audit of the note taker before you decide to bring it into your environment. Reach out to the company: there's Fathom, there's Gemini, and several others; Zoom has note takers. Reach out to them and ask, what is your security policy? How are you securing this if I'm going to put it in my environment? And they'll tell you. A lot of people don't do that.
Lovell Menezes:I think it's great to ask such questions. It's something that had never come to my mind when it comes to security and privacy in AI. You mentioned wanting to make cybersecurity accessible, not just to experts but to decision makers as well. So, in your opinion, what are the biggest barriers keeping executives from understanding or acting on cyber risk?
Cordell Robinson:I think the biggest issue is that a lot of executives, especially some of the older executives, are not tech savvy right now. The technologies they learned are from maybe the 90s or the early 2000s, and because they've been in leadership for so long, they have not kept up with the latest technologies. So they're apprehensive about even learning the new technology, and they just use the easy things like chat or the note takers, and that's it. I say, sit down with your technical folks, have one-on-one conversations, and learn a few things about how the technology is being used in your environment, so that you as an executive know what's going on. Because if something happens, the leader of the company is called to the carpet, not that IT person. They should understand that they're going to be held accountable. If I know I'm going to be held accountable, I want to learn what's going on technologically in my environment. I want to understand the technology, not just know, okay, I have this list of tools in my environment. That's great, but how do these tools work? Are these tools secure? Is there governance wrapped around these tools? Do I understand that governance? Do I understand the technology? As a leader and executive, you don't have to understand it as deeply as your technologists and your AI engineers do, but understand it enough to have those conversations, enough that you can embrace it and not run away from it. For some reason, some executives say, well, I'll just leave it to the younger folks, they understand it, they'll handle it, it's just too complicated for me. It's not. We're going to learn until we die, right? So let's continue on.

It not only makes your professional life easier, it'll make your personal life easier as well, because we live in a very techy world, and it's becoming more and more filled with technology. If you don't keep up, you're going to be left behind. Even when you retire, you still have to understand technology, because everything is on a smartphone, a laptop, or a tablet. When you have to book travel, are you going to spend hours on the phone talking to someone, or are you just going to go on your phone and quickly book? What are you going to do?
Stephen King:And here in the UK, if you want to use the health service, it's all done by automation now; you have to go through the system before you speak to a doctor. So, as you rightly say, the older you get, the more you need that tech savviness. Lovell, what have we got next?
Lovell Menezes:It's a follow-up on what you were talking about. I think it's absolutely necessary for all of us to keep learning about AI and keep adapting to these new changes, because, as you said, all of this is here to stay, and we are always going to be using these technologies. So, in the world of cybersecurity, do you see AI as a big threat, or do you see AI as an ally?
Cordell Robinson:Right now it's a big threat. And it's a big threat not because of AI; it's a big threat because of humans, unfortunately. I think AI is probably one of the greatest inventions of our modern times. But unfortunately, what I'm seeing globally is that everyone is running and racing towards it without doing any due diligence. It hasn't caused huge issues just yet, but I see them coming around the corner. By 2030, or even right before 2030, we're going to have some major problems if we humans don't get educated on it. That's the biggest thing: making sure all of us, no matter how young or old, get educated on AI, and mainly on the security of it.
Stephen King:We've had the millennium bug. I remember the millennium bug; do you remember it? Yes. Lovell wasn't born, so he doesn't know the millennium bug. So what is the equivalent? What are we looking at? What do you predict as being the millennium bug moment with AI security?
Cordell Robinson:The millennium bug moment is going to be a lot of power grid outages due to AI. Yeah, it's coming.
Lovell Menezes:And just to go on from our previous point, what ethical boundaries do you think we need when embedding AI into security operations, especially in government or defense sectors?
Cordell Robinson:I think the boundaries we need are processes and procedures that everybody follows and understands clearly. Write them in simple terms, so that executives and non-executives, technical and non-technical, understand them across the board, instead of writing them only for technical readers, and then enforce those policies and procedures. Make sure they actually work for your environment, whatever environment it is, because the academic environment is very different from the government environment, which is very different from the corporate environment, which is very different from the medical environment, and then there's manufacturing; there are so many different environments. Creating those things is one of the things my company does. Those policies and procedures are so important, and enforcing them is so important, because it educates people, and when you actually have to follow them, you get into a culture where it becomes second nature. Not only do you adopt those policies and procedures professionally, but you begin to adopt them in your personal life, because personally, too, AI can either be great or do lots of financial damage to you.
Stephen King:Well, I've already subscribed to so many different tools, so I don't know whether that's what you mean. What did we see today? Sam Altman has issued something about this: when Sora launched a whole lot of things, they withdrew some of the chatbot-type functionality, and now he's said he's going to treat adults like adults. I don't know whether that's to do with security, but it's definitely an ethical concern. We've had a previous speaker talk about vision computing, and how in vision computing they no longer need biometrics to detect who you are. They can detect it from your gaze, from your physique, and they can measure intent. Now, you're in the US, where you have freedom of thought, First Amendment rights, but with the use of AI and security devices we can actually measure your intent. So philosophically, that fits into this question, and into the next one about the constitution. How does this affect personal freedoms, when I don't need to quiz you on your thoughts or your political beliefs, and I can synthetically determine your intent from the way you look, the way you walk, the place you are, your history? How should we be rewriting the constitution of a government to protect citizens? And what might be your first clauses if this were to come into effect? I've made you president.
Cordell Robinson:You've made me the first Jamaican-American president there. So, one, yes, that is very dangerous. I think we need to begin, and we should have begun a while ago, writing amendments into our constitutions across the world. Whatever the constitution, and especially the American Constitution, we need to start writing technological amendments into it, especially now with AI. If we don't write technological amendments into our constitution, then we're going to have major problems, and it can possibly create lots of wars. More wars; we already have enough, but even more wars, and civil wars at that. So, definitely, write amendments into the constitution. In America we have freedom of speech, which is the First Amendment, and the right to bear arms, which I think is the Second Amendment. And in America, they love guns. So now you have AI, and you're carrying a gun, and there's intent: what does that look like, right? You could be the most harmless person, because there are people who carry guns just for their safety, because of where they live in the country, and the AI could be very biased. So write laws into the constitution about bias, and make sure the technology code is written so that it won't be biased, because a lot of the code being written is biased either way, right? Make it so that by law they cannot write biases into the code, and if they do, there are violations. Especially within the U.S. Code, they need to write it and then make it law, which our Congress, our legislative branch, basically approves.

And eventually, you know, our Congress is older, and when we get a younger Congress, which is probably going to happen in the next few years, I believe this is going to be one of the big topics. I live in DC, and I know I'm probably going to be going up on Capitol Hill and talking with a lot of Congress members and Senators about technology and making sure we really revamp our constitution. And I think we should start a global effort to do that as AI gets more advanced, but we need to move rather fast, because AI is moving at lightning speed.
Stephen King:I think, Lovell, I'm right in saying the UAE police force has drones already. When you have that kind of environment and you're plugging it into whatever data is available, it's going to make decisions based on whatever data is historic. It's not necessarily deliberate bias, it's just data bias that it has taken aboard, and I think that's really terrifying. Let's go to something more positive, Lovell.
Lovell Menezes:Cordell, firstly, I love the fact that you have a very positive outlook on wanting to implement technology and make these amendments, and that takes us to our next question. We would love to learn more about the Shaping Futures Foundation, and how mentorship or community work connects to your professional mission.
Cordell Robinson:Oh, awesome. Yeah, so I started the Shaping Futures Foundation. I have an orphanage in Arusha, Tanzania, and you can go to www.shapingf.org. I have young kids at the orphanage, and beyond academic education, we teach them life skills, including technology, so that they can be prepared to compete globally. It's so important to educate and train our youth. These kids are between six and eight years old, so they're very young, but they're learning the most advanced technology, which is going to help them. And of course I want to expand and grow over time. Beyond being a technologist and an executive, I want to make sure that I give back and train the future generations. So it was really important to me to start the foundation and put education at the forefront. In the next few years I'm working on actually building schools throughout Africa and the United States that are going to be STEM-based and skill-based, so that these kids learn these skills at a young age. By the time they're young adults, they'll have all of these strong skills and be able to either go out into the workforce or go to university and get more advanced training, and they're going to be some of the top leaders in the world, because that training started at a young age. So I'm really happy with the work and all of the people who work with the foundation. It's been so rewarding, and I'm looking forward to many years of hard work and really changing lives.
Stephen King:That's amazing. I think we're going to finish there on that very positive note. I really thank you so much. I'm really glad we had this opportunity to speak; it's been two months or so in the making, and it has really been worth it. I've really enjoyed speaking with you. I always enjoy talking about security and AI; it's one of the most important topics. Lovell, would you like to close us out?
Lovell Menezes:So, once again, thank you so much, Cordell, for taking time out and sharing your wonderful insights with us. It's great to see that you're so passionate about the cybersecurity space and about ensuring that privacy and security go hand in hand in this field. We look forward to having more great conversations with you, and I'm sure your insights will be of great value to everyone listening to this podcast.
Cordell Robinson:Great. Thank you so much for having me. It's been such a pleasure, and you two are absolutely amazing. Thank you.
Aayushi Ramakrishnan
Co-host
Arjun Radeesh
Co-host
Imnah Varghese
Co-host
Lovell Menezes
Co-host
Radhika Mathur
Co-host
Stephen King
Editor