Profound

Profound - Dr Deming - S2 E3 - Chris Roberts - The Hacker Mentality

January 29, 2022 John Willis Season 2 Episode 3

Chris Roberts is my guest on this episode. I have had the privilege of getting to know him over the past few years. I find him fascinating to talk with. Chris describes himself as a ... Hacker, InfoSec, Safety, Cyberstuff Researcher, Advisor, @Hacknotcrime henchman, and various other names in the technical world. Chris serves as a vCISO or advisor for several entities worldwide. We discuss his background and how our industry may have lost its soul. This episode is definitely worth listening to. Chris can be found at the following places:

Linkedin: https://www.linkedin.com/in/sidragon1/

Twitter: @Sidragon1

John Willis  0:08  
Hey, it's John Willis again. This is the Profound podcast, where we sort of explore everything; not every subject has to be Dr. Deming. I've got an amazing, fascinating guest today, someone I've gotten to meet a couple of times over the years, and we probably have Mark Miller to thank for the original introduction.

Chris  0:26  
Hopefully whatever Mark said was good.

John Willis  0:29  
So kindly introduce yourself and tell me about your background.

Chris  0:34  
So, Chris Roberts. You can probably find me scattered all over bits of the internet; let's face it, I'm not the hardest one to find these days. Background? Oh gosh, let's see: hacker, and then military, and then hacker, and now, you know, a little bit of everything: virtual CISO stuff, consulting, the hillbilly hit squad. Also whiskies, watches, and work. It's like the wibbly wobbly web, but we've got the three W's doing other things these days.

John Willis  1:04  
That's my surprise question: Islay versus Speyside?

Chris  1:09  
Oh, my gosh, such a hard one. Okay, so the heart, the heart has got to be Islay. Definitely. I mean, it's not just the peatiest in taste; you've just got this raw, back-to-basics, gorgeous, amazing flavor profile coming through. But then you go to Speyside, and you've got some amazing flavor profiles there too. Although, honestly, probably a harder decision would be Campbeltown. So, you know, head south.

John Willis  1:42  
That's where you've reached my capacity for understanding. I can only fit, like, you know, Macallan and a couple of others. It's one of my favorites, but I've got to admit I'm probably on the wimpier side here. The Macallan 18 is my go-to.

Chris  2:05  
That's never a bad one. I mean, I think the fascinating thing about something like a Macallan 18 is the ability to maintain that consistency. You know, it's unlike a Johnnie Walker, where you have variables you can adjust. That's the difference between a single malt and a blend, right? With a blend, you can account for annual variances; if one year's not so good, you can compensate. Whereas with an 18, you're stuck with whatever the hell comes out. Yeah.

John Willis  2:40  
So, the elephant in the room. You know, one of the first times I met you, you were introduced as, "Oh, by the way, have you ever heard this story?" And I'm like, holy crap. So you tell me whatever you want about that story, and we don't have to keep it too long.

Chris  2:58  
The guy that broke the airlines. Oh, my gosh. In short, that was six years' worth of research. We started it, and there were a number of us working on it, like 2009, 2010. We'd done a bunch of work on cars; we'd figured out how to stop cars, slam on the brakes, all sorts of stuff, using Bluetooth and networks and things like that. And then we moved on to bigger things: airplanes, you know, like you do. I was really looking at it because I was flying every single frickin' minute of the day. You're sitting in front of this box of electronics, riding on a box of electronics, being fueled and managed by another box of electronics, and you wonder, how does this work? And that's the hacker mentality: how does it work, and what can I do? So we started looking at it and realized that how it works was kind of cool and ridiculously interesting. And can we make it do things? The short answer was yes. Then we spent a while trying to work with the industry, didn't get too far, and took it in a different direction, because, not on my watch; I will not have an airplane going into a mountain because of something I could have or should have said. That was the logic for it. And, you know, in some areas it helped, and in other areas it absolutely blew up in my face. I'm still suffering some of the consequences, but lessons learned: from a research standpoint, cover your ass a little bit more effectively. Unfortunately.

John Willis  4:29  
That's good, you know. And again, we'll let the listeners find the rest out on the interwebs. Yes.

Chris  4:34  
Yeah, it's tough as well, because there's so much that I'm not allowed to say. You know... everybody listening, that's a train; I'm in the middle of Nebraska. I'm actually driving back from talking with some folks about ways to break into things. I was just going to say, locomotives are... interesting things to break into.

John Willis  5:06  
Um, this actually leads into something I really wanted to hear from you. I saw a discussion you had recently on LinkedIn about hackers. And, you know, I've known Josh Corman for quite a while, and I've always admired some of the stuff he's done. So there's this idea: I think most people in IT, or even DevOps, are sort of aware of the difference between what you see on TV as a hacker, this meme of the evil person, and the people who actually try to figure things out before the evil people do. Can you explain that to the folks?

Chris  5:45  
Yeah, absolutely. And I think Josh is fantastic; he's figured out a way to manage the paperwork and red tape way more effectively. I tend to be a bull in a china shop. Josh is extremely good and extremely eloquent at understanding how to get things done in a very different way. He's got a very political, civilized way; I just drop a grenade and wander out. So it comes down to this: we all see how the world works, everything from the humans in it to the technology and the systems and the infrastructure. And a lot of people are happy with that. They're like, okay, it works, I don't really care how it works, stuff just works. And there are some of us in the world who are like, okay, how can it work differently? Why does it work? What the heck is gravity, and what makes it work? It's no different looking at a computer and going, I want to have those bits and bytes actually do what they should do. So it's the hacker mentality; it's a mentality and a mindset more than anything else: how can I dig into this? What can I do? How can I understand it? And then, if you're working with companies and organizations, how can I help them improve it? So I love the DevSecOps movement so freaking much, because that's collaborative; that's us working with development. When they're coding the stuff, we're like, hey, think about this, because there are people like us, but on the criminal side, who are going to take advantage of things. The hacker mentality is: can I figure it out? Can I help explain it? Can I make it better, more effective, more efficient? Or can I help people understand it?

John Willis  7:35  
It's, you know...

Chris  7:37  
At the end of the day, for good or for bad, the last words that anybody on this planet will ever hear, if we ever find that big red button that shuts off the universe, will be, "Hmm, I wonder..." And there'll be a hacker sitting there going, hey, what about this thing?

John Willis  7:53  
That's right. That's right. If that button exists, somebody will press it. I mean, the point you make... I go from, these are great people who are saving lives, to the reality, and this is not to discount any of the great things you and others do, that part of it is: at the end of the day, you fly planes, right? And you have the ability to figure it out; that's the mentality. If you're going to fly planes that much, it's, I want to figure this out before somebody else does.

Chris  8:22  
I think that's it. This whole world that we're in is a chess set, right, if you think about it. And unfortunately, in security, InfoSec, cybersecurity, whatever you want to call it, we're not winning. Against the adversary, against the bad person, whoever, we're just not winning. And the more that we can question everything that we see and everything that's put in front of us, the better chance we have of actually catching up, and maybe, maybe, actually getting ahead of things for a change. And the only way to do that is that kind of hacker mentality,

John Willis  8:56  
to stay on par. Right. Like, I think about the DevSecOps thing. Whether you hate the word or not, and we can talk about something else I saw you discuss, that we need better names for this stuff, but we can table that a little bit. The idea is this: when I'm giving a presentation, sometimes in a big room, I'll ask how many people work for banks, and a large proportion of the room will raise their hands. And I'll say, well, how many of you get together and talk about security together? And all those hands go down. I think that's the beauty of DevSecOps. I've had people at banks say, "John, the biggest thing I want to thank you for is that I didn't see a way for me to sit down at the table with a Goldman Sachs before some of this DevSecOps happened, to have these kinds of discussions." And that gives us strength, right?

Chris  9:49  
It's frustrating that you've had to build up something which human nature should already have, which is collaboration, right? Right, you know, when you think about it, as technologists we're not good at communication, collaboration, cooperation, even though it's human nature, to some degree, to be good at it. But you've had DevSecOps. And again, it kind of frustrates me that there are so many people arguing over the name. I'm like, look, if you really don't like the name, call it what it is: we're forcing you to actually go talk to other people.

John Willis  10:26  
I tell people, if you want to argue about the name, go ahead and group up on the left side of the room, because I'll be on the right-hand side of the room trying to figure out the problem. That brings me to what worries me the most these days, I mean really worries me the most. Josh Corman says, riffing on the famous Marc Andreessen line that software is eating the world, "I think of it more as software infecting the world." And he's not saying he hates software, or open source, or anything. It's just the price of admission for what is happening with software. And there's the data about lines of code and bugs; I'd love your thoughts on all that. But beyond that, I think the dependency map is just beyond human comprehension. And I'll applaud anybody who tries any type of AI on it; I'm sort of neutral on that. But I still think there's an escape velocity on the mess of dependencies. That probably worries me the most, and we see it all the time, with how hammered we got by the latest ones. Anyway, I'll shut up.
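John's point about the dependency map outgrowing human comprehension is easy to demonstrate with a toy example. The sketch below is illustrative only: the package names and the graph are made up, but the breadth-first walk is the standard way to compute a transitive dependency closure, and even this tiny graph turns two direct dependencies into seven transitive ones.

```python
from collections import deque

# A made-up direct-dependency graph: each package lists what it imports.
deps = {
    "app":     ["web", "logging"],
    "web":     ["http", "json"],
    "logging": ["json", "fmt"],
    "http":    ["tls", "fmt"],
    "json":    [],
    "tls":     ["crypto"],
    "fmt":     [],
    "crypto":  [],
}

def transitive_deps(pkg: str) -> set:
    """Breadth-first walk of everything pkg pulls in, directly or not."""
    seen = set()
    queue = deque(deps.get(pkg, []))
    while queue:
        d = queue.popleft()
        if d not in seen:
            seen.add(d)
            queue.extend(deps.get(d, []))
    return seen

print(sorted(transitive_deps("app")))  # 2 direct deps fan out to 7 packages
```

Real ecosystems repeat this fan-out at every level, which is why the full map of what any one application actually depends on is so hard to hold in your head.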

Chris  11:44  
No, I'm totally in lockstep with you on that. I think we forget, to some degree, how much we depend on it. I mean, I'm sitting literally off to the side of the road; the train just went by. Well, that's all run by software now. Yes, okay, it's valves and electricity and stuff, but I issue one command and the entire thing comes to a grinding halt, and there's nothing you can do about it. I look at the lights here; they're all managed and controlled by systems. The pile of grain over here is going to be managed by, you know, an industrial control system. The buildings here. I mean, we as technologists literally now hold the keys to the kingdom, for good or for bad. We've had that for a long time, since the Industrial Revolution. But you go forward now, and at this point in time, what one individual can do with a keyboard and a determined attitude is literally, potentially, wreak havoc on a lot of things. And I think we forget that. We do forget it, because, to your point, I put out the research years ago: globally it was, what, 93 or 97 billion lines of code? You can define what a line of code is in different ways, but no matter which way you look at it, if you take even a low average, you're looking at, say, 10 or 15 errors or bugs per 1,000 lines of code. Go with a really, really good error rate and say only two or three, and you are still looking at millions and millions and millions of errors which we haven't found or haven't identified. Then you couple that with the 30,000, you know, different variants of malware and bugs that we see on a daily basis, and it's not pretty.
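The arithmetic Chris is doing can be sketched in a few lines. This is a back-of-the-envelope estimate only: the 93 billion lines and the two-to-fifteen bugs-per-thousand-lines densities are the rough figures from the conversation, not measured values.

```python
def estimated_defects(total_lines: int, bugs_per_kloc: float) -> int:
    """Estimate latent defects given a defect density per 1,000 lines of code."""
    return int(total_lines / 1_000 * bugs_per_kloc)

GLOBAL_LINES = 93_000_000_000  # low end of the cited 93-97 billion lines

# "Really good" density (2/KLOC) versus more typical densities (10-15/KLOC).
for density in (2, 10, 15):
    count = estimated_defects(GLOBAL_LINES, density)
    print(f"{density:>2} bugs/KLOC -> {count:,} latent defects")
```

Even at the optimistic end, two bugs per thousand lines across 93 billion lines works out to 186 million latent defects, which is the "millions and millions" Chris is gesturing at.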

John Willis  13:32  
And then you throw in... I mean, that's just a linear look at it, right? Then throw in the complexity of all the interdependencies on top of that. Yeah, yeah, it is.

Chris  13:44  
I think the other challenge that I have, which is why, again, I've got a soft spot for anybody that says, "hey, let's work together," is that we as an industry are still arguably in our infancy. I mean, I've been kicking around this industry, messing with computers, for nigh on 40 years and a bit. Even that pales compared to what we've got coming down the line. We've taken it from an industry where you and I knew everything we were working on, where we had our hands on everything and a really good idea of everything that was going on. We've expanded it, and we've segmented it, and we've sliced and diced it to such a point where no one person has the answers. No one group has the answers. And yet we still act like a bunch of damn school kids at times. And yet we have the keys to the kingdom in our hands. We now have software and systems that act upon a human, that make life decisions, that make life choices, that interact with humans. So this isn't a case of "oh, it screwed up, let's just reboot the computer." This is a case of: we screwed up, we have life in our hands, and I don't think we are at a maturity level as an industry to actually accept that responsibility.

John Willis  15:05  
Yeah, no, it's scary. I mean, every once in a while I'll sit in on some of these financial think tanks that I get invited to, and there'll be a couple of autonomous-vehicle people, and I'll just sort of listen in. You hear these ridiculously crazy stories about learning algorithms. One story was just fascinating: after this car had been prototyped, they found there was a significant deviance in the car making left turns versus right turns. Yeah. Right. And so they literally brought in, you know, the think tanks, the MITs; nobody could figure it out. Those algorithms aren't like code you can look at and reason about.

Chris  15:47  
Well, yeah, this is a learning architecture; it's learning on the fly. Yeah. And what they found

John Willis  15:50  
in the prototype testing was that it would always come back making a lot of sharp lefts at a certain time of day. There were these yellow signs and the light, and it had literally learned to behave differently on left turns; a deviance significant enough to probably not get approved, you know? Yeah.

Chris  16:13  
There was another instance, another instance where the learning algorithm, again an autonomous vehicle, and again we're putting our trust in a vehicle traveling at 60, 70, 80, 90, 100 miles an hour on 100 million lines of code. And yet, in the original algorithms, they used literally white Anglo-Saxon males for most of the testing. So this damn vehicle, just traveling down the road, effectively couldn't recognize somebody who wasn't white. And I'm like, crazy. No good. No good, people. We are missing so much. And I think that's... we've done so much. I mean, this industry is amazing, and I love it, but it's got a long way to go. Yeah, you know.

John Willis  17:02  
And then the thing is, how do we balance what we do? You know, it's like, I love this; I think this is why we get along. I see your stuff and I'm like, oh my god, he's saying it better than I would say it. But you made a comment about something that's driving me crazy. I keep telling myself to write something, like a blurb, and then I say, yeah, you know what, it'll sound like I think I'm ridiculously smart if I say it. But everybody now claims that if you'd used them, they would have solved the Log4j mess. And it's like, no, sorry, not done. I guess it goes back to this idea that this is a mess. And, you know, I give everybody their pat on the back for trying hard. But I wonder: are we losing... are we okay?

Chris  17:59  
We've lost our soul. Here's what we've done. And I think this is where, you know, I look at other people in the industry I've got some love for. I mean, I look at Evan, I look at you, I look at Mark, and there's this concept, whether you call it mission before money or people before profit. Our job, if you really think about security and safety, our job is to do our best to look after our fellow humans. But we haven't done that. Our job, unfortunately, has become: how many billionaires can we mint? How quickly can we get the company to IPO and go public? How quickly can we exit and do the next thing? We've lost the plot. We no longer do security for the betterment of those around us; we do it to profit. Now, I'm not saying profiting is evil. I'm not saying it's bad; we all get to put food on the table. You know, I'm sitting in a very nice car, blah blah blah, yes. But it gets to a point where it's too much, and security's gone there. And I think the other thing is, here's a perfect example: you come up with a new idea for a widget that will actually fix something. Rather than me going, "well, that's cool, I'm going to start another company," why don't I just knock on the door: "Hey, John, I love what you've done. I've got some ideas. Can we collaborate? Could we work together? Can I help you?" We don't do that. I go form another company, and now the poor consumer has got two different people going, "I can solve all your problems."

John Willis  19:37  
No, no, I mean... I love, say, JFrog. They've been incredibly good to me, and I think Shlomi, the CEO, is probably one of the better startup CEOs. But, you know, the reality is, they started out... and I think they're still really good at all this stuff, so this is not a negative. But to me, like you mentioned, we lost the plot. I look at some people who were creating incredible stuff early on that was open and collaborative. And the VCs... I saw this at Docker. I think Solomon, in my opinion, got tainted by it. I think Solomon, left to his own devices, would have been... and again, not a negative against him, but his worldview was: save the world, make everything better, containers will make things better. But they get a hold of you, and they get their hooks in you, and you see these paths to billions, certainly hundreds of millions, and it just changes the plot, as you said. When you IPO as a security company, are you really trying to get a bunch of people in a room to help figure out a hard problem?

Chris  20:49  
And I think that's it. No longer are we going out there... I mean, again, I'm sitting in the middle of Nebraska. I'm doing a bunch with a company called Flyover Futures, because very few people understand the Midwest, understand exactly what it takes to look after and secure and safeguard the Midwest. Very few people have figured out, or tend to care about, you know, the 10- or 20-person mom-and-pop shops, companies that have been going 20 years with ten people, getting their asses handed to them on a daily basis by attackers. I just came out of doing an assessment; amazing company, loved helping them. But they're nice, too nice. They don't have the cynical streak that we have. So I walk in, I'm like, "Hey, I'm Chris," and five minutes later I'm in the damn data center. Half an hour later I've owned absolutely everything, and there are passwords all over the place. I'm like, okay, if you were cynical, you'd have tested me before I walked in. But you're not, because you're nice, in the Midwest. Yeah. And so it's like, how do I help them? How do I help them ask more questions, like "how could he take advantage of us?" How do I help them get better and be more effective? So yeah, we've lost that. We don't protect anymore; we try to profit. And I find that frustrating.

John Willis  22:13  
Yeah. And I think that's... I mean, again, I've done startups, and you watch even yourself: you start with a good idea, and the gravity just pulls you there. Yeah. Well, do I sell it? Or even, how can I be successful? I'm going to need money. Right?

Chris  22:32  
Yeah, that's... I mean, we go through that with the Dave product at the moment as well. Like, how do we get it out there to be more effective? And it's a terrible dichotomy, because it's like, look, I need to work to be able to pay to develop Dave, and there are only so many cycles you can do of that. Do I take money? If I do, we're looking at potentially crowdfunding or crowdsourcing, because at least there are fewer ties, less pressure; I don't have a bunch of VCs running it, dammit. At the same time, I realize there's also hope, because I look at Evan Francen and Ryan Cloutier, you know, at FRSecure and SecurityStudio. I look at them and I'm like, they figured it out. They're running a multi-million-dollar company, but they're doing it right. So I know it can be done. It just takes more effort. I mean, it does take more effort, but it can be done.

John Willis  23:21  
Yeah. Could you just elaborate on FRSecure, like, what they're doing?

Chris  23:26  
Yeah. So FRSecure does services: security assessments, systems work, a lot of stuff with local and state government, again one of those groups a lot of companies tend to shy away from because it takes more effort. And then SecurityStudio is doing a bunch of really cool stuff on how to actually talk about security in a way that everybody understands: metrics and all the good stuff.

John Willis  23:53  
Part of the problem is that people just don't want to know, right? Like, I was on a call with some company that does software for the oil industry, and I was trying to have a conversation about dependencies, about understanding your software supply chain a little better. And he's like, "Why do I need to know that? I just sell software." And I'm sitting there going, okay, I should probably stop wasting my oxygen. But I'm thinking, dude: my first question is, is there any open source in your technology? No BS, right? And then, do you understand the liabilities if somebody dies? But they were so adamant: "I don't know why I would need this." I think too many people just don't want to know. And so it's hard for people like you to say to these mom-and-pops, "let me come in." They're like, "we don't need a high-level security guy like Chris; we're just a normal shop."

Chris  25:06  
I think everybody needs it, and that's where I've got a little bit of love for some of the fractional stuff, like the virtual CISO work you do. It's like, hey, you don't need a full-time one; number one, that would be expensive. It's really just the guiding. It's, how do I, how do we synthesize the knowledge a lot of us have in such a way that we can effectively help more people? Because there's only one of you, there's only one of me, and there are only so many people we can talk to. How do we do that more effectively? So it goes back to the mom-and-pop shop: how do I do it so it's cost-effective, so they'll actually learn something from it? Or, I mean, there's the other alternative, which is you throw a crystal ball 5, 10, 15 years into the future. There is a part of me, I hate to say it... I don't like humans that much. I just don't. Coming out of the military, some of the craziest shit I had to deal with did not give me a taste for liking many humans. And we're terrible to the planet, we're terrible to each other, we're terrible to ourselves. At some point there's a part of me that wants to make the argument: just let the machines take over. How bad could it be? No...

John Willis  26:24  
I think about that myself too, as I get older and more cynical. And then I just go back to: we love the people we love, and we deal with the people we don't like. And you're right, I often tell my wife, I think I like humans, but then, excluding the humans that I... yeah, that's important. But you told me a story one time, I was just thinking about this, that I loved: you'd go into a company, you sort of alluded to this earlier, and almost peer over a security team's shoulder. You'd have these people doing work; meanwhile, you'd be three steps ahead of them, creating disruptions and just watching the flow. I thought that was an incredible learning opportunity, something like that, you know?

Chris  27:10  
Yeah, I have a ton of fun doing that, because for whatever reason my brain is wired in such a way that I enjoy it. We just did it here, doing some assessment stuff. But then, I'm fortunate, I get to wander in and out of some interesting think tanks. I get to do a bunch of work every now and again with some folks on Special Forces teams, and I hate the fact that we keep putting humans in harm's way. It's like, how do we become more effective at it? How do we make it easier, simpler, faster? How do we avoid, "oops, you didn't get everything, we have to send you back in again"? Because that's just never fun for anybody. So we started coming up with a bunch of tech solutions, again using the human body. We started looking at: how do we store data inside the human? How do we move data? How do we use the human body almost as an antenna, shall we say, for pulling data and moving data? So, doing a bunch of fun stuff in that realm.

John Willis  28:10  
That's what I was going to ask. So we've had a couple of conversations over the years; we need to make sure we have more. But I like ending with "what's interesting now," and yours is usually pretty out there. The last time I talked to you, you told me it was basically trying to use mental brainwaves to implement passwords. And it sounds like you've taken that a little further now.

Chris  28:36  
Oh, yeah, yeah, that's gone on a few leaps and bounds. I'm not quite at the stage where I've got Hollywood beat, with Iron Man and his gadgets, but I'm working on it. The computer now knows me way more effectively than it used to. I've actually got it so it knows what I'm up to, what I'm doing, ahead of when I'm doing it. Yeah, I've got it in a more active mode, so it's actually really, really cool. But then I'm also messing around in a couple of other areas: one of them is on the molecular side with the human, and the other one is communication. I'm really happy about where we are, but it frustrates the hell out of me, because I want to hang around for another 50, 100, 200 years, just for shits and giggles. So, one of the fun things in the quantum world that's obviously getting more and more explored is the communication capabilities. You know, we've reached almost the maximum of where we can get to using light, frequencies, colors, et cetera. Now, if you break light down, it's either a particle or a wave. So, the way I think of it: I split a pair, I move one half to one place and one half to another place, and I can literally, instantaneously, effect change on one end or the other. It's entanglement principles: if I poke one half of it, the other half goes, ouch. And it doesn't matter where it is in the universe. That's where it gets really crazy, because all of a sudden this whole concept of light speed and everything else gets kicked into touch; it's like a different branch of mathematics. This is fun. Now, if you start thinking about it: if I can do that, I can scale the technology down to a point where I can literally communicate using molecular-based technologies. I don't have to have racks and racks of crap.
Secondly, I've got much more effective, much more efficient, much more secure, much easier, and much more robust methods of communication. So, doing some fun stuff on that, and it's just fascinating. And it's one of those things where I'm like, how the hell do I stay, how the hell do I keep doing this for as long as I humanly, possibly can? I don't know how long this gray matter lasts.

John Willis  30:51  
Maybe you can fix that too, right? But, uh, so let me just really dumb it down in a sort of IT-smart way. Is it sort of like, you know, Shannon information theory at the molecular level? Yeah, pretty much.
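For reference, the Shannon measure John is alluding to can be sketched in a few lines of Python. This is a generic textbook illustration, not anything from the episode:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy of a symbol sequence, in bits per symbol."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fair coin carries 1 bit per flip; a constant source carries 0 bits.
assert shannon_entropy("HTHT") == 1.0
assert shannon_entropy("HHHH") == 0.0
```

The same formula applies whether the "symbols" are coin flips, photons, or molecular states, which is roughly why the analogy works.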

Chris  31:05  
Yeah. It's frickin' awesome. That's, yeah, it's totally awesome. I mean, it's not quite a jam jar of molecules, but it feels like it's almost that. You're literally yelling into one jam jar, and at the other end of the damn thing, the damn thing's yelling back at you. That's crazy, man. Yeah, it's just fantastic stuff. And there's a couple of amazing books out there on it. One of them really explains how much we don't know. I mean, you look at molecules: we've identified, you know, 12 different molecular structures, and we use three of them. We have no frickin' clue what the other nine are meant for. Like, well, we know whether they're light or dark matter, it's kind of there, and there's more shit out there that's dark than there is light. We just don't know what the hell to do with it. Yeah. And I love that,

John Willis  31:51  
like proteins, right? Like the difference between sort of the DNA. Yeah. But, so, I got a question that may annoy you, and I'm sure everybody asks you this question, so forgive me up front: I want to know what you thought about quantum.

Chris  32:05  
Oh, I knew that was coming. So much cool stuff, so much cool stuff. You know, there's been a breakthrough recently, in the space of the last month, that's actually maybe more relevant. So one of the biggest problems with quantum computing is basically error correction. For a better way of looking at it: at one, two, four qubits, it's not too bad. You start scaling the damn thing, and your error correction just explodes.
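To make the error-correction overhead concrete, here's the simplest classical analogue, the 3-bit repetition code: one logical bit costs three physical bits, and a majority vote fixes any single flip. This toy sketch is my illustration of the general idea, not the actual quantum codes being discussed (real quantum error correction, e.g. surface codes, needs far more qubits per logical qubit).

```python
def encode(bit):
    """Encode one logical bit as three physical bits."""
    return [bit, bit, bit]

def flip(bits, index):
    """Simulate a noisy channel flipping one physical bit."""
    corrupted = bits[:]
    corrupted[index] ^= 1
    return corrupted

def decode(bits):
    """Majority vote recovers the logical bit despite one flip."""
    return 1 if sum(bits) >= 2 else 0

# Any single error is corrected, at 3x overhead -- the scaling problem.
for logical in (0, 1):
    for i in range(3):
        assert decode(flip(encode(logical), i)) == logical
```

The 3x overhead here is exactly the kind of cost that balloons as you scale, which is why better error-correction methodologies matter so much.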

John Willis  32:33  
One of those books that I was interested in, that sort of explained it to me, was like, it's like going back 50 years to where they had the wiring, like, when it was actually done, you'd have to have sort of timed wire. Like, that's where we are today in quantum. Yeah,

Chris  32:46  
I mean, it's about one step away from, like, Bletchley Park and the Enigma machine, in that sort of way. It really isn't too far from that in some ways. But the difference we have now, and this is where it gets really interesting, is the technology loops. Because if you think about it, you go back to Turing, and you go back to that time frame, and the time it took to get from the valve to, like, the diode to everything else was crazy. I mean, a crazy amount of time. But then we came around to the microchips, and we shortened that tech loop down considerably. We're now at the quantum level, and we've shortened it even crazier. Because first it was, okay, we can use these to do this and this, and then, oh shit, we've got too much error correction to do. Then rapidly, like within 12 to 18 months, we figured out a shitload of the error correction methodologies. So that cycle for how quickly we're evolving the tech is shortening down ridiculously. Yeah. Let me rephrase this: I am ridiculously happy with the technology and where I see it going. I'm ridiculously amazed at just some fundamental principles of what we're about to put mathematics through in security and computing. But the first person in marketing that tells me they've got quantum security,

John Willis  33:58  
you're going to punch them in the nose.

Chris  34:00  
Oh my god, I'm going to punch them in the nose. I'm going to take them outside, I'm going to strap them down onto the grass, and I'm going to Taser them until I'm happy, which is going to be several cartridges.

John Willis  34:10  
That's awesome. Yeah, I mean, that's the interesting part, right? Which is, you know, that sort of topic again. I just like being this guy that lurks on the edges of all these topics, but, like, quantum supremacy, and, like, what happens when somebody breaks the sort of RSA encryption? Like, that's a race, right? That's this...

Chris  34:29  
But see, this is the problem. This is where it gets really interesting, because each of us gets a bigger gun. So this is where I love it, because it's like, okay, you broke my RSA. Well, guess what, I got my own quantum engine that actually re-entangled that crypto and that cipher into a different one. Good luck, have a nice day. And I think that's what everybody misses. I mean, there was a frickin' startup that was like, well, we can protect you, we can protect you at the quantum level. I'm like, yeah, I don't quite know what that means. I think, again, this goes back to it's a game of chess. This goes back to, you know, when you and I first started. I mean, good grief alive, it was modems and pre-modems that we were breaking. And now we're talking about breaking quantum architectures. And every single step that the adversary makes, guess what, on the positive side, we get to step up as well, or whoever was first. So, you know, we'll break RSA encryption, we'll break it all. But guess what, we'll end up with quantum encryption, and anything that needs to will get run through the algorithm that says, this is where you were, this is where you're going to be, and off we go from there. It's an arms race.
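For context on what "breaking RSA" means here, this is textbook RSA with deliberately tiny numbers (chosen purely for illustration): security rests on the difficulty of factoring the public modulus, and a large enough quantum computer running Shor's algorithm could factor it and recover the private key.

```python
# Toy RSA with tiny primes -- illustration only, never real key sizes.
p, q = 61, 53
n = p * q                # public modulus (3233); secrecy rests on factoring
phi = (p - 1) * (q - 1)  # Euler's totient, kept secret
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent via modular inverse (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)              # encrypt with the public key
assert pow(cipher, d, n) == msg      # decrypt with the private key

# Shor's algorithm would recover p and q from n, and hence d:
assert (p - 1) * (q - 1) == phi      # knowing the factors gives phi, then d
```

The "bigger gun" point stands either way: post-quantum schemes based on different hard problems (lattices and the like) are the planned replacement once factoring falls.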

John Willis  35:41  
Yeah. I mean, this is my problem with doing podcasts with guys like you, like, this could never end. But, you know, one of the things on the quantum thing that sort of popped up recently is this idea that these adversaries are stealing encrypted data now, in that they can sort of...

Chris  35:57  
Oh, absolutely, yeah. I mean, good grief alive, there are caches. I mean, yeah, especially nation states, adversaries along our lines, absolutely. But yeah, I mean, good grief, I would hate to know how much. I mean, we know how much data is out there; you know how much of that data is literally just sitting in stashes, waiting. Although, you know, again, at the end of the day, there's easier ways to get the data. We always talk about encryption: you can't break my encryption. Well, come sit down with me. I got a spanner. I'm gonna break your fingers, or I'm gonna go get your kid and break your kid's fingers, until you give me the key. Huh, encryption broken. Next question.

John Willis  36:40  
Yeah, no, that's, I mean,

Chris  36:42  
There's always a way. And arguably, it always comes back to the human. You know, this is the technology race. I mean, technology's outpaced our ability to understand it, no two ways about it, however much we invest. But at the end of the day, it's the human. We've got to bring it back to the human. Part of the reason for these conversations is it makes you think about collaboration, communication, and cooperation, what it should be. It's all about the human.

John Willis  37:09  
Yeah, it is, you know. I mean, again, a lot of what I think about is a shift away from sort of determinism, from the human being the controller to the human being an actor. But, like, if we go too far, we sort of diminish the human's role. Like in an airplane, right? That's a great example: we clearly have moved away from the pilot being 100% in control to the pilot and an AI pilot being actors in a set of actors. But, you know, we find out over and over, if we lose the human element of those, we find...

Chris  37:49  
There's this trust in... I mean, this is a statistic I remember from several years ago, and somebody would have to double-check it, but there's a statistic somewhere that says, you know, planes would be a lot safer if we didn't have the humans, because of human error. I mean, arguably. But I don't really want the computer running it, especially with the fun and games we had with 5G belly-flopping a plane because it got its altitude messed up. Yeah, probably not something you want to be in the middle of. But yeah. This, again, it's not about getting rid of humans. If you take the human out of the equation, do you end up with a better architecture or not? That's going to be an interesting question that we are going to have to really seriously debate.

John Willis  38:31  
Yeah, no, I just wrote about the Sully thing, right? That's a great reflection, right? Yeah. Like, if that was all computer, it probably would have tried to go to Teterboro, and it would have crashed. Yeah, it would have been a mess. So it was... no, no. Anyway, yeah, we could spend...

Chris  38:48  
No, you're right. I mean, vehicles, the vehicle. You've got the classic dichotomy of the vehicle. If you look at that computer, where's the loyalty? Is the loyalty to the customers inside that passenger cell, or is the loyalty to the person on the street, when that computer's going to make a decision as to who dies? Because, you know, at that point, what is the right decision? I don't know. At what point do we put the computer on trial? Right?

John Willis  39:16  
And that opens up a whole nother thing, for liability for somebody who wrote code. I've asked that question, and people tell me I'm crazy, but, like, I don't know. I wonder about the day when there's a lawsuit to try to backtrack to a Google software developer who wrote code that made a decision of whether a baby or a grandmother should die, right? We haven't...

Chris  39:37  
We've had it now. In the security space, um, it was the NSO Group, the group over in... oh, gosh, it was one of the groups that was attacking all of the mobile and cellular systems; somebody will have to Google this one. They went after the individuals. And I'm like... and then they went after the exploit developers. We see it in the research community. You know, if I go knock on your door and I say I found a problem with your system, half the time I'm going to be met with lawyers. I won't be met with a welcome mat going, hey, thank you.

John Willis  40:12  
You know that firsthand, right?

Chris  40:16  
But I think you're right. I really... I don't see it being very long. I mean, you look at insulin pumps, you look at the voting systems, you look at any railroad crossing, any of those things where human life is at risk. The first time that it screws up and a lawyer gets a sniff of, hey, I can go after the developer... but then the developer used six or seven different open source libraries they never vetted, so who the hell did it? Well, they did it. No, we didn't, they did. Yeah. It's not fun, and it's going to happen more and more often. I want to say, yeah, maybe...

John Willis  40:52  
they go after the Log4j guys. I'm just kidding.

Chris  41:00  
Hang on, I built this for this. And

John Willis  41:04  
Yeah, you know, I was joking. Oh, my friend, this is brilliant. I knew it was gonna be awesome, and, like, you're awesome. And then, you know, just in general, I should make more of an attempt just to catch up with you. And...

Chris  41:18  
We'll catch up when, uh... you know the deal. I mean, we're all running around like...

John Willis  41:23  
crazy, you know, this,

Chris  41:24  
This works. This is awesome. And at some point in time, when we emerge from the craziness, or whatever the new version of us is going to be, then I'll bring the whiskey case.

John Willis  41:34  
You know, I'll bring the algorithm account you bring the Carolina and we'll just send

Chris  41:39  
I love that. I did.

John Willis  41:42  
Take care of my friend.

Chris  41:43  
You too, brother. You take care